WO2018101746A2 - Apparatus and method for restoring a road surface occlusion area - Google Patents

Apparatus and method for restoring a road surface occlusion area

Info

Publication number
WO2018101746A2
Authority
WO
WIPO (PCT)
Prior art keywords
point
logic
color
occlusion area
virtual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2017/013839
Other languages
English (en)
Korean (ko)
Other versions
WO2018101746A3 (fr)
Inventor
전영재
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hyundai AutoEver Corp
Original Assignee
Hyundai Mnsoft Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hyundai Mnsoft Inc filed Critical Hyundai Mnsoft Inc
Publication of WO2018101746A2 publication Critical patent/WO2018101746A2/fr
Publication of WO2018101746A3 publication Critical patent/WO2018101746A3/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/10 - Segmentation; Edge detection
    • G06T 7/11 - Region-based segmentation
    • G06T 7/143 - Segmentation; Edge detection involving probabilistic approaches, e.g. Markov random field [MRF] modelling
    • G06T 7/40 - Analysis of texture
    • G06T 7/90 - Determination of colour characteristics
    • G06T 2210/00 - Indexing scheme for image generation or computer graphics
    • G06T 2210/56 - Particle system, point based geometry or rendering

Definitions

  • The present invention relates to an apparatus and method for restoring a road surface occlusion area and, more particularly, to an apparatus and method for restoring an occlusion area of a three-dimensional road-surface point cloud using multi-view images.
  • 3D spatial information has been applied to various fields such as urban landscape planning, disaster management systems, navigation, and internet map services.
  • Three-dimensional building models, the most important element of 3D spatial information, have conventionally been constructed by combining digital maps with building floor information, by using stereoscopic images together with sensor modeling, or by observing the shadow or relief displacement of buildings.
  • Methods using LiDAR (light detection and ranging) data, which directly acquire the height values of the points forming the ground surface with a distance-measuring device, have relied on projection, stitching, or warping for color restoration. However, such methods make it difficult to estimate colors clearly, and restoration often fails because of a lack of usable images or valid color information.
  • The present invention was devised to solve the above-described problem. An object according to an aspect of the present invention is to provide an apparatus and method for restoring the occlusion area of a three-dimensional point cloud on a road surface using multi-view images, so that the restored data can be utilized for lane extraction and map construction.
  • An apparatus for restoring a road surface occlusion area includes: abnormal point cloud removal logic for removing an abnormal point cloud from a three-dimensional point cloud obtained by a mobile mapping system (MMS); occlusion area detection logic for detecting an occlusion area in the image from which the abnormal point cloud has been removed by the abnormal point cloud removal logic; point cloud generation logic for generating a virtual point cloud in the occlusion area detected by the occlusion area detection logic; and occlusion area restoration logic for projecting each virtual point generated by the point cloud generation logic into multi-view images and changing the color of each virtual point in the multi-view images to grayscale to restore the occlusion area.
  • The occlusion area detection logic includes: point alignment logic for sorting the surrounding points in clockwise or counterclockwise order around any one point;
  • outer point detection logic for detecting an outer point of the occlusion area using the angle between two consecutive points centered on any one of the points sorted by the point alignment logic;
  • and occlusion area division logic for generating an outline connecting the outer points detected by the outer point detection logic to delimit the occlusion area.
  • The outer point detection logic detects the current point as an outer point when the angle between two consecutive points around that point exceeds the other surrounding angles by at least a set value.
  • The point cloud generation logic generates the virtual point cloud with a uniform density.
  • The occlusion area restoration logic includes: projection logic for projecting the coordinates of a virtual point in the occlusion area into the two-dimensional images; effective color determination logic for determining whether the color is valid based on the color of the point in each image corresponding to the virtual point projected by the projection logic; and average color application logic for changing the color of the corresponding points in the images to grayscale according to the determination result of the effective color determination logic and setting the changed color as the color of the virtual point.
  • The effective color determination logic determines the color to be valid if the variance of the colors of the corresponding points across the images is within a set variance value.
  • The effective color determination logic averages the colors of the corresponding points in the images, changes the average color to grayscale, and sets the changed grayscale color as the color of the virtual point.
  • Example-based restoration logic for setting the color of a virtual point using the point cloud information around that virtual point is further included.
  • The example-based restoration logic detects other points within a set area centered on the virtual point and sets the color of the virtual point according to the similarity of reflection intensity and color between the detected points and the virtual point.
  • A road surface occlusion area restoration method includes: removing, by abnormal point cloud removal logic, an abnormal point cloud from a three-dimensional point cloud obtained by a mobile mapping system (MMS); detecting, by occlusion area detection logic, an occlusion area in the image from which the abnormal point cloud has been removed; generating, by point cloud generation logic, a virtual point cloud in the detected occlusion area; and restoring the occlusion area by projecting, by occlusion area restoration logic, each virtual point generated by the point cloud generation logic into multi-view images and changing the color of each virtual point in the multi-view images to grayscale.
  • The occlusion area detection logic sorts the surrounding points in clockwise or counterclockwise order around any one point, detects the outer points of the occlusion area according to whether the angle between two consecutive points centered on any one of the sorted points is equal to or larger than a set angle, and then generates an outline connecting the detected outer points to delimit the occlusion area.
  • The point cloud generation logic generates the virtual point cloud with a uniform density.
  • The occlusion area restoration logic projects the coordinates of the virtual points in the occlusion area into the two-dimensional images, determines whether the color is valid based on the variance of the colors of the corresponding points in the images for each projected virtual point, and, according to the determination result, changes the color of the corresponding points in the images to grayscale and sets the changed color as the color of the virtual point.
  • The effective color determination logic averages the colors of the corresponding points in the images, changes the average color to grayscale, and sets the changed grayscale color as the color of the virtual point.
  • The method further includes detecting, by example-based restoration logic, other points within a set area centered on the virtual point and setting the color of the virtual point according to the similarity of reflection intensity and color between the detected points and the virtual point.
  • The present invention also provides a computer-readable medium recording a computer program that, when executed by a processor, performs at least: removing an abnormal point cloud from a three-dimensional point cloud obtained by a mobile mapping system (MMS); detecting an occlusion area in the image from which the abnormal point cloud has been removed; generating a virtual point cloud in the detected occlusion area; and restoring the occlusion area by projecting each virtual point of the generated virtual point cloud into multi-view images.
  • The apparatus and method for restoring a road surface occlusion area according to one aspect of the present invention can completely acquire lane information on the road surface by restoring three-dimensional information that could not be obtained because of occlusion. Based on this, a high-precision map at the lane level can be constructed, and the lane information can be used as high-quality input data for modules such as automatic lane extraction.
  • By using accurate projection together with multiple color images, the road surface occlusion area restoration apparatus and method according to another aspect of the present invention can estimate colors more clearly than conventional methods that use stitching or warping.
  • The road surface occlusion area restoration apparatus and method according to another aspect of the present invention can also significantly reduce the cost of obtaining lane information on the road surface.
  • FIG. 1 is a block diagram of a road surface occlusion area restoration apparatus according to an embodiment of the present invention.
  • FIG. 2 is a plan view of a mobile mapping system according to an embodiment of the present invention.
  • FIG. 3 is a side view of a mobile mapping system according to an embodiment of the present invention.
  • FIG. 4 is a diagram illustrating a road surface definition and abnormal point cloud removal method for a 3D point cloud according to an embodiment of the present invention.
  • FIG. 5 is a diagram illustrating a method for detecting an occlusion area of a road-surface point cloud according to an embodiment of the present invention.
  • FIG. 6 is a diagram illustrating a color correction method through the generation and projection of a three-dimensional point cloud of uniform density according to an embodiment of the present invention.
  • FIG. 7 is a diagram illustrating an example-based color restoration method according to an embodiment of the present invention.
  • FIG. 8 is a diagram illustrating a road surface occlusion area restoration method according to an embodiment of the present invention.
  • An apparatus for restoring a road surface occlusion area is operated by a processor; it receives various measurement information from a mobile mapping system (MMS) 10 and uses the measurement information to restore the occlusion area of the road surface.
  • The MMS 10 is mounted on a vehicle and acquires measurement information on the surrounding area by capturing or scanning images of the surrounding road.
  • The MMS 10 includes a plurality of cameras 11 that capture surrounding images to obtain image information, a plurality of GPS (Global Positioning System) receivers 13 that detect the position of the vehicle, a plurality of LiDAR (Light Detection and Ranging) sensors 12 that emit pulsed laser light to directly obtain the height values of points on the ground surface, and an IMU (Inertial Measurement Unit) 14.
  • A plurality of cameras 11 are provided, each photographing the surroundings, so that multi-view images can be obtained even for the same point.
  • The road surface occlusion area restoration apparatus 20 restores the occlusion area of the road surface using the measurement information transmitted from the MMS 10 and is operated by the processor.
  • The processor includes abnormal point cloud removal logic 21, occlusion area detection logic 22, point cloud generation logic 23, and occlusion area restoration logic 24.
  • The abnormal point cloud removal logic 21 removes the abnormal point cloud from the three-dimensional point cloud acquired by the LiDAR 12 of the MMS 10.
  • The abnormal point cloud removal logic 21 extracts only the point cloud corresponding to the road surface from the three-dimensional point cloud obtained through the LiDAR 12, as shown in FIG. 4. In this case, the abnormal point cloud removal logic 21 estimates the height of the vehicle and its altitude above sea level using the measurement information of the GPS 13 and the IMU 14, and recognizes the point cloud around the vehicle position as the point cloud of the road surface.
  • Specifically, the abnormal point cloud removal logic 21 scans the point cloud in various directions outward from the vehicle position, as shown in FIG. 4. If the direction angle to the scanned points exceeds a set angle, the scan of the three-dimensional point cloud is stopped, and the points scanned up to that position are merged into the road surface.
  • In this way, the abnormal point cloud removal logic 21 can define the point cloud of the road surface, as shown in (c) of FIG. 4, and can remove the abnormal point cloud of the object that caused the occlusion area.
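  • The radial scan just described can be pictured with the short Python sketch below. It is only an illustrative assumption, not the patent's implementation: the 72 sampling directions, the 0.2 m corridor width, and the 5-degree elevation threshold are made-up values, and the GPS/IMU-derived road height is simply passed in as vehicle_z.

```python
import numpy as np

def extract_road_surface(points, vehicle_xy, vehicle_z, max_angle_deg=5.0):
    """Rudimentary sketch of the radial road-surface scan described above.

    points        : (N, 3) LiDAR points (x, y, z)
    vehicle_xy    : (2,) vehicle position estimated from the GPS
    vehicle_z     : road-level height estimated from the GPS/IMU
    max_angle_deg : elevation-angle threshold that stops a scan direction
    """
    road_mask = np.zeros(len(points), dtype=bool)
    offsets = points[:, :2] - vehicle_xy
    dist = np.linalg.norm(offsets, axis=1)
    # Scan outward from the vehicle in a number of horizontal directions.
    for azimuth in np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False):
        direction = np.array([np.cos(azimuth), np.sin(azimuth)])
        along = offsets @ direction
        lateral = np.abs(offsets[:, 0] * direction[1] - offsets[:, 1] * direction[0])
        candidates = np.where((along > 0) & (lateral < 0.2))[0]  # narrow corridor
        for idx in candidates[np.argsort(dist[candidates])]:
            elevation = np.degrees(np.arctan2(points[idx, 2] - vehicle_z, dist[idx]))
            if abs(elevation) > max_angle_deg:
                break              # an occluding object starts here; stop this direction
            road_mask[idx] = True  # merge this point into the road surface
    return points[road_mask], points[~road_mask]
```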
  • The occlusion area detection logic 22 detects the occlusion area in the image from which the abnormal point cloud has been removed by the abnormal point cloud removal logic 21.
  • The occlusion area detection logic 22 includes point alignment logic 221, outer point detection logic 222, and occlusion area division logic 223.
  • The point alignment logic 221 detects any one point of the three-dimensional point cloud, as shown in (a) of FIG. 5, and sorts the surrounding points in clockwise or counterclockwise order around the detected current point.
  • Here, the current point is the point currently being processed among the plurality of points.
  • The outer point detection logic 222 detects the angle between two consecutive points centered on the current point among the points sorted by the point alignment logic 221, and recognizes the current point as an outer point when the detected angle exceeds the surrounding angles by at least the set value.
  • FIG. 5(b) shows the case in which the current point is recognized as an outer point because the angle at the current point is significantly larger than the other angles.
  • The occlusion area division logic 223 generates an outline connecting the outer points detected by the outer point detection logic 222 to delimit the occlusion area.
  • FIG. 5(c) illustrates an example in which a convex hull is generated from the outline formed by connecting the outer points to delimit the occlusion region.
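  • The boundary detection just described might be coded roughly as below. This is an illustrative reading only: the neighbourhood size k, the angular-gap margin, and the use of SciPy's convex hull as the outline of FIG. 5(c) are assumptions the patent does not specify.

```python
import numpy as np
from scipy.spatial import ConvexHull, cKDTree

def detect_outer_points(points_xy, k=8, gap_margin_deg=60.0):
    """Flag points whose largest angular gap between consecutive sorted
    neighbours clearly exceeds the typical gap (a rough outer-point test)."""
    k = min(k, len(points_xy) - 1)
    tree = cKDTree(points_xy)
    outer = []
    for i, p in enumerate(points_xy):
        _, idx = tree.query(p, k=k + 1)                 # k nearest neighbours (+ itself)
        neigh = points_xy[idx[1:]] - p
        angles = np.sort(np.arctan2(neigh[:, 1], neigh[:, 0]))  # sort around the point
        gaps = np.degrees(np.diff(np.r_[angles, angles[0] + 2.0 * np.pi]))
        if gaps.max() - np.median(gaps) > gap_margin_deg:
            outer.append(i)                             # a large empty gap faces the hole
    return np.array(outer, dtype=int)

def occlusion_outline(points_xy, outer_idx):
    """Connect the detected outer points with a convex hull to delimit the area."""
    hull = ConvexHull(points_xy[outer_idx])
    return points_xy[outer_idx][hull.vertices]          # ordered outline vertices
```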
  • The point cloud generation logic 23 generates a virtual point cloud in the occlusion area detected by the occlusion area detection logic 22, as shown in FIG. 6. In this case, the point cloud generation logic 23 generates the virtual point cloud with a uniform density in the occlusion area.
  • The point cloud generation logic 23 defines three-dimensional coordinates for each virtual point, but does not define reflection intensity or color.
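  • A uniform-density fill of the delimited area could look like the following sketch. The grid spacing, the assumed road height road_z, and the use of matplotlib's point-in-polygon test are illustrative choices; the patent only requires that the virtual points have uniform density and carry coordinates without intensity or color.

```python
import numpy as np
from matplotlib.path import Path

def generate_virtual_points(outline_xy, road_z, spacing=0.05):
    """Fill the occlusion outline with a uniform-density grid of virtual points.

    outline_xy : (M, 2) ordered outline vertices (e.g. the convex hull)
    road_z     : assumed road-surface height for the new points
    spacing    : grid spacing, i.e. the uniform density
    """
    xmin, ymin = outline_xy.min(axis=0)
    xmax, ymax = outline_xy.max(axis=0)
    xs = np.arange(xmin, xmax, spacing)
    ys = np.arange(ymin, ymax, spacing)
    grid = np.array([(x, y) for x in xs for y in ys])
    inside = Path(outline_xy).contains_points(grid)
    virtual_xy = grid[inside]
    # Only 3D coordinates are defined; reflection intensity and color stay unset.
    return np.c_[virtual_xy, np.full(len(virtual_xy), road_z)]
```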
  • The occlusion area restoration logic 24 projects each virtual point generated by the point cloud generation logic 23 into the two-dimensional multi-view images and changes the color of each virtual point in the multi-view images to grayscale to restore the occlusion area. The occlusion area restoration logic 24 performs this process for every virtual point in the occlusion area.
  • The occlusion area restoration logic 24 includes projection logic 241, effective color determination logic 242, average color application logic 243, and example-based restoration logic 244.
  • The projection logic 241 projects the three-dimensional coordinates of each virtual point into the images obtained at different viewpoints through the cameras 11.
  • As a result, each virtual point is located in each of the plurality of multi-view images; that is, three-dimensional coordinates are defined for each virtual point generated by the point cloud generation logic 23.
  • When these three-dimensional coordinates are projected into the two-dimensional multi-view images, the point corresponding to those coordinates is identified in each image, so a single virtual point can be observed from multiple views.
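  • The projection step amounts to an ordinary pinhole projection; a minimal sketch is given below. The camera intrinsics K and poses R, t are assumed to come from the MMS calibration, and the per-camera dictionary layout and nearest-pixel lookup are illustrative choices, not details from the patent.

```python
import numpy as np

def project_to_views(point_3d, cameras):
    """Project one virtual 3D point into every available camera image.

    cameras : list of dicts with 'K' (3x3 intrinsics), 'R' (3x3 rotation),
              't' (3,) translation and 'image' (H x W x 3 RGB array).
    Returns the pixel colors observed for this point, one row per view.
    """
    colors = []
    for cam in cameras:
        p_cam = cam['R'] @ point_3d + cam['t']          # world -> camera frame
        if p_cam[2] <= 0:
            continue                                    # behind this camera
        uvw = cam['K'] @ p_cam
        u, v = int(round(uvw[0] / uvw[2])), int(round(uvw[1] / uvw[2]))
        h, w = cam['image'].shape[:2]
        if 0 <= u < w and 0 <= v < h:                   # falls inside the image
            colors.append(cam['image'][v, u].astype(float))
    return np.array(colors)
```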
  • The effective color determination logic 242 determines whether the color is valid based on the color of the point in each image corresponding to the virtual point. In this case, the effective color determination logic 242 regards the color as valid if the variance of the colors of the corresponding points across the images is within a set variance value. For example, when the point is photographed normally by each camera 11, the colors of the corresponding point in the images are similar to one another, whereas when it is photographed abnormally, the colors of the corresponding point differ from image to image. If the variance is relatively large, it is likely that different objects were photographed at the various viewpoints and the color of the road surface was not obtained accurately.
  • That is, the effective color determination logic 242 determines the color of the corresponding point to be valid when the variance of the colors of the corresponding points in the images is within the set variance value, and determines that it is not a valid color otherwise.
  • The average color application logic 243 sets the color of the virtual point using the colors of the corresponding points in the images, according to the determination result of the effective color determination logic 242.
  • That is, the average color application logic 243 averages these color values and converts the average color to grayscale.
  • A point on the road surface where paint is applied has a relatively high reflection intensity, so it appears bright when its color is converted to grayscale, whereas a point where no paint is applied appears dark because of its relatively low reflection intensity.
  • Accordingly, when the average color application logic 243 converts the color to grayscale, the point appears bright or dark depending on whether paint is applied at that point.
  • The average color application logic 243 converts the average of the colors of the corresponding points in the images to grayscale and sets the converted color as the color of the corresponding virtual point.
  • If the virtual point is at a location where paint is applied, the virtual point is displayed in a bright color; if not, it is displayed in a dark color. This process is performed for all virtual points, and as a result, the occlusion area can be restored.
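  • A compact sketch of the effective-color test and the grayscale averaging follows. The variance threshold and the RGB-to-gray luminance weights are illustrative assumptions; the patent states only that the variance must lie within a set value and that the average color is converted to grayscale.

```python
import numpy as np

def restore_virtual_point_color(view_colors, var_threshold=200.0):
    """Return a grayscale value for the virtual point, or None if no valid color.

    view_colors : (V, 3) RGB colors of the same virtual point seen in V images.
    """
    if len(view_colors) == 0:
        return None
    # Effective-color test: the colors must agree across the views.
    if np.mean(np.var(view_colors, axis=0)) > var_threshold:
        return None                        # different objects were probably seen
    mean_rgb = view_colors.mean(axis=0)    # average the colors over the views
    gray = float(mean_rgb @ np.array([0.299, 0.587, 0.114]))  # convert to grayscale
    return gray                            # bright for painted markings, dark otherwise
```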
  • The example-based restoration logic 244 sets the color of a virtual point using the point cloud information around that virtual point.
  • To this end, the example-based restoration logic 244 sets an arbitrary area around the virtual point and detects other points within this set area.
  • The example-based restoration logic 244 then sets the color of the virtual point according to the similarity of reflection intensity and color between the detected points and the virtual point.
  • Here, the detected other points are virtual points that have already been restored or previously detected points existing outside the occlusion area, not unrestored virtual points inside the occlusion area.
  • The virtual point cloud in the occlusion area is generated with a uniform density, and the colors of the virtual points in the set area (the target restoration area) are restored according to the reflection intensity and color similarity between the virtual points and the other points, as shown in FIG. 7.
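  • One plausible reading of this example-based step is a similarity-weighted average over already-colored neighbours, sketched below. The search radius and the Gaussian distance and intensity weights are assumptions; the patent speaks only of similarity of reflection intensity and color without giving a formula.

```python
import numpy as np
from scipy.spatial import cKDTree

def example_based_restore(target_xyz, neighbor_xyz, neighbor_gray,
                          neighbor_intensity=None, target_intensity=None,
                          radius=0.5, sigma_d=0.2, sigma_i=10.0):
    """Sketch: color an unrestored virtual point from nearby restored examples.

    neighbor_* arrays hold already-restored virtual points and real points
    outside the occlusion area; unrestored virtual points are excluded.
    """
    tree = cKDTree(neighbor_xyz)
    idx = np.array(tree.query_ball_point(target_xyz, r=radius), dtype=int)
    if idx.size == 0:
        return None
    d = np.linalg.norm(neighbor_xyz[idx] - target_xyz, axis=1)
    w = np.exp(-(d ** 2) / (2.0 * sigma_d ** 2))            # closer examples count more
    if target_intensity is not None and neighbor_intensity is not None:
        di = neighbor_intensity[idx] - target_intensity     # optional intensity similarity
        w *= np.exp(-(di ** 2) / (2.0 * sigma_i ** 2))
    return float(np.sum(w * neighbor_gray[idx]) / np.sum(w))
```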
  • FIG. 8 is a diagram illustrating a road surface occlusion area restoration method according to an embodiment of the present invention.
  • The LiDAR 12 inside the MMS 10 detects a three-dimensional point cloud.
  • The abnormal point cloud removal logic 21 extracts the point cloud corresponding to the road surface from the three-dimensional point cloud obtained through the LiDAR 12 (S10).
  • The abnormal point cloud removal logic 21 then defines the point cloud of the road surface and removes the abnormal point cloud of the object that causes the occlusion area (S20).
  • The occlusion area detection logic 22 detects the occlusion area in the image from which the abnormal point cloud has been removed by the abnormal point cloud removal logic 21 (S30).
  • To this end, the occlusion area detection logic 22 detects any one point of the three-dimensional point cloud and sorts the surrounding points in clockwise or counterclockwise order around the detected current point. The occlusion area detection logic 22 then detects the angle between two consecutive points centered on the current point and recognizes the current point as an outer point when the detected angle is equal to or larger than a set value. As the outer points are recognized in this way, the occlusion area detection logic 22 generates an outline connecting the outer points to delimit the occlusion area. In this case, a convex hull composed of the outer points may be generated to delimit the occlusion area.
  • As the occlusion area is detected by the occlusion area detection logic 22, the point cloud generation logic 23 generates a virtual point cloud of uniform density in the detected occlusion area (S40). In this case, three-dimensional coordinates are defined for each virtual point, whereas reflection intensity and color are not defined.
  • The occlusion area restoration logic 24 projects the three-dimensional coordinates of each virtual point generated by the point cloud generation logic 23 into the images captured by the cameras 11 and detects the color of the corresponding point in each image for the projected virtual point (S50).
  • The occlusion area restoration logic 24 determines whether the color is valid based on the colors of the corresponding points in the images for the virtual point projected by the projection logic 241 (S60). In this case, the effective color determination logic 242 determines the color of the point to be valid if the variance of the colors of the corresponding points in the images is within the set variance value, and determines that it is not a valid color otherwise.
  • The average color application logic 243 averages these color values, converts the average color to grayscale, and sets the converted color as the color of the corresponding virtual point.
  • The example-based restoration logic 244 sets an arbitrary area around a virtual point and detects other points within this set area. It then restores the occlusion area by setting the color of the virtual point according to the similarity of reflection intensity and color between the detected points and the virtual point (S80).
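  • Putting steps S10 to S80 together, the overall flow could be orchestrated roughly as below. The function names reuse the earlier sketches and are assumptions rather than the patent's own code; a real pipeline would also add the real points outside the occlusion area to the example pool used in the last step.

```python
import numpy as np

def restore_occlusion_area(points, vehicle_xy, vehicle_z, cameras):
    """End-to-end sketch of S10-S80 built from the helper sketches shown earlier."""
    # S10/S20: keep the road-surface points, drop the occluding object's points.
    road, _removed = extract_road_surface(points, vehicle_xy, vehicle_z)

    # S30: delimit the occlusion area from the remaining road points.
    outer = detect_outer_points(road[:, :2])
    outline = occlusion_outline(road[:, :2], outer)

    # S40: fill the area with a uniform-density virtual point cloud.
    virtual = generate_virtual_points(outline, road_z=vehicle_z)

    # S50-S70: project each virtual point into the multi-view images and
    # assign a grayscale color when the observed colors are consistent.
    grays = [restore_virtual_point_color(project_to_views(p, cameras))
             for p in virtual]

    # S80: example-based fallback for the points that stayed uncolored
    # (assumes at least some virtual points received a multi-view color).
    colored = np.array([g is not None for g in grays])
    known_xyz = virtual[colored]
    known_gray = np.array([g for g in grays if g is not None])
    for i in np.where(~colored)[0]:
        grays[i] = example_based_restore(virtual[i], known_xyz, known_gray)
    return virtual, grays
```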
  • As described above, the apparatus and method for restoring a road surface occlusion area can completely acquire lane information on the road surface by restoring three-dimensional information that could not be obtained because of occlusion. This can be used to construct high-precision maps at the lane level, and the lane information can serve as high-quality input data for modules such as automatic lane extraction.
  • In addition, by using accurate projection and multiple color images, the apparatus and method for restoring the road surface occlusion area enable more accurate color estimation than conventional stitching or warping methods.
  • Furthermore, the cost of obtaining lane information on the road surface can be greatly reduced.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Probability & Statistics with Applications (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Image Generation (AREA)

Abstract

The present invention relates to an apparatus and method for restoring a road surface occlusion area. The apparatus for restoring a road surface occlusion area according to the present invention comprises: abnormal point cloud removal logic for removing abnormal point clouds from a three-dimensional point cloud acquired by means of a mobile mapping system (MMS); occlusion area detection logic for detecting an occlusion area in an image from which the abnormal point clouds have been removed by the abnormal point cloud removal logic; point cloud generation logic for generating a virtual point cloud in the occlusion area detected by the occlusion area detection logic; and occlusion area restoration logic for projecting each virtual point generated by the point cloud generation logic into multi-view images and changing the color of each virtual point within the multi-view images to grayscale to restore the occlusion area.
PCT/KR2017/013839 2016-11-30 2017-11-29 Apparatus and method for restoring a road surface occlusion area Ceased WO2018101746A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2016-0161367 2016-11-30
KR1020160161367A KR102790812B1 (ko) 2016-11-30 2016-11-30 Apparatus and method for restoring a road surface occlusion area

Publications (2)

Publication Number Publication Date
WO2018101746A2 true WO2018101746A2 (fr) 2018-06-07
WO2018101746A3 WO2018101746A3 (fr) 2018-08-16

Family

ID=62241576

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/013839 Ceased WO2018101746A2 (fr) 2016-11-30 2017-11-29 Apparatus and method for restoring a road surface occlusion area

Country Status (2)

Country Link
KR (1) KR102790812B1 (fr)
WO (1) WO2018101746A2 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102150954B1 (ko) * 2020-04-06 2020-09-02 주식회사 맥스트 점군 정보 가공 장치 및 방법
KR102568907B1 (ko) 2020-08-20 2023-08-23 (주)신한항업 폐색영역 검출을 위한 기계학습용 데이터셋 생성 방법과 데이터셋 생성시스템
KR102747679B1 (ko) 2022-11-30 2025-01-02 (주)신한항업 폐색영역 영상 자동 복원을 위한 mms 연계 방법과 영상처리시스템

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101367284B1 (ko) * 2008-01-28 2014-02-26 삼성전자주식회사 시점 변화에 따른 영상 복원 방법 및 장치
KR101289885B1 (ko) * 2009-10-22 2013-07-24 한국전자통신연구원 건물의 모델링을 위한 장치 및 방법
KR100995400B1 (ko) * 2010-06-29 2010-11-19 한진정보통신(주) 지상 라이다를 이용한 건물 외곽선 추출 시스템 및 그 방법
KR101683164B1 (ko) * 2010-09-10 2016-12-05 삼성전자주식회사 폐색 영역 복원 장치 및 방법
US8429195B2 (en) * 2011-05-13 2013-04-23 Hntb Holdings Ltd Managing large datasets obtained through a survey-data-acquisition process
KR101219767B1 (ko) * 2011-07-04 2013-01-17 (주)아세아항측 수치지형도 작성을 위한 차량 모바일 매핑 시스템을 이용한 도로 레이어 현지조사 방법
KR101408719B1 (ko) * 2012-09-11 2014-06-18 (주)리얼디스퀘어 3차원 영상의 스케일 변환 장치 및 그 방법
KR101677972B1 (ko) * 2016-07-26 2016-11-21 주식회사 싸인텔레콤 촬영영역줌카메라모듈·다차선 촬영영역 생성모듈로 이루어진 스마트 다차선 차량 번호인식 장치 및 방법

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114594486A (zh) * 2020-12-04 2022-06-07 上海禾赛科技有限公司 滤除雷达点云中的拖点的方法、处理器以及激光雷达系统
US20220258733A1 (en) * 2021-02-12 2022-08-18 Honda Motor Co., Lt.D Division line recognition apparatus
CN116844131A (zh) * 2022-03-22 2023-10-03 比亚迪股份有限公司 车辆控制方法、服务器、车辆和存储介质
US20230401728A1 (en) * 2022-06-08 2023-12-14 GM Global Technology Operations LLC System and method for occlusion reconstruction in surround views using temporal information
US12475582B2 (en) * 2022-06-08 2025-11-18 GM Global Technology Operations LLC System and method for occlusion reconstruction in surround views using temporal information

Also Published As

Publication number Publication date
KR20180061803A (ko) 2018-06-08
KR102790812B1 (ko) 2025-04-04
WO2018101746A3 (fr) 2018-08-16

Similar Documents

Publication Publication Date Title
WO2018101746A2 (fr) Appareil et procédé de reconstruction d'une zone bloquée de surface de route
CN114485579B (zh) 海面测量系统、海面测量方法以及存储介质
CN105279372B (zh) 一种确定建筑物高度的方法和装置
JP6363863B2 (ja) 情報処理装置および情報処理方法
WO2012176945A1 (fr) Appareil destiné à synthétiser des images tridimensionnelles pour visualiser des environnements de véhicule et procédé associé
WO2011112028A2 (fr) Procédé de génération d'image stéréoscopique et dispositif associé
CN114862973B (zh) 基于固定点位的空间定位方法、装置、设备及存储介质
WO2012108721A2 (fr) Procédé et dispositif pour produire une réalité augmentée au moyen de données images
WO2020235734A1 (fr) Procédé destiné à estimer la distance à un véhicule autonome et sa position au moyen d'une caméra monoscopique
WO2014035103A1 (fr) Appareil et procédé de surveillance d'objet à partir d'une image capturée
WO2010076988A2 (fr) Procédé d'obtention de données d'images et son appareil
CN106767526A (zh) 一种基于激光mems振镜投影的彩色多线激光三维测量方法
WO2012148025A1 (fr) Dispositif et procédé servant à détecter un objet tridimensionnel au moyen d'une pluralité de caméras
KR100934904B1 (ko) 거리 추정 장치 및 추정 방법
WO2011136407A1 (fr) Appareil et procédé de reconnaissance d'image à l'aide d'un appareil photographique stéréoscopique
WO2016204402A1 (fr) Procédé d'inspection de défaut de composant, et appareil associé
JP2015072577A (ja) 移動体検知装置
WO2020071849A1 (fr) Procédé de production d'une image détaillée à 360° à l'aide d'informations de profondeur réelle de mesure
WO2013025011A1 (fr) Procédé et système de suivi d'un corps permettant de reconnaître des gestes dans un espace
WO2017195984A1 (fr) Dispositif et procédé de numérisation 3d
WO2019098421A1 (fr) Dispositif de reconstruction d'objet au moyen d'informations de mouvement et procédé de reconstruction d'objet l'utilisant
WO2011071313A2 (fr) Procédé et appareil d'extraction d'une image de texture et d'une image de profondeur
WO2019139441A1 (fr) Dispositif et procédé de traitement d'image
CN110634136B (zh) 一种管道壁破损检测方法、装置及系统
WO2014046325A1 (fr) Système de mesure tridimensionnel et son procédé

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17875728

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17875728

Country of ref document: EP

Kind code of ref document: A2