
WO2019151704A1 - Method and system for three-dimensional visibility measurement using a traffic monitoring camera - Google Patents


Info

Publication number
WO2019151704A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
depth
continuity
visibility
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2019/000927
Other languages
English (en)
Korean (ko)
Inventor
오원영
양승환
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
GEOSPATIAL INFORMATION TECHNOLOGY Co Ltd
Original Assignee
GEOSPATIAL INFORMATION TECHNOLOGY Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by GEOSPATIAL INFORMATION TECHNOLOGY Co Ltd filed Critical GEOSPATIAL INFORMATION TECHNOLOGY Co Ltd
Publication of WO2019151704A1
Anticipated expiration
Current legal status: Ceased

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G08G1/017Detecting movement of traffic to be counted or controlled identifying vehicles
    • G08G1/0175Detecting movement of traffic to be counted or controlled identifying vehicles by photographing vehicles, e.g. when violating traffic rules
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/095Traffic lights
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/271Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074Stereoscopic image analysis
    • H04N2013/0081Depth or disparity estimation from stereoscopic image signals

Definitions

  • An embodiment of the present invention relates to a three-dimensional visibility measurement method using a traffic monitoring camera, and more particularly to a method that can calculate visibility information while performing traffic monitoring, and a system using the same.
  • CCTVs (closed-circuit televisions) can be classified into CCTVs for simple monitoring, CCTVs for preventive purposes, and CCTVs for analysis and tracking for specific purposes.
  • Weather monitoring CCTVs can also be used to prevent drivers from facing the risk of traffic accidents in the event of visibility disturbances such as haze, mist, and fog.
  • A CCTV-combined weather monitoring system installs weather measurement equipment on the roadside to prevent accidents caused by fog, measures visibility at all times, and guides drivers to maintain an appropriate driving speed when the visible distance decreases.
  • However, the visibility system currently in use is an optical visibility system that calculates visibility based on light scattering or transmission.
  • This type of visibility system has the disadvantage of poor accuracy when a visibility disturbance occurs.
  • Due to its characteristics, the optical visibility system does not measure the whole target area; it represents the visibility of the area by a value measured at a single point. Even if fog occurs only 1 m away from the system, there is the serious problem that the system cannot detect the fog itself. For this reason, when a visibility system is installed, a CCTV system is installed at the same place so that monitoring can be performed or observation results obtained by a person's direct visit can be reflected.
  • As related prior art, the weather observation system and method of Korean Patent Publication No. 10-2009-0051785 is known.
  • The prior art measures visibility using a visual line and a road model obtained from the moving area of a vehicle.
  • The related art comprises a communication unit for receiving an image signal from a camera installed on a road and transmitting a result; an image processing unit for extracting the moving region of the vehicle from the received image signal and determining a visual line using the extracted moving region; and a control unit for calculating the visibility corresponding to the determined line of sight using a correction calculation function.
  • The present invention has been made to solve the problems of the prior art. Its purpose is to provide a three-dimensional visibility measurement method and system that, by adding only one camera to an existing traffic monitoring system, can accurately perform the two functions of monitoring and visibility measurement.
  • Another object of the present invention is to provide a three-dimensional visibility measurement method and system that create a depth map from the first image and the second image, create iso-depth lines connecting points of equal depth in the depth map, and read the connection continuity of the iso-depth lines to calculate the visible distance; that provide the calculated visibility information to the driver through a display device; and that, when the visible distance falls below a certain distance, operate a safety indicator or display an image to the driver through various display devices inside and outside the vehicle.
  • A three-dimensional visibility measurement system for solving the above technical problem includes a first camera that captures a first image; a second camera that captures a second image from an angle different from that of the first camera; a controller configured to calculate visible distance information from the first image captured by the first camera and the second image captured by the second camera; a storage unit that stores the first image, the second image, and the visible distance information in association with the controller; and a communication module that supports network interworking between the controller, the first camera, the second camera, and a control server. The controller includes a stereo image conversion module configured to generate a depth map using the first image and the second image, and a visibility measurement module that generates iso-depth lines from the depth map and detects the connection continuity of the iso-depth lines to determine whether they have a certain level of continuity or more.
  • A three-dimensional visibility measurement method comprises: acquiring a stereo image from two cameras and generating a depth map; creating iso-depth lines from the depth map in a manner similar to contour lines; determining the connection continuity of the iso-depth lines; and, when the connection continuity of the iso-depth lines has a certain level or more of continuity, acquiring the maximum value of the iso-depth lines among the images having continuity, the maximum value corresponding to the current visible distance.
  • the three-dimensional visibility measurement method may further comprise determining the authenticity of the image in the acquiring step.
  • The 3D visibility measurement method may further include extracting edges from the original image and analyzing them overlapped with the iso-depth lines in order to increase the accuracy of the continuity determination.
  • Because edges cannot be extracted in areas of the image affected by visibility obstacles such as fog, the method may further include overlapping and analyzing the iso-depth lines of a plurality of images. In this case, matching errors due to the fog can be eliminated, so a more accurate visible distance can be determined.
  • A method for measuring 3D visibility uses a first image captured by a first camera and a second image captured by a second camera at an angle different from that of the first camera.
  • A depth map is generated from the first image and the second image, the visible distance to an object is calculated, and the visible distance is provided to the driver through a display; when the visible distance falls below a certain distance, a safety indicator light is turned on and an image is provided through the display. This is more economical than a conventional visibility system and can provide more accurate visibility information to the driver, thereby contributing to reducing traffic accidents caused by fog.
  • FIG. 1 is a block diagram of a three-dimensional visibility measurement system using a traffic monitoring camera according to an embodiment of the present invention.
  • FIG. 2 is a diagram illustrating normal images of the first and second cameras and images at the time of fog generation that may be used in the 3D visibility measurement system according to an exemplary embodiment of the present invention.
  • FIG. 3 is a diagram illustrating a depth map during normal and fog generation that can be used in a three-dimensional visibility measurement system according to an embodiment of the present invention.
  • FIG. 4 is an exemplary view of iso-depth-line extraction images, in normal conditions and when fog occurs, that can be used in the three-dimensional visibility measurement system according to an embodiment of the present invention.
  • FIG. 5 is an exemplary diagram of boundary image extraction images during normal and fog generation that can be used in a 3D visibility measurement system according to an exemplary embodiment of the present invention.
  • FIG. 6 is an exemplary view of an image obtained by superimposing an extracted depth line image and a boundary line extracted image during normal and fog generation that can be used in a 3D visibility measurement system according to an exemplary embodiment of the present invention.
  • FIG. 7 is a view showing the operation of the display unit and the safety display device according to an embodiment of the present invention.
  • FIG. 8 is a flowchart illustrating a three-dimensional visibility measurement method according to an embodiment of the present invention.
  • FIG. 1 is a block diagram of a three-dimensional visibility measurement system using a traffic monitoring camera according to an embodiment of the present invention.
  • The 3D visibility measuring system 100 includes a first image receiving unit 111, a second image receiving unit 112, an infrared light emitting unit 114, a pan/tilt driving unit 116, a controller 120, a storage unit 130, a communication unit 140, and a display unit 150.
  • the first image receiver 111 is connected to the first camera to obtain a first image from the first camera.
  • The first image receiving unit 111 may include a receiving port that receives the first image from the first camera, or may include a means for connecting to the first camera, according to a preset configuration or under the control of the controller 120, and reading the first image of the first camera, or a component that performs the operation corresponding to that means.
  • the second image receiver 112 is connected to the second camera similarly to the first image receiver 111 to obtain a second image from the second camera.
  • The second image receiving unit 112 may include a receiving port that receives the second image from the second camera, or may include a means for connecting to the second camera, according to a preset configuration or under the control of the controller 120, and reading the second image stored by the second camera, or a component that performs the operation corresponding to that means.
  • the first camera and the second camera may be spaced apart by a predetermined interval to generate a stereo image.
  • the first camera and the second camera may be stereo cameras depending on the implementation.
  • the first and second image receivers 111 and 112 may be separate components, but are not limited thereto and may be implemented as a single image receiver.
  • the infrared light emitter 114 may include an infrared light emitter, and may be coupled to the first camera and the second camera so that the first camera and the second camera may be used for photographing at night or during fog.
  • The infrared light emitter 114 may include an infrared light emitter driver for controlling the operation of the infrared light emitter. In this case, the infrared light emitter driver may be controlled by the controller 120.
  • The pan/tilt driving unit 116 is a single piece of hardware installed in a single housing holding the first camera and the second camera, and enables the first and second cameras to obtain a first image and a second image forming a stereo image.
  • the pan / tilt driver 116 may include a single drive shaft coupled to the housing such that a pan operation or a tilt operation of the first camera and the second camera is interlocked with each other.
  • the pan / tilt driver 116 may include a driving circuit or a software driving module for controlling the pan / tilt driving unit.
  • the pan / tilt driver 116 may include a first zoom driver coupled to the first camera and a second zoom driver coupled to the second camera.
  • the controller 120 controls the operation of each component of the 3D visibility measurement system.
  • the controller 120 may include a processor or a microprocessor.
  • the controller 120 may include a stereo image conversion module 121, a visibility measurement module 122, and a PTZ control module 123.
  • The stereo image conversion module 121 may pre-process the first image of the first camera and the second image of the second camera, or stereo vision image data combining them (hereinafter simply referred to as 'image data', 'source data', or 'first image data'), side by side or in pairs; remove vertical errors of the pre-processed image data; perform geometric correction; or generate a depth map using the geometrically corrected image data.
  • The stereo image conversion module 121 may extract boundary lines corresponding to edges in the image and generate image data (hereinafter, 'correction measurement image data' or 'second image data') in which the iso-depth lines and the boundary lines overlap.
  • The stereo image conversion module 121 may extract a depth map from the first image and the second image and convert the depth information from the depth map into RGB values.
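The conversion of depth information into RGB values mentioned above can be sketched as a simple normalization and colour mapping. This is a minimal illustration: rendering near points red and far points blue is one common convention, not necessarily the mapping used in the patent.

```python
import numpy as np

def depth_to_rgb(depth, d_min=None, d_max=None):
    """Map a depth map (in metres) to an 8-bit pseudo-colour image.

    Near points are rendered red and far points blue; this convention is
    an assumption, since the patent only states that depth information is
    converted to RGB values.
    """
    depth = np.asarray(depth, dtype=float)
    d_min = depth.min() if d_min is None else d_min
    d_max = depth.max() if d_max is None else d_max
    # Normalize depth into [0, 1]; guard against a flat depth map.
    t = np.clip((depth - d_min) / max(d_max - d_min, 1e-9), 0.0, 1.0)
    rgb = np.zeros(depth.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = np.rint(255 * (1.0 - t)).astype(np.uint8)  # red = near
    rgb[..., 2] = np.rint(255 * t).astype(np.uint8)          # blue = far
    return rgb

depth = np.array([[1.0, 5.0], [10.0, 20.0]])
print(depth_to_rgb(depth)[0, 0])  # nearest pixel: pure red [255 0 0]
```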
  • the visibility measurement module 122 may calculate or obtain a visible distance from the second image data.
  • The information calculated or acquired by the visibility measurement module 122 may be used, when fog occurs, to operate a safety display device installed as traffic support equipment or to provide safe-driving information to a roadside display device or an in-vehicle display device.
  • the information may include visibility information.
  • the PTZ control module 123 is a means for controlling the operation of the first camera and the second camera.
  • the PTZ control module 123 may capture a detailed view of the target or the target area through the pan, tilt, and zoom functions.
  • the PTZ control module 123 may control a pan or tilt operation of the pan / tilt driver 116 coupled to the first camera and the second camera so as to focus a desired position in the target area where the first camera and the second camera are set. Can be.
  • the PTZ control module 123 controls the pan / tilt driver 116 to rotate the first camera and the second camera up, down, left, and right to perform 360-degree omnidirectional measurements from the point where the camera is installed.
  • the PTZ control module 123 may simultaneously control the zoom magnifications of the first camera and the second camera at the same magnification or individually adjust the zoom magnifications to different magnifications.
  • The visibility measurement module 122 may automatically operate the pan/tilt driver 116 and the zoom lens of the camera at a predetermined time interval through the PTZ control module 123, but is not limited thereto and may be implemented to operate in other ways.
  • The stereo image conversion module 121, the visibility measurement module 122, and the PTZ control module 123 described above may be implemented at least in part as hardware in the controller 120, or the controller 120 may load at least one of the stereo image conversion module 121, the visibility measurement module 122, and the PTZ control module 123 stored in the storage 130 and execute its functions.
  • the storage unit 130 may include storage means such as a semiconductor memory such as a RAM or a ROM, a flash memory, a hard disk, an optical disk, or the like.
  • the storage unit 130 may store at least one or more of the stereo image conversion module 121, the visibility measurement module 122, and the PTZ control module 123 in the form of a software module.
  • the communication unit 140 may support transmission and reception of signals and data between the control unit 120 and each component in the 3D visibility measurement system. That is, the communication unit 140 may include a means or a component for providing or supporting a connection for interworking between the input terminal and the output terminal between the process units in the system.
  • The communication unit 140 may include at least one communication subsystem supporting a wired communication network, a wired network, a wireless communication network, a wireless network, or a combination thereof.
  • Through the communication unit 140, the control unit 120 may transmit and receive signals and data with the safety display device 160, the control server 170, or the display device 180.
  • the safety display device 160 may include a fog generation warning device using sound, light, and the like, a vehicle induction lamp, a flashing light, or the like installed in a center line, a lane, a guide rail, and the like in a fog area.
  • the safety display device 160 interlocks with the 3D visibility measurement system 100 and may operate according to the control signal of the system.
  • the control server 170 may collect and monitor the signals or information produced or output from the 3D visibility measurement system 100 and the status of peripheral devices operating in accordance with the signals or information.
  • The control server 170 may classify a fog area, or detect a change, spread, or reduction of a fog area, based on the information of a plurality of adjacent three-dimensional visibility measurement systems, and may operate an additional traffic safety system based on the obtained information.
  • The control server 170 may receive the visibility information, the first and second images 201, 201a, 202, and 202a, or the information of the safety display device 160, and may control the first and second cameras, the safety display device 160, or the display device 180.
  • the display device 180 may include a device installed at a roadside and displaying only text or both text and an image.
  • the display device 180 may include all image display devices operating in an intelligent traffic system (ITS).
  • the display device 180 may include various image display devices that are located on a road or installed in a vehicle driving a road.
  • In a broad sense, the three-dimensional visibility measurement system 100 may be implemented to further include the first camera, the second camera, the safety display device 160, the control server 170, and an external display device 180.
  • the first camera photographs the first image 201 and the second camera photographs the second image 202.
  • the first camera may photograph the first image 201a and the second camera may photograph the second image 202a.
  • Since the camera used in the existing traffic monitoring system serves as the first camera, the system can be implemented to perform the two functions of monitoring and visibility measurement by adding only one additional camera as the second camera.
  • the first camera and the second camera may be configured to capture an image in 360 ° by using a zoom lens having a variable focal length and a pan / tilt device capable of rotating up, down, left, and right.
  • the first camera and the second camera may simultaneously provide a first image and a second image of roads and streets to monitor traffic at different zoom magnifications.
  • each of the first image and the second image may be an enlarged image including an object such as a vehicle.
  • The first camera may be zoomed in while the second camera is zoomed out.
  • the first camera may be implemented to display a general image and the second camera may display an image to which the fog removing function is applied.
  • The controller 120 interoperates with the first camera and the second camera and acquires, as inputs, the first images 201 and 201a captured by the first camera and the second images 202 and 202a captured by the second camera. As illustrated in (a) and (b) of FIG. 3, images 203 and 204 including depth map information may be generated. The depth map can be used to calculate the visible distance information.
  • the controller 120 may include a stereo image conversion module 121, a visibility measurement module 122, and a PTZ (pan / tilt / zoom) control module 123.
  • The stereo image conversion module 121 generates a depth map using the first image captured by the first camera and the second image captured by the second camera, creates iso-depth lines from the depth map (see 205 and 206 of FIG. 4), and may extract boundary lines from the original image (source data) (see 207 and 208 of FIG. 5).
  • The stereo image conversion module 121 may correct the first and second images 201 or 201a and 202 or 202a, and then generate a depth map using the corrected first and second images. By performing this calibration process, a more accurate depth map can be generated from the data.
  • The stereo image conversion module 121 determines conjugate points from the intersections of the projection centers and reference points on the image planes of the first and second images 201, 201a, 202, and 202a, and is implemented to generate a depth map from the conjugate points using epipolar geometry.
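For a rectified stereo pair, the depth of a conjugate (matching) point follows from standard triangulation: Z = f·B/d, where f is the focal length in pixels, B the baseline between the two cameras, and d the disparity between the matched columns. The sketch below illustrates this relationship with made-up numbers; the patent does not specify camera parameters.

```python
# Standard rectified-stereo triangulation: a conjugate point seen at
# column x_left in the first image and x_right in the second has depth
# Z = f * B / (x_left - x_right). The focal length and baseline values
# below are illustrative assumptions, not taken from the patent.
def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    disparity = float(x_left - x_right)
    if disparity <= 0:
        raise ValueError("conjugate points must have positive disparity")
    return focal_px * baseline_m / disparity

# A point with 8 px disparity, f = 800 px, B = 0.5 m, lies 50 m away.
print(depth_from_disparity(408, 400, focal_px=800, baseline_m=0.5))  # 50.0
```

Note the inverse relationship: as fog limits how far matching texture survives, usable disparities shrink toward zero and the farthest recoverable depth drops accordingly.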
  • the visibility measurement module 122 is implemented to detect the connection continuity using the depth line and the boundary line, and determine whether the connection continuity has a certain level or more of continuity.
  • The visibility measurement module 122 extracts edges from the original images of the first image captured by the first camera and the second image captured by the second camera, and may analyze the extracted edges overlapped with the iso-depth lines (see images 209 and 210 of FIG. 6).
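The edge/iso-depth overlap described above can be sketched as follows. The gradient-magnitude edge detector and the overlap-ratio statistic are illustrative assumptions; the patent names neither a specific edge extractor nor a specific overlap measure.

```python
import numpy as np

def edge_mask(img, thresh=30):
    """Simple gradient-magnitude edge detector; a stand-in for whatever
    edge extractor the system actually uses (the patent does not say)."""
    img = np.asarray(img, dtype=float)
    gx = np.abs(np.diff(img, axis=1, prepend=img[:, :1]))
    gy = np.abs(np.diff(img, axis=0, prepend=img[:1, :]))
    return (gx + gy) > thresh

def overlap_ratio(edges, iso_depth):
    """Fraction of iso-depth-line pixels supported by an image edge.
    A low ratio suggests fog has washed out texture at that depth."""
    iso = np.asarray(iso_depth, dtype=bool)
    return float(np.logical_and(edges, iso).sum()) / max(int(iso.sum()), 1)

img = np.zeros((4, 4)); img[:, 2:] = 100          # vertical step edge
iso = np.zeros((4, 4), dtype=bool); iso[:, 2] = True
print(overlap_ratio(edge_mask(img), iso))          # 1.0: every iso pixel sits on an edge
```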
  • the visibility measurement module 122 may be implemented to be optimized by additionally applying a method of extracting a plurality of consecutive images and a statistical method in order to reduce a determination error at a boundary where visibility is unclear when fog occurs.
  • An iso-depth line means a line connecting points of equal distance on the depth map, similar to a contour line connecting points of the same height relative to mean sea level. If an image is formed so that an object is accurately distinguished from the surrounding background, it may be determined that the image has continuity between contiguous image elements (pixels).
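The contour-line analogy above can be made concrete: the pixels an iso-depth line at a given level would connect are those whose depth lies within a tolerance of that level. The tolerance value here is an assumption for illustration.

```python
import numpy as np

def iso_depth_mask(depth, level, tol=0.5):
    """Pixels whose depth lies within `tol` of `level`: the set of points
    an iso-depth line at that level connects (a contour-line analogue on
    the depth map rather than on a terrain map). `tol` is an assumed
    band half-width, not a value from the patent."""
    depth = np.asarray(depth, dtype=float)
    return np.abs(depth - level) <= tol

depth = np.array([[10.0, 10.2, 30.0],
                  [10.4, 25.0, 30.1]])
mask = iso_depth_mask(depth, level=10.0)
print(mask.sum())  # 3 pixels lie on the 10 m iso-depth band
```

A connected-component check on such a mask is one way to judge the "connection continuity" the text goes on to describe.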
  • The visibility measurement module 122 may examine the continuity of the iso-depth lines, acquire the maximum value only when the continuity is above a predetermined level, and determine that this value is the visible distance.
  • the visibility distance at that time can be said to be the theoretical maximum that can be obtained from stereo images.
  • If the iso-depth lines have a continuity of a predetermined level or more, the visibility measurement module 122 obtains the maximum value among the iso-depth lines having continuity and calculates that maximum value as the visible distance information.
  • If the iso-depth lines do not have a continuity of a predetermined level or more, the visibility measurement module 122 calculates the minimum value of the iso-depth lines as the visible distance information.
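The selection rule in the two bullets above can be sketched directly: take the farthest sufficiently continuous iso-depth line, or fall back to the nearest line when none is continuous. The per-line continuity scores and the 0.8 threshold are illustrative assumptions; the patent only speaks of "a predetermined level" of continuity.

```python
import numpy as np

def visible_distance(iso_depths, continuities, min_continuity=0.8):
    """If any iso-depth line is sufficiently continuous, the farthest such
    line gives the visible distance; otherwise fall back to the nearest
    line. `continuities` are assumed per-line scores in [0, 1]; the 0.8
    threshold is an assumption, not a value from the patent."""
    iso_depths = np.asarray(iso_depths, dtype=float)
    continuities = np.asarray(continuities, dtype=float)
    ok = continuities >= min_continuity
    if ok.any():
        return float(iso_depths[ok].max())  # farthest continuous line
    return float(iso_depths.min())          # nothing continuous: nearest line

# Lines at 10/50/200 m; the 200 m line is broken up by fog.
print(visible_distance([10, 50, 200], [0.95, 0.9, 0.3]))  # 50.0
```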
  • The display 150 or the display device 180 may display the visible distance calculated by the controller 120 as an image, and may also display the captured first images 201 and 201a or second images 202 and 202a.
  • the safety display device 160 may perform a preset operation or output preset information or a screen corresponding to the visible distance information calculated by the controller 120.
  • The safety display device 160 may include devices providing a safety display function installed on a road, such as a variable message sign (VMS), a fog display device, a variable speed limit sign, a directional speaker, and a fog lamp.
  • When the safety display device 160 is a fog lamp, it is not lit when the visible distance is greater than or equal to a predetermined safe distance, and it is operated to notify drivers on the road when the visible distance falls below that distance.
  • The visible distance may be divided into predetermined ranges so that the brightness and flashing speed can be adjusted in stages.
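A staged fog-lamp control of the kind just described might look like the following. The distance bands, brightness levels, and flash rates are illustrative assumptions; the patent states only that brightness and flashing speed are adjusted in stages.

```python
def fog_lamp_state(visible_m):
    """Staged lamp control sketch. The 250 m / 100 m bands, brightness
    percentages, and flash rates are assumptions for illustration."""
    if visible_m >= 250:
        return {"on": False, "brightness": 0, "flash_hz": 0.0}   # clear: lamp off
    if visible_m >= 100:
        return {"on": True, "brightness": 50, "flash_hz": 0.5}   # light fog
    return {"on": True, "brightness": 100, "flash_hz": 2.0}      # dense fog

print(fog_lamp_state(300))  # lamp stays off in clear conditions
print(fog_lamp_state(50))   # dense fog: full brightness, fast flashing
```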
  • the display device 180 may provide a visual distance as a numerical value, and may also provide an image of a road taken by the first camera or the second camera.
  • For example, when the visibility calculated by the visibility measurement module 122 is 50 m, the control server 170 may display the visible distance information through the display device 180, provide an image of the road at the same time, and activate one or more safety display devices 160.
  • The storage unit 130 may store the first image and the second image.
  • The control unit 120 may operate the display device 180 and the safety display device 160 through the communication unit 140 and control their behavior.
  • FIG. 8 is a flowchart illustrating a control method of a 3D visibility measurement system according to an embodiment of the present invention.
  • the three-dimensional visibility measurement method includes a series of steps performed by the above-described three-dimensional visibility measurement system.
  • A first image of the first camera is acquired, and a second image of the second camera, photographed at an angle different from that of the first camera, is acquired (S802).
  • the first camera and the second camera simultaneously provide a first image and a second image photographing a road and a street for monitoring traffic from different angles.
  • the first image and the second image may be images including an object such as a vehicle.
  • a first image photographed by the first camera and a second image photographed by the second camera are acquired as inputs, and a depth map is generated using three-dimensional coordinates in the first image and the second image. (S804).
  • the depth map may be generated using the 3D coordinates obtained from the corrected first image and the second image.
  • the step S804 may include generating a depth map using Epipolar Geometry for generating a stereoscopic image through the conjugate point determined from the first image and the second image. According to this method, a more accurate depth map of the data can be generated.
  • This step (S808) may include extracting edges from the original images of the first image captured by the first camera and the second image captured by the second camera, and analyzing the edges overlapped with the iso-depth lines.
  • If the iso-depth lines have a continuity of a predetermined level or more, the maximum value among the iso-depth lines having continuity, that is, the minimum disparity value, is calculated as the visible distance information (S810).
  • Next, it may be determined whether the photographing center point of the camera is the minimum distance measuring point (S812). If the photographing center point is the minimum distance measuring point, the minimum or lowest value among the iso-depth lines may be calculated or applied as the visible distance information (S814).
  • Otherwise, the control unit or the visibility measurement module of the controller may move the photographing center point step by step (S816).
  • The controller may move the photographing center point from far to near, toward the minimum distance measuring point or a position adjacent to it, by a preset step-by-step zoom-out method. Then, when the photographing center point is located at the predetermined minimum distance measuring point or in its area, the process may return to acquiring the images of the first and second cameras.
  • The safety display device may be operated in correspondence with the visible distance information selectively calculated in the above steps (S810 and S814) (S818).
  • In this step, it is determined whether the visible distance is greater than or equal to a certain distance, and when the visible distance information is less than or equal to a predetermined reference value, the safety display device or the like can be operated.
  • In addition, the controller may display the visible distance on the display device, or store the first image, the second image, and the visible-distance information in the storage unit (S820).
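The operating logic of steps S818 and S820 reduces to a threshold comparison on the calculated visible distance. A minimal sketch; the 250 m reference value is purely illustrative.

```python
def handle_visibility(visible_m, reference_m=250.0):
    """Decide the actions of S818/S820: trigger the safety display when
    visibility is at or below the reference distance, and always pass
    the distance on for display and storage."""
    return {
        "operate_safety_display": visible_m <= reference_m,
        "display_distance_m": visible_m,
    }

low = handle_visibility(120.0)    # foggy: safety display on
high = handle_visibility(400.0)   # clear: display/store only
```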
  • As described above, a depth map is generated from the first image and the second image, iso-depth lines are formed from the depth map, the connection continuity of each iso-depth line is examined, and, for lines whose continuity is at or above a predetermined level, the visible-distance information is extracted or the visible distance to an object is calculated.
  • The calculated information can be used to present visibility information to the driver through the display device, or to provide a safer driving environment, for example by operating a safety display device and providing an image through the display device when the visible distance falls below a certain threshold. With this configuration and operation, the system is more economical than conventional visibility systems and can provide the driver with more accurate visibility information.
  • The three-dimensional visibility measurement method described above may be implemented as submodules of a software module of the controller, in particular of the visibility measurement module.
  • The visibility measurement module may include a plurality of submodules, each performing one function corresponding to steps S802 to S820. For example, the plurality of submodules may include an acquisition module (corresponding to an acquisition submodule; the same applies below), a generation module, an extraction module, a first determination module, a second determination module, a viewing-distance calculation module, an optimum-value applying module, a moving module, an operation module, a display module, and a storage module.

Abstract

Disclosed are a method and system for measuring three-dimensional visibility using a traffic monitoring camera. In the method, a depth map is generated from first and second images acquired by first and second cameras; iso-depth lines are formed from the depth map; the continuity of the iso-depth lines is examined; and visibility distance information is extracted based on a certain level of continuity, or the visible distance to an object is calculated. By means of the method, more accurate visibility distance information can be provided to a driver through a display unit, or, when the visibility distance information falls within a certain distance, a safety display device or the like can be operated or alarm information can be provided to the driver.
PCT/KR2019/000927 2018-01-31 2019-01-22 Method and system for three-dimensional visibility measurement using a traffic monitoring camera Ceased WO2019151704A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020180012145A KR101893368B1 (ko) 2018-01-31 2018-01-31 Three-dimensional visibility measurement method and system using a traffic monitoring camera
KR10-2018-0012145 2018-01-31

Publications (1)

Publication Number Publication Date
WO2019151704A1 true WO2019151704A1 (fr) 2019-08-08

Family

ID=63598249

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2019/000927 2018-01-31 2019-01-22 Method and system for three-dimensional visibility measurement using a traffic monitoring camera Ceased WO2019151704A1 (fr)

Country Status (2)

Country Link
KR (1) KR101893368B1 (fr)
WO (1) WO2019151704A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111627056A (zh) * 2020-05-14 2020-09-04 Tsinghua University Driving visibility determination method and device based on depth estimation

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
US11301974B2 (en) * 2019-05-27 2022-04-12 Canon Kabushiki Kaisha Image processing apparatus, image processing method, image capturing apparatus, and storage medium

Citations (5)

Publication number Priority date Publication date Assignee Title
JP2012026927A (ja) * 2010-07-26 2012-02-09 Astron Inc Kk Weather measuring device
KR20170007098A (ko) * 2015-07-08 2017-01-18 Korea University Research and Business Foundation Method and apparatus for generating a projection image, and method for mapping between image pixels and depth values
KR101760323B1 (ko) * 2010-01-13 2017-07-21 Samsung Electronics Co., Ltd. System and method for rendering three-dimensional views of a scene
KR101783379B1 (ko) * 2010-02-02 2017-09-29 Microsoft Technology Licensing, LLC Depth camera compatibility
KR101797035B1 (ko) * 2010-02-09 2017-11-13 Samsung Electronics Co., Ltd. Method and apparatus for converting a 3D image of an overlay region

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
KR101357600B1 (ko) 2007-11-19 2014-02-11 Republic of Korea (Korea Meteorological Administration) Weather observation system and method
KR101032160B1 (ko) 2009-10-06 2011-05-02 Chungju National University Industry-Academic Cooperation Foundation Road visibility measurement system using a camera and method thereof

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN111627056A (zh) * 2020-05-14 2020-09-04 Tsinghua University Driving visibility determination method and device based on depth estimation
CN111627056B (zh) * 2020-05-14 2023-09-01 Tsinghua University Driving visibility determination method and device based on depth estimation

Also Published As

Publication number Publication date
KR101893368B1 (ko) 2018-09-04

Similar Documents

Publication Publication Date Title
KR101343975B1 (ko) Incident detection system
KR101666466B1 (ko) Maritime risk management system and maritime risk management method using a sea-object distance measurement system with a monocular camera
US20160379066A1 (en) Method and Camera System for Distance Determination of Objects from a Vehicle
WO2010134680A1 (fr) Lane departure detection method and apparatus using vehicle surrounding images
EP3470780B1 (fr) Object distance detection device
KR102484691B1 (ko) Vehicle detection system and vehicle detection method using a stereo camera and radar
CN106060452A (zh) Method and device for controlling a monitoring system
WO2017183915A2 (fr) Image acquisition apparatus and method therefor
WO2016072625A1 (fr) Vehicle location monitoring system for a parking lot using an imaging technique, and control method therefor
JP2005024463A (ja) Stereo wide-field image processing device
KR101381580B1 (ko) Method and system for determining the position of a vehicle in an image, robust to various illumination environments
JP2019146012A (ja) Imaging device
WO2020070650A1 (fr) Optical multidimensional target and method for detecting and tracking multiple objects
WO2013022153A1 (fr) Lane detection apparatus and method
WO2019172500A1 (fr) Visibility measurement device using video analysis with artificial intelligence
WO2019151704A1 (fr) Method and system for three-dimensional visibility measurement using a traffic monitoring camera
US10223592B2 (en) Method and associated apparatus for performing cooperative counting with aid of multiple cameras
RU164432U1 (ru) Device for automatic photo/video recording of failure to yield to a pedestrian at an uncontrolled pedestrian crossing
JP3491029B2 (ja) Automatic monitoring device
JPH1149100A (ja) Apron monitoring device
CN110070724A (зh) Video monitoring method, device, camera, and image information supervision system
KR102660024B1 (ко) Road event detection method and device
CN113902805A (zh) Method and system for calibrating roadside equipment, roadside equipment, and calibration vehicle
KR102612658B1 (ко) Radar and camera coordinate registration method
KR101446545B1 (ко) Location-based intersection vehicle information display system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19747672

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19747672

Country of ref document: EP

Kind code of ref document: A1