WO2005107261A1 - Vehicle surrounding display device - Google Patents

Vehicle surrounding display device

Info

Publication number
WO2005107261A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
viewpoint
obstacle
unit
display device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2005/007606
Other languages
English (en)
Japanese (ja)
Inventor
Tomoya Nakanishi
Akira Ishida
Yutaka Watanabe
Toru Ichikawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd filed Critical Matsushita Electric Industrial Co Ltd
Priority to JP2006512752A (published as JPWO2005107261A1)
Priority to EP05734727A (published as EP1748654A4)
Priority to US10/573,685 (published as US7369041B2)
Publication of WO2005107261A1
Anticipated expiration
Legal status: Ceased

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/301Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/302Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/60Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R2300/602Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint
    • B60R2300/605Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective with an adjustable viewpoint the adjustment being automatic
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/70Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by an event-triggered choice to display a specific image among a selection of captured images
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/8093Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for obstacle warning
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9314Parking operations
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/932Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles using own vehicle data, e.g. ground speed, steering wheel direction
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93271Sensor installation details in the front of the vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93272Sensor installation details in the back of the vehicles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2013/9327Sensor installation details
    • G01S2013/93274Sensor installation details on the side of the vehicles

Definitions

  • The vehicle periphery display device described above (hereinafter referred to as the conventional display device) includes a plurality of imaging devices, a plurality of laser range finders, a three-dimensional virtual unit, an image conversion unit, and a display unit.
  • The plurality of imaging devices are installed on the vehicle, and the surroundings of the vehicle are imaged by these imaging devices.
  • a plurality of laser range finders measure the distance to an object in the field of view (the subject of the imaging device).
  • One imaging device and one laser range finder are arranged close to each other.
  • The three-dimensional virtual unit obtains a distance image of the field of view (see the upper left part of FIG. 18) based on distance information from the laser range finders, and performs object recognition in the field of view from the original image (see the upper right part of FIG. 18). Based on these two pieces of image information, the original image and the distance image, together with the knowledge obtained from object recognition, the field of view is reproduced three-dimensionally, with the hidden parts of the subject that cannot be seen by the imaging devices being inferred.
  • In the conventional display device, a bird's-eye view viewed from a virtual camera set above the vehicle is displayed. Therefore, when the vehicle approaches an obstacle, the obstacle enters the blind spot formed by the vehicle in the bird's-eye view, making it difficult for the driver to visually recognize the obstacle.
  • a first aspect of the present invention is directed to a vehicle periphery display device that selectively displays at least two types of images around a vehicle.
  • The vehicle surrounding display device includes: a measuring unit for measuring the distance and azimuth, with respect to the vehicle, to an obstacle around the vehicle; a comparing unit for comparing the distance measured by the measuring unit with a predetermined threshold value; a viewpoint determining unit that selects the predetermined first viewpoint while the comparison result of the comparing unit indicates that the measured distance is large, and that determines a second viewpoint based on the azimuth measured by the measuring unit while the comparison result does not indicate that the measured distance is large; an image generating unit; and a display unit.
  • a second aspect is directed to a vehicle periphery display method for selectively displaying at least two types of images on a display device around a vehicle.
  • The vehicle surrounding display method includes: a measuring step of measuring the distance and azimuth to an obstacle around the vehicle with reference to the vehicle; a comparing step of comparing the distance measured in the measuring step with a predetermined threshold value; a first viewpoint determining step of selecting the predetermined first viewpoint while the comparison result indicates that the measured distance is large; and a first image generating step of generating first image data representing the surroundings of the vehicle as viewed from the first viewpoint determined in the first viewpoint determining step.
  • The method further includes: a second viewpoint determining step of determining the second viewpoint based on the azimuth measured in the measuring step while the comparison result does not indicate that the measured distance is large; and a second image generating step of generating second image data representing the vehicle and the obstacle as viewed from the second viewpoint determined in the second viewpoint determining step.
  • both the first viewpoint and the second viewpoint are represented by three-dimensional coordinate values, and the magnitude of the horizontal component of the second viewpoint is larger than that of the horizontal component of the first viewpoint.
  • the magnitude of the vertical component of the second viewpoint is smaller than that of the vertical component of the first viewpoint.
  • the three-dimensional coordinates of the first viewpoint correspond to the position directly above the vehicle
  • The three-dimensional coordinates of the second viewpoint are set such that the line of sight starting from the second viewpoint points in the direction of the vehicle and the obstacle and makes a predetermined depression angle with respect to the horizontal plane.
  • the second viewpoint is set to a point included in a vertical plane perpendicular to a straight line connecting the vehicle and the obstacle.
  • More preferably, the vertical plane is the perpendicular bisector plane of the straight line connecting the vehicle and the obstacle.
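To make this geometry concrete, the following is a minimal sketch, assuming the viewpoint is placed at a fixed horizontal standoff from the midpoint of the vehicle-obstacle segment; the standoff value and all names are illustrative, not from the patent.

```python
import math

def second_viewpoint(vehicle_xy, obstacle_xy, depression_deg=45.0, standoff=3.0):
    """Sketch: place the viewpoint in the perpendicular bisector plane of the
    vehicle-obstacle segment, looking down at the given depression angle.
    `standoff` (horizontal distance from the midpoint) is an assumed parameter."""
    vx, vy = vehicle_xy
    ox, oy = obstacle_xy
    mx, my = (vx + ox) / 2.0, (vy + oy) / 2.0   # midpoint of the segment L
    dx, dy = ox - vx, oy - vy                    # direction of the segment L
    norm = math.hypot(dx, dy)
    px, py = -dy / norm, dx / norm               # unit vector perpendicular to L
    # Any point shifted from the midpoint along (px, py) lies in the
    # perpendicular bisector plane; the height is chosen so the line of
    # sight to the midpoint makes the requested depression angle.
    z = standoff * math.tan(math.radians(depression_deg))
    return (mx + px * standoff, my + py * standoff, z)

print(second_viewpoint((0.0, 0.0), (2.0, 2.0)))
```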
  • the height of the obstacle is measured, and whether or not the vehicle can travel without contacting the obstacle is determined based on the measured height of the obstacle.
  • the second viewpoint is determined in consideration of the detected steering angle.
  • the second viewpoint is preferably set to a three-dimensional coordinate value that allows the driver to visually recognize both the obstacle and the part of the vehicle that contacts the obstacle.
  • When a plurality of obstacles are detected, the distance and the azimuth of the obstacle having the highest possibility of contacting the vehicle are selected.
  • the selected distance is compared with a predetermined threshold, and the second viewpoint is determined based on the selected azimuth while the comparison result indicates that the measured distance is not large.
  • an obstacle existing around the vehicle is detected by a plurality of active sensors, each of which is installed on the front, rear, or side of the vehicle.
  • According to the present invention, the distance and the azimuth to a surrounding obstacle are measured with reference to the vehicle. If the measured distance is not larger than the predetermined threshold, that is, if the vehicle is close to the obstacle, second image data representing the vehicle and the obstacle as viewed from the second viewpoint, which is determined based on the measured azimuth, is generated and displayed. Because the second viewpoint is set in this way, the obstacle is less likely to enter a blind spot formed by the vehicle in the second image data. Accordingly, it is possible to provide a vehicle surrounding display device that can display an image in which a driver can easily recognize an obstacle.
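The selection logic just described can be sketched in a few lines of code. This is an illustration only; the threshold, heights, and standoff are example values, and the function names are not the patent's API.

```python
import math

THRESHOLD_E = 1.0  # metres; an example value (the description allows changing it)

def choose_viewpoint(distance_c, azimuth_d_deg, depression_deg=45.0):
    """Sketch: far obstacle -> predetermined first viewpoint directly above
    the vehicle; near obstacle -> second viewpoint placed toward the measured
    azimuth at the given depression angle."""
    if distance_c > THRESHOLD_E:
        return ("bird's-eye", (0.0, 0.0, 10.0))       # height is illustrative
    r = 3.0                                            # assumed horizontal standoff
    x = r * math.cos(math.radians(azimuth_d_deg))
    y = r * math.sin(math.radians(azimuth_d_deg))
    z = r * math.tan(math.radians(depression_deg))
    return ("specific-location", (x, y, z))

print(choose_viewpoint(2.5, 30.0))   # far: first viewpoint, bird's-eye view
print(choose_viewpoint(0.6, 30.0))   # near: azimuth-based second viewpoint
```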
  • FIG. 1 is a schematic diagram showing a block configuration of a vehicle periphery display device 1 according to one embodiment of the present invention.
  • FIG. 2 is a schematic diagram showing an example of an arrangement of active sensors 111 included in the measurement unit 11 shown in FIG.
  • FIG. 4 is a flowchart showing an operation of the vehicle periphery display device 1 shown in FIG. 1.
  • FIG. 5 is a schematic diagram showing an obstacle B existing in an area in front of, behind, or beside the vehicle A shown in FIG. 1.
  • FIG. 6A and FIG. 6B are schematic diagrams showing the positional relationship between the active sensor 111 and the obstacle B shown in FIG. 5.
  • FIG. 7 is a schematic diagram for explaining a method of deriving a distance C when an obstacle B is present in an area oblique to the vehicle A shown in FIG. 1.
  • FIG. 8A is a schematic diagram showing the second viewpoint P2 determined by the viewpoint determining unit 13 shown in FIG. 1, as viewed from above.
  • FIG. 8B is a schematic diagram showing the second viewpoint P2 determined by the viewpoint determining unit 13 shown in FIG. 1, as viewed from behind.
  • FIG. 9 is a schematic diagram showing the second viewpoint P2 determined when an obstacle B is present in an area oblique to the vehicle A shown in FIG. 1.
  • FIG. 10 is a schematic diagram showing a specific location image displayed on display unit 15 shown in FIG. 1.
  • FIG. 11 is a schematic diagram showing a bird's-eye view image displayed on the display unit 15 shown in FIG. 1.
  • FIG. 12 is a block diagram of a vehicle periphery display device 1a according to a first modification of the present invention.
  • FIG. 13 is a flowchart showing an operation of the vehicle surrounding display device 1a shown in FIG. 12.
  • FIG. 14 is a schematic diagram showing a height H derived by the contact determination unit 21 shown in FIG. 12.
  • FIG. 15 is a schematic diagram showing a block configuration of a vehicle periphery display device 1b according to a second modification of the present invention.
  • FIG. 16 is a flowchart showing an operation of the vehicle periphery display device 1b shown in FIG. 15.
  • FIG. 17A is a first schematic diagram showing a preferable second viewpoint P2 determined by the viewpoint determining unit 13 shown in FIG. 15.
  • FIG. 17B is a second schematic diagram showing a preferable second viewpoint P2 determined by the viewpoint determining unit 13 shown in FIG. 15.
  • FIG. 18 is a schematic diagram showing the schematic process of a conventional vehicle periphery display device.

Explanation of symbols
  • FIG. 1 is a schematic diagram illustrating an example of a block configuration of a vehicle periphery display device 1 according to an embodiment of the present invention.
  • A vehicle periphery display device 1 is mounted on a vehicle A and, as a typical configuration, includes a measurement unit 11, a comparison unit 12, a viewpoint determination unit 13, an image generation unit 14, and a display unit 15. Further, as a preferred configuration, the vehicle periphery display device 1 includes a data storage unit 16.
  • the measuring unit 11 measures at least the distance C and the direction D to the obstacle B around the vehicle A.
  • the measuring unit 11 includes a plurality of active sensors 111 (16 shown).
  • As the active sensor 111, a laser radar, a millimeter-wave radar, or a quasi-millimeter-wave radar is typical.
  • Each active sensor 111 alone can detect an obstacle B existing in its own detection range by scanning a range of about ±45 degrees in the horizontal direction and about ±20 degrees in the vertical direction.
  • Four such active sensors 111 are attached to each of the front part, the rear part, and both side parts of the vehicle A so that an obstacle B existing around the vehicle A can be detected over 360°.
  • the comparison unit 12 compares the distance C measured by the measurement unit 11 with a threshold value E held in advance, and generates a comparison result F.
  • the threshold value E is a reference value used for determining whether or not the vehicle A is approaching the obstacle B.
  • If the distance C is larger than the threshold value E, the comparison result F is "F1"; otherwise, the comparison result F is "F2".
  • While the comparison result F of the comparison unit 12 indicates "F1", the viewpoint determination unit 13 selects the predetermined first viewpoint P1, which looks down on the vehicle A from directly above.
  • While the comparison result F does not indicate "F1", the viewpoint determination unit 13 determines the second viewpoint P2 based on the azimuth D obtained from the measurement unit 11.
  • When obtaining the first viewpoint P1 from the viewpoint determining unit 13, the image generating unit 14 generates first image data Ia representing the situation around the vehicle A as viewed from the obtained first viewpoint P1.
  • Similarly, when obtaining the second viewpoint P2 from the viewpoint determination unit 13, the image generation unit 14 generates second image data Ib representing the vehicle A and the obstacle B as viewed from the obtained second viewpoint P2.
  • The comparison unit 12, the viewpoint determination unit 13, and the image generation unit 14 are typically composed of a combination of a CPU, a ROM, and a RAM; they are realized by the CPU executing a computer program while using the RAM.
  • When the display unit 15 obtains the first image data Ia from the image generation unit 14, it displays a bird's-eye view image of the vehicle A. Likewise, when it obtains the second image data Ib, it displays the image of the specific location of the vehicle A according to the obtained second image data Ib.
  • In this way, the driver of the vehicle A is presented with the bird's-eye view image of the vehicle A based on the first image data Ia and with the specific location image of the vehicle A based on the second image data Ib.
  • As the display unit 15, a monitor provided in an in-vehicle navigation system, a television receiver, a head-up display, or a head-mounted display can be used.
  • The data storage unit 16 is typically composed of an HDD (Hard Disk Drive), a DVD (Digital Versatile Disc), or a semiconductor memory. Various data are stored in the data storage unit 16, as shown in FIG. 3. First, the height H of the vehicle A (hereinafter referred to as vehicle height information) and shape data M representing the outer shape of the vehicle A are stored.
  • In addition, the data storage unit 16 stores the width W of the vehicle A (hereinafter referred to as vehicle width information) and the total length L of the vehicle A (hereinafter referred to as total length information).
  • the data storage unit 16 stores the mounting position P of each active sensor 111.
  • Shape data M representing the outline of each assumed obstacle B, such as a person, a wall, or a tree, is also stored in the data storage unit 16.
  • The various data stored in the data storage unit 16 as described above are mainly used for creating the first image data Ia and the second image data Ib.
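Purely as an illustration of how these stored items might be grouped, here is a sketch; every field name is invented for the example and none comes from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class StoredVehicleData:
    """Sketch of the contents of the data storage unit 16 (names invented)."""
    vehicle_height_m: float                                     # vehicle height information H
    vehicle_width_m: float                                      # vehicle width information W
    vehicle_length_m: float                                     # total length information L
    vehicle_shape: object = None                                # shape data M of the vehicle A
    sensor_mount_positions: list = field(default_factory=list)  # one per active sensor 111
    obstacle_shapes: dict = field(default_factory=dict)         # e.g. {"person": ..., "wall": ...}
```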
  • the measuring unit 11 detects the obstacle B (Step S11). Specifically, the measuring unit 11 derives a distance C to the obstacle B and a direction D of the obstacle B based on the vehicle A. Further, the measuring unit 11 estimates what the obstacle B detected this time is, and thereby obtains the estimation result G.
  • FIG. 5 is a schematic diagram showing an obstacle B existing in an area in front of, behind, or on the side of vehicle A (see the hatched portion).
  • FIG. 5 schematically shows the vehicle A as viewed from above, and also shows an example in which one obstacle B is present to the left of the vehicle A and another obstacle B is behind the vehicle A.
  • In this case, the active sensor 111 closest to the obstacle B (hereinafter referred to as the proximity active sensor 111) has the positional relationship with the obstacle B shown in FIGS. 6A and 6B.
  • FIG. 6A is an enlarged view of the proximity active sensor 111 and the obstacle B as viewed from directly above them.
  • The proximity active sensor 111 detects the distance d to the obstacle B with respect to itself, the horizontal azimuth θ, and the vertical azimuth φ. From these values, the measuring unit 11 derives the shortest distance C between the proximity active sensor 111 and the obstacle B, as well as the azimuth D of the obstacle B with respect to the vehicle A.
  • FIG. 7 is a schematic diagram for explaining the method of deriving the distance C when the obstacle B exists not in front of, behind, or beside the vehicle A but in an area oblique to the vehicle A (see the hatched portion).
  • FIG. 7 schematically shows a vehicle A and an obstacle B when viewed from above, and further illustrates an example in which an obstacle B is present on the right rear side of the vehicle A.
  • the active sensor 111 on the rear left side of the vehicle A is the proximity active sensor 111.
  • The proximity active sensor 111 detects the distance d to the obstacle B with respect to itself, the horizontal azimuth θ, and the vertical azimuth φ in the same manner as described above. The shortest distance C and the vertical azimuth φ can therefore be obtained by the measuring unit 11 in the same manner as described above.
  • However, since the proximity active sensor 111 is installed facing obliquely rearward with respect to the vehicle A, the horizontal azimuth θ is converted into the azimuth D with respect to the vehicle A using the attachment position of the proximity active sensor 111 stored in the data storage unit 16.
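A minimal sketch of this conversion, under two assumptions the text does not spell out: the stored attachment data reduces to a mounting heading for the sensor, and the shortest distance C is the horizontal projection of the slant range d.

```python
import math

def vehicle_frame_measurement(d, theta_deg, phi_deg, mount_heading_deg):
    """Sketch: convert a sensor-relative reading into vehicle-referenced values.
    d: slant range to the obstacle; theta: horizontal azimuth in the sensor
    frame; phi: vertical azimuth; mount_heading_deg: assumed stored heading."""
    distance_c = d * math.cos(math.radians(phi_deg))   # horizontal projection of d
    azimuth_d = mount_heading_deg + theta_deg          # rotate into the vehicle frame
    return distance_c, azimuth_d

# Example: a sensor mounted facing obliquely rearward (135 degrees from the front)
print(vehicle_frame_measurement(1.5, 10.0, 20.0, 135.0))
```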
  • The estimation of what the obstacle B is can be performed, for example, by using the detection results of the active sensors 111, regardless of the area in which the obstacle B is located.
  • Next, the comparison unit 12 compares the distance C stored in the RAM with the threshold value E held by itself, and stores the comparison result F in the RAM (step S12). Specifically, if the distance C is larger than the threshold value E, "F1" is stored in the RAM as the comparison result F; otherwise, "F2" is stored.
  • The threshold value E is selected to be, for example, 1 meter. This value may be changeable according to the driver's preference or according to the design specifications of the vehicle surrounding display device 1. With such a threshold value E, the comparison result F indicates whether or not the obstacle B exists near the vehicle A.
  • Next, the viewpoint determining unit 13 determines whether or not the comparison result F in the RAM is "F1" (step S13).
  • If the comparison result F is not "F1", the viewpoint determination unit 13 determines the second viewpoint P2 for the specific location image (step S14).
  • The viewpoint determining unit 13 uses the azimuth D currently stored in the RAM for determining the second viewpoint P2. From the azimuth D, the viewpoint determining unit 13 can recognize in which direction, as seen from the vehicle A, the obstacle B exists.
  • The second viewpoint P2 is set to a point at which the depression angle R with respect to the horizontal plane becomes a predetermined value (for example, 45°) and from which the vicinity of the proximity active sensor 111 (that is, the part of the vehicle A that is most likely to come into contact with the obstacle B) can be seen.
  • Here, the depression angle R is the angle between the horizontal plane and the line of sight that extends from the second viewpoint P2 toward the horizontal plane. This line of sight is required to extend in the direction of the proximity active sensor 111 or its sensing range. The value of the depression angle R may be changeable according to the driver's preference or according to the design specifications of the vehicle surrounding display device 1.
  • More preferably, the viewpoint determining unit 13 uses the distance C stored in the RAM, in addition to the azimuth D, for determining the second viewpoint P2. In this case, as shown in FIGS. 8A and 8B, a three-dimensional coordinate value is selected as the second viewpoint P2 that is included in the vertical plane P orthogonal to the line L representing the shortest distance between the vehicle A and the obstacle B, and that makes the predetermined depression angle R with respect to the horizontal plane. Here, the vertical plane P is the plane that perpendicularly bisects the line L.
  • The viewpoint determination unit 13 passes the second viewpoint P2 set as described above to the image generation unit 14.
  • Upon receiving the second viewpoint P2, the image generation unit 14 creates the second image data Ib (step S15).
  • Specifically, the shape data M representing the detected obstacle B, corresponding to the estimation result G, and the shape data M representing the vehicle A are extracted from the data storage unit 16. The image generation unit 14 arranges the object of the obstacle B and the object of the vehicle A according to the positional relationship given by the distance C and the azimuth D stored in the RAM, and creates the second image data Ib representing the appearance of both objects as viewed from the received second viewpoint P2.
  • Preferably, a value representing the distance C and a value representing the height H of the obstacle B are combined with the second image data Ib.
  • The image generation unit 14 transfers the second image data Ib created as described above to the display unit 15, and the display unit 15 displays a specific location image based on the second image data Ib, as shown in FIG. 10 (step S16). After step S16, the process returns to step S11.
  • If the comparison result F in step S13 is "F1", the viewpoint determining unit 13 determines that the vehicle A and the obstacle B are not sufficiently close to each other, and determines the first viewpoint P1 (step S17).
  • In the present embodiment, the first viewpoint P1 is set directly above the vehicle A.
  • For example, the first viewpoint P1 is set to the three-dimensional coordinate values (0, 0, z1), and the second viewpoint P2 is set to the three-dimensional coordinate values (x2, y2, z2). Since the second viewpoint P2 needs to be shifted more horizontally than the first viewpoint P1, the magnitude of the horizontal component of the second viewpoint P2 (that is, √(x2² + y2²)) is larger than that of the first viewpoint P1, which is zero, and the vertical component z2 is smaller than z1.
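In symbols, these conditions can be restated as:

```latex
P_1 = (0,\, 0,\, z_1), \qquad P_2 = (x_2,\, y_2,\, z_2),
\qquad \sqrt{x_2^2 + y_2^2} \;>\; \sqrt{0^2 + 0^2} = 0, \qquad 0 < z_2 < z_1 .
```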
  • The first viewpoint P1 set as described above is passed from the viewpoint determination unit 13 to the image generation unit 14.
  • Upon receiving the first viewpoint P1, the image generation unit 14 creates the first image data Ia (step S18).
  • The first image data Ia is created in the same manner as the second image data Ib; a numerical value representing the distance C to the obstacle B and a numerical value representing the height H of the obstacle B may likewise be combined with it.
  • The image generation unit 14 transfers the first image data Ia created in this way to the display unit 15, and the display unit 15 displays a bird's-eye view image based on the first image data Ia, as shown in FIG. 11 (step S19). After step S19, the process returns to step S11.
  • As described above, while the distance C from the vehicle A to the obstacle B is larger than the threshold value E, the bird's-eye view image based on the first image data Ia is displayed on the display unit 15, giving the driver an overall view of the surroundings of the vehicle A. Conversely, while the distance C is not larger than the threshold value E, the specific location image based on the second image data Ib is displayed on the display unit 15.
  • The specific location image is generated using the second viewpoint P2, which is set based on the azimuth D detected for the obstacle B. Since the area of the vehicle A around the proximity active sensor 111 that detected the obstacle B is shown enlarged, the obstacle B is less likely to enter the blind spot created by the vehicle A, and the driver can easily visually recognize the part of the vehicle A that is likely to contact the obstacle B.
  • As the preferred second viewpoint P2, a three-dimensional coordinate value is selected that is included in the vertical plane P perpendicular to the line L connecting the vehicle A and the obstacle B, and at which the depression angle R with respect to the horizontal plane becomes the predetermined value. Here, the vertical plane P is the plane that perpendicularly bisects the line L. Setting such a second viewpoint P2 makes it easier for both the proximity active sensor 111 and the obstacle B to appear in the specific location image. It is therefore possible to provide a vehicle surrounding display device that can display a specific location image from which the driver can easily grasp the positional relationship between the vehicle A and the obstacle B. Displaying the distance C on the display unit 15 also makes it easier for the driver to grasp this positional relationship.
  • Both the first image data Ia and the second image data Ib are generated by using the shape data M stored in the data storage unit 16.
  • The measuring unit 11 may detect a plurality of obstacles B. In such a case, it is preferable to apply the processing described above to the obstacle B closest to the vehicle A in its traveling direction.
  • FIG. 12 is a schematic diagram showing an example of the block configuration of a first modified example of the vehicle surrounding display device 1 (hereinafter referred to as the vehicle surrounding display device 1a).
  • In FIG. 12, the vehicle surrounding display device 1a differs from the vehicle surrounding display device 1 shown in FIG. 1 in that it further includes a contact determination unit 21. Other than that, there is no difference between the two vehicle surrounding display devices 1 and 1a. Therefore, in FIG. 12, the components corresponding to the configuration shown in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
  • The contact determination unit 21 derives the height H of the bottom surface of the obstacle B and compares it with the height of the vehicle A (hereinafter referred to as vehicle height information) stored in the data storage unit 16. From this comparison, the contact determination unit 21 determines whether or not the vehicle A can pass under the obstacle B, and generates a determination result J. In the present embodiment, when the height H of the bottom surface of the obstacle B is larger than the vehicle height information, meaning the vehicle A may pass under, the contact determination unit 21 stores "J2" in the RAM as the determination result J; otherwise, it stores "J1".
  • The contact determination unit 21 is also typically composed of a combination of the above-described CPU, ROM, and RAM.
  • The flowchart shown in FIG. 13 differs from that of FIG. 4 in that steps S21 and S22 are further provided. Other than that, there is no difference between the two flowcharts. Therefore, in FIG. 13, steps corresponding to those in FIG. 4 are denoted by the same step numbers, and their description is omitted.
  • In step S21, the contact determination unit 21 derives the height H of the bottom surface of the obstacle B, as shown in FIG. 14. The height H is derived from the distance C and the vertical azimuth φ currently stored in the RAM. The contact determination unit 21 then compares the height H derived this time with the vehicle height information of the vehicle A stored in the data storage unit 16, and stores the determination result J ("J1" or "J2") in the RAM.
  • Next, the viewpoint determining unit 13 determines whether or not the determination result J in the RAM is "J1" (step S22).
  • If the determination in step S22 is Yes, meaning that the vehicle A cannot pass under the obstacle B, the viewpoint determining unit 13 performs step S12 and the subsequent steps in order to decide whether to create the specific location image. On the other hand, if the determination in step S22 is No, indicating that the vehicle A may be able to pass under the obstacle B, the viewpoint determining unit 13 performs step S17 and the subsequent steps.
  • With the above processing, the driver can use the vehicle surrounding display device 1a in situations such as entering a garage, for example. It is thus possible to provide a vehicle surrounding display device 1a that is easier for the driver to use.
  • Preferably, the height H of the bottom surface of the obstacle B is combined with the specific location image.
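As a sketch of the pass-under check, assuming the bottom height follows from the horizontal distance C and the vertical azimuth φ as H = C·tan(φ); the exact relation is not stated in the text, and the safety margin below is invented.

```python
import math

def can_pass_under(distance_c, phi_deg, vehicle_height_m, margin_m=0.1):
    """Sketch: derive the height H of the obstacle's bottom surface from the
    horizontal distance C and the vertical azimuth phi (assumed relation
    H = C * tan(phi)), then compare it with the stored vehicle height."""
    h_bottom = distance_c * math.tan(math.radians(phi_deg))
    return h_bottom > vehicle_height_m + margin_m

print(can_pass_under(3.0, 35.0, 1.8))   # True: bottom edge ~2.1 m vs 1.8 m vehicle
```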
  • FIG. 15 is a schematic diagram showing an example of the block configuration of a second modified example of the above-described vehicle surrounding display device 1 (hereinafter referred to as the vehicle surrounding display device 1b).
  • In FIG. 15, the vehicle surrounding display device 1b differs from the vehicle surrounding display device 1 shown in FIG. 1 in that it further includes a steering angle sensor 31 and a contact determination unit 32. Since there is no other difference between the two vehicle surrounding display devices 1 and 1b, the components in FIG. 15 corresponding to those in FIG. 1 are denoted by the same reference numerals, and their description is omitted.
  • the steering angle sensor 31 detects the steering angle of the vehicle A, and passes the detection result to the contact determination unit 32.
  • The contact determination unit 32 derives a predicted trajectory along which the vehicle A will travel, based on the detection result from the steering angle sensor 31. Further, the contact determination unit 32 determines, from the distance C to the obstacle B and the azimuth D stored in the RAM, whether or not the obstacle B exists on the derived predicted trajectory, and generates a determination result K. In the present embodiment, when the obstacle B exists on the predicted trajectory, the contact determination unit 32 stores "K2" in the RAM as the determination result K; otherwise, it stores "K1".
  • The contact determination unit 32 is also typically composed of the combination of the CPU, ROM, and RAM described above.
  • The flowchart shown in FIG. 16 differs from that of FIG. 4 in that steps S31 and S32 are further provided. Apart from that, there is no difference between the two flowcharts. Therefore, in FIG. 16, steps corresponding to those in FIG. 4 are denoted by the same step numbers, and their description is omitted.
  • In step S31, the contact determination unit 32 derives the predicted trajectory of the vehicle A using the detection result from the steering angle sensor 31, determines whether the obstacle B exists on the derived predicted trajectory, and stores the determination result K ("K1" or "K2") in the RAM.
  • Next, in step S32, the viewpoint determining unit 13 determines whether or not the determination result K in the RAM is "K1".
  • If the determination in step S32 is Yes, it means that the obstacle B and the vehicle A are unlikely to come into contact with each other, so the viewpoint determining unit 13 performs the above-described step S17 and subsequent steps. On the other hand, a determination of No in step S32 means that the obstacle B and the vehicle A may come into contact; therefore, the viewpoint determining unit 13 performs the above-described step S12 and subsequent steps in order to decide whether to create the specific location image.
  • With the above processing, the driver can use the vehicle surrounding display device 1b, for example, in a situation where the vehicle A enters a parking space. It is thus possible to provide a vehicle surrounding display device 1b with improved usability.
  • Preferably, the derived predicted trajectory may be combined with the specific location image or the bird's-eye view image.
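One common way to realize such a trajectory check is a kinematic (bicycle-model) turning-circle approximation. The sketch below is an illustration under assumed parameters; the wheelbase, half-width, and the annulus test are not taken from the patent.

```python
import math

def obstacle_on_predicted_path(steer_deg, dist_c, azim_d_deg,
                               wheelbase=2.7, half_width=0.9):
    """Sketch: approximate the predicted trajectory as the arc of the turning
    circle given by the steering angle (bicycle model), then test whether the
    obstacle at (distance C, azimuth D) falls inside the swept band."""
    ox = dist_c * math.cos(math.radians(azim_d_deg))   # obstacle in vehicle frame
    oy = dist_c * math.sin(math.radians(azim_d_deg))
    if abs(steer_deg) < 1e-3:                          # straight-line case
        return ox > 0 and abs(oy) <= half_width
    radius = wheelbase / math.tan(math.radians(abs(steer_deg)))
    cy = math.copysign(radius, steer_deg)              # turning-circle centre at (0, +/-R)
    r_obstacle = math.hypot(ox, oy - cy)               # obstacle distance from the centre
    return abs(r_obstacle - radius) <= half_width      # inside the swept annulus

print(obstacle_on_predicted_path(15.0, 4.0, 10.0))     # True in this configuration
```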
  • In the above description, the second viewpoint P2 is determined as shown in FIGS. 8A, 8B, and 9. However, the second viewpoint P2 is more preferably set as follows. For example, when there is an obstacle B behind the reversing vehicle A and the steering of the vehicle A is turned to the left (counterclockwise), the second viewpoint P2 is selected according to the direction of the obstacle B, the traveling direction of the vehicle A, and the steering operation direction, as shown in FIGS. 17A and 17B.
  • In this way, the part of the vehicle A that is likely to contact the obstacle B and the appearance of the obstacle B are displayed on the display unit 15, so that the positional relationship between the vehicle A and the obstacle B can be easily understood by the driver.
  • The vehicle surrounding display device according to the present invention is useful for a navigation device, a parking assist device, and the like, which need to display an image from which a driver can easily recognize an obstacle.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Mechanical Engineering (AREA)
  • Closed-Circuit Television Systems (AREA)
  • Traffic Control Systems (AREA)

Abstract

A vehicle surrounding display device capable of displaying an image from which the driver can easily confirm an obstacle. The vehicle surrounding display device (1) comprises a unit (11) for measuring the distance and azimuth to a surrounding obstacle B with reference to a vehicle A, a unit (12) for comparing the measured distance with a predetermined threshold value, a unit (13) for determining a predetermined first viewpoint if the comparison result indicates that the measured distance is larger, and otherwise determining a second viewpoint based on the measured azimuth, a unit (14) for creating a bird's-eye view of the vehicle surroundings as seen from the first viewpoint when the first viewpoint is set, and for creating a specific location image showing the vehicle A and the obstacle B as seen from the second viewpoint when the second viewpoint is set, and a unit (15) for displaying the bird's-eye view or the specific location image.
PCT/JP2005/007606 2004-04-27 2005-04-21 Visualisation de circonférence d’un véhicule Ceased WO2005107261A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2006512752A JPWO2005107261A1 (ja) 2004-04-27 2005-04-21 車両周囲表示装置
EP05734727A EP1748654A4 (fr) 2004-04-27 2005-04-21 Visualisation de circonférence d"un véhicule
US10/573,685 US7369041B2 (en) 2004-04-27 2005-04-21 Vehicle surrounding display device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-130989 2004-04-27
JP2004130989 2004-04-27

Publications (1)

Publication Number Publication Date
WO2005107261A1 true WO2005107261A1 (fr) 2005-11-10

Family

ID=35242048

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/007606 Ceased WO2005107261A1 (fr) 2004-04-27 2005-04-21 Visualisation de circonférence d’un véhicule

Country Status (5)

Country Link
US (1) US7369041B2 (fr)
EP (1) EP1748654A4 (fr)
JP (1) JPWO2005107261A1 (fr)
CN (1) CN100456828C (fr)
WO (1) WO2005107261A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013145878A1 (fr) * 2012-03-29 2013-10-03 住友建機株式会社 Dispositif de surveillance de périmètre pour engin de travaux publics
WO2017065352A1 (fr) * 2015-10-13 2017-04-20 엘지전자 주식회사 Appareil de fourniture de vision pour véhicule et véhicule

Families Citing this family (39)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7797881B2 (en) * 2005-06-22 2010-09-21 Loitherstein Joel S Garage door control system
WO2007015446A1 (fr) * 2005-08-02 2007-02-08 Nissan Motor Co., Ltd. Dispositif de surveillance de l’environnement d’un véhicule et procédé de surveillance de l’environnement d’un véhicule
US20090128630A1 (en) * 2006-07-06 2009-05-21 Nissan Motor Co., Ltd. Vehicle image display system and image display method
DE102006052779A1 (de) * 2006-11-09 2008-05-15 Bayerische Motoren Werke Ag Verfahren zur Erzeugung eines Gesamtbilds der Umgebung eines Kraftfahrzeugs
JP5003946B2 (ja) * 2007-05-30 2012-08-22 アイシン精機株式会社 駐車支援装置
JP4325705B2 (ja) * 2007-06-15 2009-09-02 株式会社デンソー 表示システム及びプログラム
JP4462333B2 (ja) * 2007-11-13 2010-05-12 株式会社デンソー 走行支援装置
TW200925023A (en) * 2007-12-07 2009-06-16 Altek Corp Method of displaying shot image on car reverse video system
JP2009166691A (ja) * 2008-01-16 2009-07-30 Mazda Motor Corp 車両の走行制御装置
JP5094658B2 (ja) * 2008-09-19 2012-12-12 日立オートモティブシステムズ株式会社 走行環境認識装置
US8305444B2 (en) * 2008-11-14 2012-11-06 Toyota Motor Engineering & Manufacturing North America, Inc. Integrated visual display system
JP5068779B2 (ja) * 2009-02-27 2012-11-07 現代自動車株式会社 車両周囲俯瞰画像表示装置及び方法
US20110169957A1 (en) * 2010-01-14 2011-07-14 Ford Global Technologies, Llc Vehicle Image Processing Method
JP5505702B2 (ja) * 2010-02-24 2014-05-28 アイシン精機株式会社 車両周辺監視装置
DE102010010912A1 (de) * 2010-03-10 2010-12-02 Daimler Ag Fahrerassistenzvorrichtung mit optischer Darstellung erfasster Objekte
CN102906593B (zh) * 2010-05-19 2015-06-17 三菱电机株式会社 车辆后方监视装置
DE102010062254B4 (de) * 2010-12-01 2024-05-02 Robert Bosch Gmbh Verfahren zur Darstellung einer mittels Sensoren erfassten Umgebung und Vorrichtung zur Darstellung einer von fahrzeuggestützten Sensoren erfassten Umgebung
JP5124672B2 (ja) * 2011-06-07 2013-01-23 株式会社小松製作所 作業車両の周辺監視装置
JP5124671B2 (ja) * 2011-06-07 2013-01-23 株式会社小松製作所 作業車両の周辺監視装置
DE102011084554A1 (de) * 2011-10-14 2013-04-18 Robert Bosch Gmbh Verfahren zur Darstellung eines Fahrzeugumfeldes
KR102003562B1 (ko) * 2012-12-24 2019-07-24 두산인프라코어 주식회사 건설기계의 감지 장치 및 방법
JP5997640B2 (ja) * 2013-03-25 2016-09-28 株式会社ジオ技術研究所 3次元画像出力装置および背景画像生成装置
CN103600695B (zh) * 2013-11-22 2016-01-27 奇瑞汽车股份有限公司 一种检测后视盲区内车辆的方法及设备
DE102014205511A1 (de) * 2014-03-25 2015-10-01 Conti Temic Microelectronic Gmbh Verfahren und vorrichtung zur anzeige von objekten auf einer fahrzeuganzeige
DE102014107235A1 (de) * 2014-05-22 2015-11-26 Dr. Ing. H.C. F. Porsche Aktiengesellschaft Verfahren zur Darstellung einer Fahrzeugumgebung auf einer Anzeigevorrichtung; eine Anzeigevorrichtung; ein System aus einer Mehrzahl an Bilderfassungseinheiten und einer Anzeigevorrichtung; ein Computerprogramm
US10168425B2 (en) * 2014-07-03 2019-01-01 GM Global Technology Operations LLC Centralized vehicle radar methods and systems
KR20170055091A (ko) 2015-11-10 2017-05-19 현대오트론 주식회사 헤드업 디스플레이 제어 장치 및 방법
JP6669569B2 (ja) * 2016-04-04 2020-03-18 アルパイン株式会社 車両用周辺監視装置
US9747804B1 (en) * 2016-06-23 2017-08-29 GM Global Technology Operations LLC Object detection-based directional control of light and sound
CA3041177C (fr) * 2016-10-13 2021-08-31 Nissan Motor Co., Ltd. Procede et dispositif d'estimation de position d'hote
EP3537714B1 (fr) * 2017-02-28 2021-08-25 JVC Kenwood Corporation Dispositif, système et procédé de génération d'image vidéo de vue à vol d'oiseau et programme associé
US10579067B2 (en) * 2017-07-20 2020-03-03 Huawei Technologies Co., Ltd. Method and system for vehicle localization
US10497264B2 (en) 2017-09-26 2019-12-03 Toyota Research Institute, Inc. Methods and systems for providing warnings of obstacle objects
CN108725319B (zh) * 2017-10-31 2021-05-04 无锡职业技术学院 一种影像式倒车指导方法
KR102559686B1 (ko) * 2018-12-17 2023-07-27 현대자동차주식회사 차량 및 차량 영상 제어방법
CN110949286B (zh) * 2019-12-16 2021-06-04 广州小鹏汽车科技有限公司 车辆及其控制方法与装置
CN112158197B (zh) * 2020-08-21 2021-08-27 恒大新能源汽车投资控股集团有限公司 一种车辆盲区障碍物规避方法、装置及系统
US11987961B2 (en) * 2021-03-29 2024-05-21 Joy Global Surface Mining Inc Virtual field-based track protection for a mining machine
US11939748B2 (en) 2021-03-29 2024-03-26 Joy Global Surface Mining Inc Virtual track model for a mining machine

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002083284A (ja) * 2000-06-30 2002-03-22 Matsushita Electric Ind Co Ltd 描画装置
JP2003267171A (ja) * 2002-03-13 2003-09-25 Nissan Motor Co Ltd 車両後方監視装置
JP2003348574A (ja) * 2002-05-24 2003-12-05 Nissan Motor Co Ltd 車両用映像表示装置

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH06333200A (ja) * 1993-05-21 1994-12-02 Toshiba Corp 車載用監視システム
JPH0717328A (ja) 1993-06-30 1995-01-20 Mitsubishi Motors Corp 車両用周辺認識補助装置
JP3371605B2 (ja) * 1995-04-19 2003-01-27 日産自動車株式会社 大気効果表示機能付き鳥瞰図表示ナビゲーションシステム
CN2238765Y (zh) * 1995-08-18 1996-10-30 兰天 汽车倒车防撞监视器
EP1115250B1 (fr) * 1998-07-31 2012-06-06 Panasonic Corporation Procede et appareil d'affichage d'images
JP3596314B2 (ja) * 1998-11-02 2004-12-02 日産自動車株式会社 物体端の位置計測装置および移動体の通行判断装置
JP2000161915A (ja) * 1998-11-26 2000-06-16 Matsushita Electric Ind Co Ltd 車両用単カメラ立体視システム
CN2417651Y (zh) * 1998-12-21 2001-01-31 吴晧华 汽车安全防撞监视装置
EP2410741A1 (fr) * 1999-04-16 2012-01-25 Panasonic Corporation Appareil de traitement d'images et système de surveillance
JP3966673B2 (ja) * 1999-10-26 2007-08-29 本田技研工業株式会社 物体検知装置および車両の走行安全装置
JP4660872B2 (ja) * 2000-02-09 2011-03-30 ソニー株式会社 運転支援装置及び運転支援方法
US6369701B1 (en) * 2000-06-30 2002-04-09 Matsushita Electric Industrial Co., Ltd. Rendering device for generating a drive assistant image for drive assistance
JP3750512B2 (ja) * 2000-10-12 2006-03-01 日産自動車株式会社 車両用周辺障害物検出装置
JP4861574B2 (ja) * 2001-03-28 2012-01-25 パナソニック株式会社 運転支援装置
JP2002359839A (ja) * 2001-03-29 2002-12-13 Matsushita Electric Ind Co Ltd リアビューカメラの画像表示方法及びその装置
JP2002316602A (ja) * 2001-04-24 2002-10-29 Matsushita Electric Ind Co Ltd 車載カメラの撮像画像表示方法及びその装置
JP3608527B2 (ja) * 2001-05-15 2005-01-12 株式会社豊田中央研究所 周辺状況表示装置
US6636258B2 (en) * 2001-10-19 2003-10-21 Ford Global Technologies, Llc 360° vision system for a vehicle
US7145519B2 (en) 2002-04-18 2006-12-05 Nissan Motor Co., Ltd. Image display apparatus, method, and program for automotive vehicle
JP3683258B2 (ja) * 2002-05-31 2005-08-17 松下電器産業株式会社 車両周辺監視装置、画像生成方法および画像生成プログラム
US7230524B2 (en) * 2003-03-20 2007-06-12 Matsushita Electric Industrial Co., Ltd. Obstacle detection device
JP2007099261A (ja) * 2005-09-12 2007-04-19 Aisin Aw Co Ltd 駐車支援方法及び駐車支援装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002083284A (ja) * 2000-06-30 2002-03-22 Matsushita Electric Ind Co Ltd 描画装置
JP2003267171A (ja) * 2002-03-13 2003-09-25 Nissan Motor Co Ltd 車両後方監視装置
JP2003348574A (ja) * 2002-05-24 2003-12-05 Nissan Motor Co Ltd 車両用映像表示装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013145878A1 (fr) * 2012-03-29 2013-10-03 住友建機株式会社 Dispositif de surveillance de périmètre pour engin de travaux publics
JP2013205402A (ja) * 2012-03-29 2013-10-07 Sumitomo (Shi) Construction Machinery Co Ltd 作業機械用周辺監視装置
US9715015B2 (en) 2012-03-29 2017-07-25 Sumitomo(S.H.I.) Construction Machinery Co., Ltd. Periphery-monitoring device for working machines
WO2017065352A1 (fr) * 2015-10-13 2017-04-20 엘지전자 주식회사 Appareil de fourniture de vision pour véhicule et véhicule

Also Published As

Publication number Publication date
US7369041B2 (en) 2008-05-06
EP1748654A4 (fr) 2013-01-02
US20070120656A1 (en) 2007-05-31
CN1898961A (zh) 2007-01-17
EP1748654A1 (fr) 2007-01-31
JPWO2005107261A1 (ja) 2008-03-21
CN100456828C (zh) 2009-01-28

Similar Documents

Publication Publication Date Title
WO2005107261A1 (fr) Visualisation de circonférence d’un véhicule
JP5729158B2 (ja) 駐車支援装置および駐車支援方法
JP3776094B2 (ja) 監視装置、監視方法および監視用プログラム
JP6562709B2 (ja) 駐車支援装置および駐車支援方法
US8446268B2 (en) System for displaying views of vehicle and its surroundings
JP4883977B2 (ja) 車両用画像表示装置
JP6790998B2 (ja) 障害物検知装置および制御装置
US20090015675A1 (en) Driving Support System And Vehicle
US11257369B2 (en) Off road route selection and presentation in a drive assistance system equipped vehicle
JP6392693B2 (ja) 車両周辺監視装置、車両周辺監視方法、及びプログラム
CN108259879B (zh) 图像生成装置及图像生成方法
CN108269235A (zh) 一种基于opengl的车载环视多视角全景生成方法
JP6565188B2 (ja) 視差値導出装置、機器制御システム、移動体、ロボット、視差値導出方法、およびプログラム
WO2002080557A1 (fr) Dispositif d'aide a la conduite
JP2016120892A (ja) 立体物検出装置、立体物検出方法および立体物検出プログラム
JP2004114977A (ja) 移動体周辺監視装置
JP6313992B2 (ja) 牽引車用周囲監視装置
JP2012066616A (ja) 運転支援装置
JPWO2019202628A1 (ja) 路面検出装置、路面検出装置を利用した画像表示装置、路面検出装置を利用した障害物検知装置、路面検出方法、路面検出方法を利用した画像表示方法、および路面検出方法を利用した障害物検知方法
JP2004240480A (ja) 運転支援装置
JP4154980B2 (ja) 移動体周辺監視装置
JP2013239015A (ja) 駐車支援装置、駐車支援方法およびプログラム
CN105313775A (zh) 全景式监控影像装置及其方法
JP2020068515A (ja) 画像処理装置
JP6656359B2 (ja) 駐車支援用表示制御装置および駐車支援用表示制御方法

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200580001419.8

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2007120656

Country of ref document: US

Ref document number: 10573685

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 2005734727

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2006512752

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

WWP Wipo information: published in national office

Ref document number: 2005734727

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 10573685

Country of ref document: US