
WO2017158829A1 - Display control device and display control method - Google Patents

Display control device and display control method

Info

Publication number
WO2017158829A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
distance
display
vehicle
generation unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/058749
Other languages
English (en)
Japanese (ja)
Inventor
聖崇 加藤
篤史 前田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Priority to PCT/JP2016/058749
Publication of WO2017158829A1
Legal status: Ceased

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 - Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/25 - Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view to the sides of the vehicle
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 - Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • G - PHYSICS
    • G06 - COMPUTING OR CALCULATING; COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00 - General purpose image data processing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • The present invention relates to a technique for displaying an image of an area in which the driver's view is blocked.
  • The vehicle blind-spot image display device described in Patent Document 1 defines a virtual projection sphere whose radius is the distance from the driver's viewpoint position to the subject. For each pixel of the display, it finds the intersection of the virtual projection sphere with the straight line that passes through that pixel from the driver's viewpoint, identifies the pixel of the camera image corresponding to the intersection, and thereby generates the image to be shown on the display.
  • The present invention was made to solve the problems described above. Its purpose is to display an image of the area in which the driver's view is blocked continuously with the scenery outside the vehicle as seen from the driver's viewpoint position, even when a plurality of subjects are present in that area.
  • A display control apparatus according to the invention uses a captured image of a blind spot area, within the area around the vehicle, in which the driver's view is blocked by a structure of the vehicle. It includes: a distance image generation unit that generates a distance image whose pixel values are the distances from the driver's viewpoint position to objects located in the blind spot area; a coordinate conversion unit that, using the distance image generated by the distance image generation unit, sets a plurality of virtual projection spheres, each defined as the outer surface of a sphere centered on the driver's viewpoint position in the area around the vehicle, and converts coordinates on the display screen on which the captured image is displayed into coordinates on the imaging surface of the captured image using the plurality of virtual projection spheres; and a display image generation unit that generates the image to be displayed on the display screen using the imaging-surface coordinates converted by the coordinate conversion unit.
  • According to the invention, an image of the area in which the driver's view is blocked can be displayed continuously with the scenery outside the vehicle as seen from the driver's viewpoint position.
  • FIG. 1 is a block diagram illustrating the configuration of a display control device according to Embodiment 1.
  • FIG. 2 is a diagram illustrating a hardware configuration of the display control device according to Embodiment 1.
  • FIG. 3 is a view of a vehicle provided with the display control device according to Embodiment 1, seen from above.
  • FIG. 4 is a view of part of the interior of the vehicle provided with the display control device according to Embodiment 1, seen from the rear seat.
  • FIG. 5 is a diagram showing the scenery the driver sees from the viewpoint position.
  • FIG. 6 is a flowchart showing the operation of the display control device according to Embodiment 1.
  • FIG. 7 is a diagram illustrating an example of a captured image received by the image input unit of the display control device according to Embodiment 1.
  • FIG. 8 is a diagram illustrating an example of a distance image generated by the distance image generation unit of the display control device according to Embodiment 1.
  • FIGS. 9 to 11 are diagrams illustrating display images generated by the display image generation unit of the display control device according to Embodiment 1.
  • FIG. 12 is a flowchart illustrating the operation of the coordinate conversion unit of the display control device according to Embodiment 1.
  • FIG. 13 is a diagram schematically illustrating the coordinate conversion processing of the coordinate conversion unit of the display control device according to Embodiment 1.
  • FIG. 14 is a flowchart illustrating the operation of the display image generation unit of the display control device according to Embodiment 1.
  • FIG. 15 is a diagram showing a display image generated by the display image generation unit of the display control device according to Embodiment 1.
  • FIG. 16 is a diagram showing a display result of the display control device according to Embodiment 1.
  • FIG. 17 is a block diagram illustrating the configuration of a display control device according to Embodiment 2.
  • FIG. 18 is a flowchart illustrating the operation of the coordinate conversion unit of the display control device according to Embodiment 2.
  • FIG. 19 is a diagram illustrating an example of setting conditions referred to by the coordinate conversion unit of the display control device according to Embodiment 2.
  • FIGS. 20A and 20B are diagrams illustrating examples of setting conditions of the display control device according to Embodiment 2.
  • FIG. 1 is a block diagram illustrating a configuration of a display control apparatus 10 according to the first embodiment.
  • The display control device 10 includes an image input unit 1, a distance information acquisition unit 2, a viewpoint information acquisition unit 3, an image processing unit 4, and a display control unit 8. The image processing unit 4 includes a distance image generation unit 5, a coordinate conversion unit 6, and a display image generation unit 7.
  • The display control device 10 is mounted on, for example, a vehicle 20 described later.
  • The image input unit 1 receives a captured image of the area around the vehicle 20 taken by imaging means such as a camera.
  • The captured image is an image of at least the area, among the areas around the vehicle 20, in which the driver's view is blocked by a structure of the vehicle 20 such as a pillar (hereinafter, the blind spot area).
  • The distance information acquisition unit 2 acquires distance information obtained by scanning at least the blind spot area with a sensor or the like mounted on the vehicle 20 and measuring the distance from the sensor to each object in the area.
  • The viewpoint information acquisition unit 3 acquires viewpoint information indicating the driver's viewpoint position.
  • The driver's viewpoint position is, for example, the position of the driver's eyes or of the driver's head.
  • The captured image received by the image input unit 1 and the information acquired by the distance information acquisition unit 2 and the viewpoint information acquisition unit 3 are output to the image processing unit 4.
  • In the image processing unit 4, position information indicating the arrangement position of the imaging means, position information indicating the arrangement position of the distance measuring means, and position information indicating the arrangement position of the display described later are set in advance.
  • Using the distance information acquired by the distance information acquisition unit 2 and the driver's viewpoint information acquired by the viewpoint information acquisition unit 3, the distance image generation unit 5 of the image processing unit 4 calculates the distance from the driver's viewpoint position to each object located at least in the blind spot area.
  • Referring further to the position information of the imaging means and of the distance measuring means, the distance image generation unit 5 generates a distance image in which, for each pixel of the captured image received by the image input unit 1, the distance from the driver's viewpoint position (hereinafter, the distance value) is associated as the pixel value.
  • The distance image generation unit 5 combines measured areas that are at the same distance from the driver's viewpoint position into a single same-target area, and sets subject areas and a background area according to the distance of each same-target area. For example, when the distance value of a same-target area is less than a threshold, the distance image generation unit 5 sets that area as a subject area; when the distance value of a same-target area is equal to or greater than the threshold, it sets the area as the background area.
  • Referring to the same-target areas generated by the distance image generation unit 5 and to the distance information of the distance image, the coordinate conversion unit 6 sets, for each subject area, a virtual projection sphere defined as the outer surface of a sphere centered on the driver's viewpoint position whose radius is the distance from the viewpoint position to that subject area.
  • The coordinate conversion unit 6 likewise sets a virtual projection sphere defined as the outer surface of a sphere centered on the driver's viewpoint position whose radius is the distance from the viewpoint position to the set background area.
  • Using the set virtual projection spheres, the coordinate conversion unit 6 converts coordinates on the in-vehicle display (display screen) as seen from the driver's viewpoint position into coordinates on the imaging surface of the captured image. The image data to be displayed on the in-vehicle display is thereby specified by coordinates, that is, pixels, of the imaging surface.
  • The in-vehicle display is a display arranged on a structure of the vehicle 20 that creates a blind spot area, such as a pillar, and displays an image of the blind spot area. Details of the processing of the coordinate conversion unit 6 will be described later.
  • The display image generation unit 7 generates image data for each virtual projection sphere set by the coordinate conversion unit 6, using the coordinates converted by the coordinate conversion unit 6, that is, the coordinates on the imaging surface of the captured image.
  • The display image generation unit 7 refers to the distance values of the distance image generated by the distance image generation unit 5 and selects, for each subject area and for the background area, the image data of the virtual projection sphere to use.
  • The display image generation unit 7 generates display image data from the selected image data based on a preset display position and display size. The display image generation unit 7 may also have functions such as processing, compositing, and superimposing graphics on the generated display image data; for example, it may convert the display image data so that a menu screen or an alert screen is superimposed on the display image.
  • The display control unit 8 generates display control information for displaying an image based on the display image data generated by the display image generation unit 7 on a display or the like, and outputs the display control information to the display or the like.
  • FIG. 2 is a diagram illustrating a hardware configuration example of the display control apparatus 10.
  • The image input unit 1, distance information acquisition unit 2, viewpoint information acquisition unit 3, distance image generation unit 5, coordinate conversion unit 6, display image generation unit 7, and display control unit 8 in the display control device 10 are realized by a processing circuit.
  • When the processing circuit is dedicated hardware, it is, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
  • The functions of the image input unit 1, distance information acquisition unit 2, viewpoint information acquisition unit 3, distance image generation unit 5, coordinate conversion unit 6, display image generation unit 7, and display control unit 8 may each be realized by a separate processing circuit, or the functions of the units may be combined and realized by a single processing circuit.
  • When the processing circuit is a CPU (Central Processing Unit), the processing circuit is the CPU 12 that executes a program stored in the memory 13 shown in FIG. 2.
  • In this case, the functions of the image input unit 1, distance information acquisition unit 2, viewpoint information acquisition unit 3, distance image generation unit 5, coordinate conversion unit 6, display image generation unit 7, and display control unit 8 are realized by software, firmware, or a combination of software and firmware.
  • Software or firmware is described as a program and stored in the memory 13.
  • The CPU 12 reads out and executes the programs stored in the memory 13, thereby realizing the functions of the image input unit 1, distance information acquisition unit 2, viewpoint information acquisition unit 3, distance image generation unit 5, coordinate conversion unit 6, display image generation unit 7, and display control unit 8.
  • These programs can also be said to cause a computer to execute the procedures or methods of the image input unit 1, the distance information acquisition unit 2, the viewpoint information acquisition unit 3, the distance image generation unit 5, the coordinate conversion unit 6, the display image generation unit 7, and the display control unit 8.
  • the CPU 12 is, for example, a central processing unit, a processing unit, an arithmetic unit, a processor, a microprocessor, a microcomputer, or a DSP (Digital Signal Processor).
  • the memory 13 may be, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable ROM), or an EEPROM (Electrically EPROM). Further, it may be a magnetic disk such as a hard disk or a flexible disk, or an optical disk such as a mini disk, CD (Compact Disc), or DVD (Digital Versatile Disc).
  • FIG. 3 is a view of the vehicle 20 including the display control device 10 according to the first embodiment as viewed from above.
  • FIG. 4 is a view of a part of the vehicle 20 provided with the display control device 10 according to the first embodiment as seen from the rear seat.
  • The view of the driver A of the vehicle 20 is blocked by, for example, the front pillar (so-called A pillar) 21 on the left side of the vehicle 20.
  • The viewpoint information acquired by the viewpoint information acquisition unit 3 is calculated using a captured image of the vehicle interior taken by an in-vehicle camera 33 or the like mounted on the vehicle 20.
  • For example, the in-vehicle camera 33 arranged at the front of the vehicle 20 captures an image of the interior, and a calculation unit (not illustrated) analyzes the face image of the driver A included in the captured image to calculate the viewpoint position 105.
  • The viewpoint information can be acquired using a known detection technique, such as triangulation using images captured by a plurality of cameras or TOF (Time of Flight) using a monocular camera.
  • Alternatively, the viewpoint information may be calculated from pre-registered physical information of the driver A, or from the seat position of the driver's seat, the angle of the rear-view mirror, and the angles of the side mirrors.
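  • For a rectified camera pair, the stereo triangulation mentioned above reduces to the standard disparity relation; the following is a minimal sketch under a pinhole-stereo assumption with a horizontal baseline, none of which the patent itself specifies:

```python
def stereo_depth(focal_px, baseline_m, disparity_px):
    """Depth from a rectified stereo pair: z = f * B / d.

    focal_px:     focal length in pixels.
    baseline_m:   distance between the two cameras in metres.
    disparity_px: horizontal pixel shift of the same feature.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```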
  • The viewpoint position 105 indicated by the viewpoint information serves as the reference position for calculating the distance from the driver A to each object.
  • The captured image input to the image input unit 1 is captured by the vehicle exterior camera 31a mounted on the vehicle 20.
  • The vehicle exterior camera 31a is imaging means that images at least the blind spot area, within the area around the vehicle 20, in which the driver's view is blocked by a structure of the vehicle 20 such as a pillar.
  • The vehicle exterior camera 31a is installed facing the left front of the vehicle 20, for example near the side mirror 22.
  • The vehicle exterior camera 31a images objects and the like present in the imaging range 102, an area around the vehicle 20 that includes at least the blind spot area 101.
  • Besides the vicinity of the side mirror 22, the vehicle exterior camera 31a may be installed at the base of the front pillar 21, on the roof, or at a vehicle interior window (none of which are shown); the installation position is not limited.
  • A plurality of vehicle exterior cameras 31a may also be arranged.
  • Alternatively, the vehicle exterior camera 31a may be configured as a single camera with a wide-angle lens and used to image both the left and right blind spot areas of the vehicle 20.
  • The vehicle exterior camera 31a is a camera that captures color images and includes an optical mechanism of lenses and mirrors and an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) image sensor.
  • The vehicle exterior camera 31a may also include an infrared sensor or a light-emitting element to enable imaging at night.
  • The raw image data generated by the image sensor of the vehicle exterior camera 31a undergoes preprocessing such as color conversion, format conversion, and filtering as necessary, and the resulting captured image is then output to the image input unit 1.
  • The distance information acquired by the distance information acquisition unit 2 is generated by the distance measuring sensor 32a mounted on the vehicle 20.
  • The distance measuring sensor 32a is arranged facing the left front of the vehicle 20, for example near the side mirror 22.
  • The distance measuring sensor 32a measures the distance 103 to the object B and the distance 104 to the object C.
  • The distance measuring sensor 32a is not limited to measuring the distances to the objects B and C; it generates distance information with a resolution capable of obtaining the distance to every object present in the blind spot area 101 and to each part constituting such an object.
  • As the distance measuring sensor 32a, a well-known vehicle ranging technique such as millimeter-wave radar, laser radar, or an ultrasonic sensor can be applied. The vehicle exterior camera 31a may also serve as the distance measuring means; in that case, the distance information is generated from distances measured using a technique such as triangulation with a plurality of cameras or TOF with a monocular camera. The distance information acquired from the distance measuring sensor 32a is updated at a constant cycle, for example 10 to 240 Hz.
  • The display control information generated by the display control unit 8 is output to the display 34a.
  • The display 34a is arranged on the surface of the front pillar 21 of the vehicle 20 visible to the driver A, so as to overlap the blind spot area 101 that is hidden from the viewpoint position 105 of the driver A.
  • The display 34a is configured using a display device such as a liquid crystal display (LCD), an organic EL (electroluminescence) display, or a projector, and may also be composed of a plurality of small displays arranged side by side.
  • The display 34a may include an illuminance sensor or the like for brightness adjustment and display an image adjusted according to the amount of sunlight entering the vehicle interior.
  • The display image generation unit 7 holds information on the arrangement position and the display size of the display 34a.
  • Similarly, a vehicle exterior camera 31b and a distance measuring sensor 32b can be arranged toward the right front of the vehicle 20 to acquire a captured image and distance information of the blind spot area on that side.
  • A display image showing the blind spot area at the right front of the vehicle 20 is then generated by the display control device 10 and displayed on the display 34b arranged on the front pillar 23.
  • For pillars other than the front pillars 21 and 23 (the so-called B pillars, C pillars, and so on), the display control device 10 can likewise generate a display image from an image of the corresponding blind spot area and display it on a display provided on the pillar.
  • FIG. 5 is a diagram showing what the driver A of the vehicle 20, in the state shown in FIGS. 3 and 4, sees from the viewpoint position 105 when looking toward the area where the two objects B and C, which are persons, are present.
  • As shown in FIG. 5, the front pillar 21 lies between the front window 24 and the left side window 25 of the vehicle 20.
  • The view of the driver A is blocked by the front pillar 21, so part of the object B and part of the object C cannot be seen.
  • The area in which the view is blocked by the front pillar 21 is the blind spot area 101.
  • the display control device 10 generates a display image of the blind spot area 101 that cannot be visually recognized by the driver A, and performs display control for displaying the display image on the display 34a.
  • FIG. 6 is a flowchart showing the operation of the display control apparatus 10 according to the first embodiment.
  • First, the image input unit 1, the distance information acquisition unit 2, and the viewpoint information acquisition unit 3 acquire various types of information (step ST1). Specifically, the image input unit 1 receives a captured image captured by the vehicle exterior camera 31a, the distance information acquisition unit 2 acquires distance information giving the distances from the distance measuring sensor 32a to the objects present in the blind spot area 101, and the viewpoint information acquisition unit 3 acquires viewpoint information indicating the viewpoint position 105 of the driver A.
  • The distance image generation unit 5 calculates the distance from the driver's viewpoint position to each object from the distance information acquired by the distance information acquisition unit 2 in step ST1, the position information of the distance measuring means preset in the image processing unit 4, and the viewpoint information acquired by the viewpoint information acquisition unit 3 in step ST1 (step ST2).
  • For example, the distance image generation unit 5 calculates the distance value 201d from the viewpoint position 105 of the driver A to the object B from the distance 103 from the distance measuring sensor 32a to the object B, the preset position information of the distance measuring sensor 32a, and the viewpoint position 105.
  • Likewise, the distance image generation unit 5 calculates the distance value 202d from the viewpoint position 105 to the object C from the distance 104 from the distance measuring sensor 32a to the object C, the preset position information of the distance measuring sensor 32a, and the viewpoint position 105.
  • Although a single distance is shown here for each object as a representative, distance values are also calculated at least for the parts constituting each object present in the blind spot area.
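  • As an illustration of this step, the following is a minimal sketch of re-referencing a range measurement from the sensor position to the driver's viewpoint position; the function name, the coordinate frame, and the availability of the measurement direction are assumptions made for illustration, not details taken from the patent:

```python
import numpy as np

def viewpoint_distance(sensor_pos, ray_dir, measured_dist, viewpoint_pos):
    """Re-reference a range measurement to the driver's viewpoint.

    sensor_pos, viewpoint_pos: 3-D positions in a common vehicle frame
                               (the preset position information).
    ray_dir:       direction of the sensor's measurement beam.
    measured_dist: distance reported by the sensor (e.g. 103 or 104).
    Returns the distance value from the viewpoint (e.g. 201d or 202d).
    """
    sensor_pos = np.asarray(sensor_pos, dtype=float)
    direction = np.asarray(ray_dir, dtype=float)
    direction = direction / np.linalg.norm(direction)  # unit direction
    obj_pos = sensor_pos + measured_dist * direction   # object in vehicle frame
    return float(np.linalg.norm(obj_pos - np.asarray(viewpoint_pos, dtype=float)))
```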
  • Referring to the distances from the driver's viewpoint position to the objects calculated in step ST2, the distance image generation unit 5 divides the area measured by the distance measuring sensor 32a and sets subject areas and a background area (step ST3).
  • The distance image generation unit 5 groups adjacent areas at the same distance from the viewpoint position 105 of the driver A into one same-target area.
  • Here, "the same distance" does not require exactly equal distance values: two distance values are regarded as the same distance if they are approximately equal (for example, within 5 m ± 30 cm) and can be judged to have been measured on the same object.
  • When the distance value of a same-target area is less than a threshold, the distance image generation unit 5 judges that the area is an object located near the vehicle 20 and sets it as a subject area.
  • The threshold is set to, for example, 30 m.
  • When the distance value of a same-target area is equal to or greater than the threshold, the distance image generation unit 5 judges that the area is an object located far from the vehicle 20 and sets it as the background area.
  • The number of subject areas is not limited to one; every area satisfying the condition is set as a subject area.
  • Even when the distance value of a same-target area is less than the threshold, the distance image generation unit 5 may additionally judge the area to be background if, for example, its size is below a certain value and the area is small enough to be ignored.
  • The distance image generation unit 5 then associates a distance value with each set subject area and with the background area.
  • As the distance value of a subject area, the distance image generation unit 5 associates, for example, the average or the median of the distance values within that area.
  • As the distance value of the background area, the distance image generation unit 5 associates a distance value farther than the distance values associated with the subject areas, for example the median of the distance values within the background area.
  • Based on the position information of the vehicle exterior camera 31a and of the distance measuring sensor 32a preset in the image processing unit 4, the distance image generation unit 5 accounts for the difference between the arrangement positions of the camera and the sensor, and generates a distance image by associating, with each pixel of the captured image received by the image input unit 1 in step ST1, the distance value of the subject area or background area set in step ST3 as the pixel value (step ST4).
  • FIG. 7 is a diagram illustrating a captured image received by the image input unit 1; the objects B and C, which are persons, appear in the captured image.
  • FIG. 8 is a diagram illustrating a distance image generated by the distance image generation unit 5; it consists of a subject area Ba, a subject area Ca, and a background area D.
  • The subject area Ba is associated with the distance value 201d, the subject area Ca with the distance value 202d, and the background area D with the distance value 203d.
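  • The grouping and classification of steps ST3 and ST4 could be sketched as follows. This simplified illustration assumes the viewpoint-referenced distances have already been resampled onto the pixel grid of the captured image, groups pixels by distance alone (omitting the adjacency condition described above), and uses the 30 m threshold from the text; all names and the bin width are illustrative:

```python
import numpy as np

def build_distance_image(depth, threshold_m=30.0, tolerance_m=0.6):
    """Classify a per-pixel map of viewpoint-referenced distances into
    subject and background areas, and return a distance image whose
    pixel values are representative distance values (cf. FIG. 8)."""
    subject_mask = depth < threshold_m            # near objects: subjects
    distance_image = np.empty_like(depth)

    # Background: one representative value farther than any subject.
    bg = depth[~subject_mask]
    distance_image[~subject_mask] = np.median(bg) if bg.size else threshold_m

    if subject_mask.any():
        # Quantize so approximately equal distances (within the stated
        # tolerance) fall into the same same-target group.
        bins = np.round(depth[subject_mask] / tolerance_m)
        for b in np.unique(bins):
            sel = subject_mask.copy()
            sel[subject_mask] = (bins == b)
            # The median of the group becomes its distance value
            # (the text allows the average or the median).
            distance_image[sel] = np.median(depth[sel])
    return distance_image, subject_mask
```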
  • The coordinate conversion unit 6 refers to the distance image generated in step ST4 and, in the area around the vehicle 20, sets a virtual projection sphere centered on the driver's viewpoint position whose radius is the distance value set for each subject area (step ST5).
  • Likewise, the coordinate conversion unit 6 sets, in the area around the vehicle 20, a virtual projection sphere centered on the driver's viewpoint position whose radius is the distance value set for the background area (step ST6).
  • Through steps ST5 and ST6, the coordinate conversion unit 6 sets the virtual projection sphere 201 whose radius is the distance value 201d set for the subject area Ba, the virtual projection sphere 202 whose radius is the distance value 202d set for the subject area Ca, and the virtual projection sphere 203 whose radius is the distance value 203d set for the background area D.
  • The virtual projection spheres are set only virtually, within the computation performed inside the display control device 10; the virtual projection spheres 201, 202, and 203 drawn in FIG. 3 are merely illustrations of these virtual spheres.
  • The coordinate conversion unit 6 converts coordinate values on the display as seen from the viewpoint position into coordinate values on the imaging surface of the imaging means, using the virtual projection spheres set in steps ST5 and ST6 (step ST7).
  • In step ST7, the coordinate conversion unit 6 uses the virtual projection spheres 201, 202, and 203 to convert coordinate values on the surface of the display 34a as seen from the viewpoint position 105 into coordinate values on the imaging surface (not shown) of the vehicle exterior camera 31a. Details of the coordinate conversion processing of the coordinate conversion unit 6 will be described later.
  • The display image generation unit 7 generates a display image for each virtual projection sphere, using the imaging-surface coordinate values obtained by converting the coordinate values on the display surface in step ST7 (step ST8).
  • That is, the display image generation unit 7 generates display images from the coordinate values on the imaging surface to which the coordinates on the surface of the display 34a were converted using the virtual projection spheres 201, 202, and 203.
  • the display image generation unit 7 generates a display image to be displayed on the display using each display image generated in step ST8 (step ST9).
  • Specifically, the display image generation unit 7 generates the display image of the subject area Ba from the display image generated by coordinate conversion using the virtual projection sphere 201 and the subject-area information indicating the object B, generates the display image of the subject area Ca from the display image generated using the virtual projection sphere 202 and the subject-area information indicating the object C, and generates the display image of the background area D from the display image generated using the virtual projection sphere 203 and the area information indicating the background.
  • The display image generation unit 7 then integrates these display images to generate the overall display image to be displayed on the display.
  • the display control unit 8 performs display control for displaying the display image generated in step ST9 on the display (step ST10), and returns to the process of step ST1.
  • FIGS. 9 to 11 are diagrams showing the respective display images generated by the display image generation unit 7.
  • FIG. 9 shows a display image constructed using the imaging-surface coordinate values converted using the virtual projection sphere 201; the size of the subject region Bb matches the size of the object B as seen from the viewpoint position 105.
  • FIG. 10 shows a display image constructed using the imaging-surface coordinate values converted using the virtual projection sphere 202; the size of the subject region Cb matches the size of the object C as seen from the viewpoint position 105.
  • FIG. 11 shows a display image constructed using the imaging-surface coordinate values converted using the virtual projection sphere 203.
  • The display image generated in step ST9 is therefore an image in which the object B, the object C, and the background all match their sizes as seen from the viewpoint position 105, as shown in FIG. 15.
  • FIG. 12 is a flowchart showing the operation of the coordinate conversion unit 6 of the display control apparatus 10 according to the first embodiment.
  • FIG. 13 is a diagram schematically illustrating a coordinate conversion process of the coordinate conversion unit 6 of the display control apparatus 10 according to the first embodiment.
  • The viewpoint position 105, the virtual projection spheres 201, 202, and 203, and the vehicle exterior camera 31a shown in FIG. 13 correspond to those shown in FIG. 3. The subject areas Ba and Ca and the background area D shown in FIG. 13 correspond to the respective areas of the distance image shown in FIG. 8. FIG. 13 also shows the display surface 301 of the display 34a and the imaging surface 302 of the vehicle exterior camera 31a.
  • The coordinate conversion unit 6 calculates the coordinate in of the intersection of the virtual projection sphere N with the half line from the viewpoint position 105 through the coordinate pn on the display surface 301 of the display 34a (step ST21).
  • Here, the coordinate pn is a coordinate in a coordinate system defined so that the position of each point on the display surface 301 can be specified.
  • The coordinates on the virtual projection sphere N may be any coordinates that can specify positions on the virtual projection sphere N.
  • The coordinate conversion unit 6 then calculates, as the converted coordinate, the coordinate cn of the intersection of the imaging surface 302 with the straight line connecting the coordinate in calculated in step ST21 and the vehicle exterior camera 31a (step ST22).
  • The coordinates on the imaging surface 302 are coordinates in a coordinate system defined so that the position of each point on the imaging surface 302 can be specified.
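  • The geometry of steps ST21 and ST22 could be sketched as follows. Because each virtual projection sphere is centered on the viewpoint position, the intersection of step ST21 lies exactly one radius along the viewing direction; step ST22 is shown under a pinhole-camera assumption, which the patent does not specify, so the projection function is an illustrative choice. Looping these two calls over every display pixel and every sphere reproduces the iteration of steps ST23 to ST26 described next.

```python
import numpy as np

def intersect_projection_sphere(viewpoint, display_point, radius):
    """Step ST21: the half line from the viewpoint through a point pn on
    the display surface meets the viewpoint-centered sphere at distance
    `radius` along its direction (the coordinate in)."""
    d = display_point - viewpoint
    d = d / np.linalg.norm(d)
    return viewpoint + radius * d

def to_imaging_surface(sphere_point, cam_pos, cam_axis, focal_len):
    """Step ST22: intersect the straight line joining the sphere point
    and the camera position with an imaging plane `focal_len` along the
    (unit) camera axis. Returns the coordinate cn, or None if the
    sphere point lies behind the camera."""
    ray = sphere_point - cam_pos
    depth = float(np.dot(ray, cam_axis))
    if depth <= 0.0:
        return None
    return cam_pos + (focal_len / depth) * ray
```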
  • The coordinate conversion unit 6 determines whether all necessary coordinates on the display surface 301 have been converted (step ST23). If not (step ST23; NO), the coordinate conversion unit 6 sets the next coordinate on the display surface 301 (step ST24) and returns to the processing of step ST21.
  • The necessary coordinates are, for example, the coordinates corresponding to all the pixels of the display 34a.
  • When all necessary coordinates on the display surface 301 have been converted (step ST23; YES), the coordinate conversion unit 6 determines whether coordinate conversion has been performed for all the virtual projection spheres (step ST25). If not (step ST25; NO), the coordinate conversion unit 6 sets the next virtual projection sphere (step ST26) and returns to the processing of step ST21. When coordinates have been converted for all the virtual projection spheres (step ST25; YES), the process proceeds to step ST8 of the flowchart of FIG. 6.
  • In the example of FIG. 13, the coordinate conversion unit 6 performs the processing of steps ST21 to ST26 in order for the virtual projection sphere 201, the virtual projection sphere 202, and the virtual projection sphere 203, and repeats the processing of steps ST21 to ST24 for each of the virtual projection spheres 201, 202, and 203.
  • For the virtual projection sphere 201, the coordinate conversion unit 6 calculates, in step ST21, the coordinate i1 of the intersection of the virtual projection sphere 201 with the half line through the viewpoint position 105 and the coordinate p1 on the display surface 301.
  • In step ST22, the coordinate conversion unit 6 calculates the coordinate c1 of the intersection of the imaging surface 302 with the straight line connecting the coordinate i1 and the vehicle exterior camera 31a.
  • Similarly, in step ST21 the coordinate conversion unit 6 calculates the coordinate i2 of the intersection of the virtual projection sphere 201 with the half line through the viewpoint position 105 and the coordinate p2 on the display surface 301, and in step ST22 it calculates the coordinate c2 of the intersection of the imaging surface 302 with the straight line connecting the coordinate i2 and the vehicle exterior camera 31a.
  • In this way, the coordinate conversion unit 6 performs coordinate conversion for all necessary coordinates on the display surface 301.
  • For the virtual projection sphere 202, the coordinate conversion unit 6 calculates, in step ST21, the coordinate i5 of the intersection of the virtual projection sphere 202 with the half line through the viewpoint position 105 and the coordinate p5 on the display surface 301, and in step ST22 it calculates the coordinate c5 of the intersection of the imaging surface 302 with the straight line connecting the coordinate i5 and the vehicle exterior camera 31a.
  • Likewise, in step ST21 the coordinate conversion unit 6 calculates the coordinate i6 of the intersection of the virtual projection sphere 202 with the half line through the viewpoint position 105 and the coordinate p6 on the display surface 301, and in step ST22 it calculates the coordinate c6 of the intersection of the imaging surface 302 with the straight line connecting the coordinate i6 and the vehicle exterior camera 31a.
  • In this way, the coordinate conversion unit 6 performs coordinate conversion for all necessary coordinates on the display surface 301.
  • For the virtual projection sphere 203, the coordinate conversion unit 6 calculates, in step ST21, the coordinate i3 of the intersection of the virtual projection sphere 203 with the half line through the viewpoint position 105 and the coordinate p3 on the display surface 301, and in step ST22 it calculates the coordinate c3 of the intersection of the imaging surface 302 with the straight line connecting the coordinate i3 and the vehicle exterior camera 31a.
  • Likewise, in step ST21 the coordinate conversion unit 6 calculates the coordinate i4 of the intersection of the virtual projection sphere 203 with the half line through the viewpoint position 105 and the coordinate p4 on the display surface 301, and in step ST22 it calculates the coordinate c4 of the intersection of the imaging surface 302 with the straight line connecting the coordinate i4 and the vehicle exterior camera 31a.
  • FIG. 14 is a flowchart showing the operation of the display image generation unit 7 of the display control apparatus 10 according to the first embodiment.
  • the display image generation unit 7 sets the image area of the display image displayed on the display from the position information such as the installation position of the display (step ST31).
  • the display image generation unit 7 acquires the distance value at each pixel position of the distance image generated in step ST4 for the image region set in step ST31 (step ST32).
  • the display image generation unit 7 selects a display image generated using the virtual projection spherical surface corresponding to the distance value acquired in step ST32 from the display image generated in step ST8 (step ST33).
  • the display image generation unit 7 determines whether display images have been selected at all pixel positions of the distance image (step ST34). When the display image is not selected for all pixel positions (step ST34; NO), the process returns to step ST33.
  • step ST34 when display images are selected for all pixel positions (step ST34; YES), the display image generation unit 7 integrates all display images selected in step ST33 to generate a display image. (Step ST35). The display image generation unit 7 outputs the display image generated in step ST35 to the display control unit 8 (step ST36). Thereafter, the process proceeds to step ST10 shown in the flowchart of FIG.
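  • The selection loop of steps ST32 to ST35 could be sketched as follows, assuming the distance image, the per-sphere display images, and the list of sphere radii are already aligned on the display's pixel grid; the array names are illustrative:

```python
import numpy as np

def integrate_display_image(distance_image, sphere_images, sphere_radii):
    """Per pixel, pick the display image generated with the virtual
    projection sphere whose radius matches the pixel's distance value,
    then integrate the selections into one display image (step ST35).

    distance_image: (H, W) distance values (e.g. 201d, 202d, 203d).
    sphere_images:  list of (H, W, 3) per-sphere display images.
    sphere_radii:   radii of the corresponding spheres.
    """
    radii = np.asarray(sphere_radii, dtype=float)
    # Index of the sphere whose radius is closest to each pixel value.
    idx = np.abs(distance_image[..., None] - radii).argmin(axis=-1)
    out = np.empty_like(sphere_images[0])
    for k, img in enumerate(sphere_images):
        out[idx == k] = img[idx == k]
    return out
```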
  • FIG. 15 is a diagram illustrating a display image generated by the display image generation unit 7 of the display control apparatus 10 according to the first embodiment.
  • In step ST31, the display image generation unit 7 sets the size of the image area 401 of the display image that can be displayed on the display 34a, shown in FIG. 15.
  • In step ST32, the display image generation unit 7 sets, for the image area 401 whose size has been set, the distance value 201d at the pixel position 402, the distance value 202d at the pixel position 403, and the distance value 203d at the pixel position 404, for example.
  • Here, only the pixel positions 402, 403, and 404 are shown, but distance values are set at all the pixel positions in the image area 401.
  • In step ST33, the display image generation unit 7 selects, for example, the display image generated using the virtual projection sphere 201 corresponding to the distance value 201d acquired at the pixel position 402. Similarly, it selects the display image generated using the virtual projection sphere 202 corresponding to the distance value 202d acquired at the pixel position 403, and the display image generated using the virtual projection sphere 203 corresponding to the distance value 203d acquired at the pixel position 404.
  • When the display image generation unit 7 has selected display images for all the pixel positions in the image area 401 (step ST34; YES), it generates the display image shown in the image area 401 of FIG. 15. In step ST36, the display image generation unit 7 outputs data representing the generated display image to the display control unit 8.
  • FIG. 16 is a diagram illustrating a display result of the display control apparatus 10 according to the first embodiment.
  • The display control unit 8 performs display control to display the display image of the image area 401 in FIG. 15 on the display 34a arranged on the front pillar 21.
  • Because the display 34a shows the display image of the image area 401, the scenery outside the vehicle that the driver A actually sees from the viewpoint position 105 through the front window 24 and the side window 25 and the display image of the blind spot area shown on the display 34a on the front pillar 21 appear continuous, as shown in FIG. 16. The driver A therefore feels no discomfort about the continuity between the scenery actually seen from the viewpoint position 105 and the display image on the display 34a.
  • As described above, the display control device according to Embodiment 1 includes: the distance image generation unit 5, which, using a captured image of the blind spot area in which the driver's view is blocked by a structure of the vehicle, generates a distance image whose pixel values are the distances from the driver's viewpoint position to the objects located in the blind spot area; the coordinate conversion unit 6, which, using the distance image, sets a plurality of virtual projection spheres defined as the outer surfaces of spheres centered on the driver's viewpoint position in the area around the vehicle, and converts coordinates on the display screen on which the captured image is displayed into coordinates on the imaging surface using the set virtual projection spheres; and the display image generation unit 7, which generates the image to be displayed on the display screen using the converted imaging-surface coordinates. The image of the blind spot area can therefore be displayed continuously with the actual scenery outside the vehicle as seen from the driver's viewpoint position.
  • In Embodiment 1 above, the distance image generation unit 5 sets a distance image having three target areas based on three distance values; however, the number of distance values used is not limited to three.
  • The distance image generation unit 5 may, for example, generate same-target areas in units of pixels. In that case, the coordinate conversion unit 6 sets a virtual projection sphere per pixel and performs coordinate conversion using the virtual projection sphere of each pixel.
  • Conversely, the distance image generation unit 5 may set conditions for generating the distance image so as to reduce the number of virtual projection spheres set by the coordinate conversion unit 6.
  • For example, the distance image generation unit 5 may generate the distance image under the condition that the number of virtual projection spheres is three; under the condition that the three objects with the smallest distance values are the subjects for which same-target areas are generated; under the condition that distance values within ±50 cm are regarded as the same distance value; under the condition that all objects at positions with a distance value of 30 m or more are regarded as the background; or, referring to the frequency of appearance of each distance value, under the condition that a fixed number of areas are generated in descending order of appearance frequency; a sketch of this last condition follows.
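  • Keeping a fixed number of distance values chosen in descending order of appearance frequency could be sketched as follows; the helper name, bin width, and count are illustrative assumptions:

```python
import numpy as np

def dominant_distance_values(depth, k=3, bin_width=1.0):
    """Return the k most frequent quantized distance values in a depth
    map, so that only k virtual projection spheres need to be set."""
    bins = np.round(depth / bin_width) * bin_width
    values, counts = np.unique(bins, return_counts=True)
    order = np.argsort(counts)[::-1]     # most frequent first
    return values[order][:k]
```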
  • FIG. 17 is a block diagram illustrating a configuration of the display control apparatus 10a according to the second embodiment.
  • In addition to the configuration of the display control device 10 shown in Embodiment 1, the display control device 10a according to Embodiment 2 is provided with a vehicle information acquisition unit 9 and includes an image processing unit 4a in place of the image processing unit 4.
  • The image processing unit 4a includes a distance image generation unit 5, a coordinate conversion unit 6a, and a display image generation unit 7a.
  • Parts that are the same as or correspond to the components of the display control device 10 according to Embodiment 1 are denoted by the same reference numerals as in Embodiment 1, and their description is omitted or simplified.
  • The vehicle information acquisition unit 9 acquires vehicle information of the vehicle on which the display control device 10a is mounted via an in-vehicle network (not shown).
  • The vehicle information indicates, for example, the position of the own vehicle, the traveling direction, the vehicle speed, the acceleration, and the steering angle.
  • The information acquired by the vehicle information acquisition unit 9 is input to the image processing unit 4a.
  • The coordinate conversion unit 6a of the image processing unit 4a determines the number of virtual projection spheres to set according to the distance image generated by the distance image generation unit 5 and the vehicle information acquired by the vehicle information acquisition unit 9. For example, referring to the vehicle speed in the vehicle information, the coordinate conversion unit 6a reduces the number of virtual projection spheres when it determines from the acquired vehicle speed that the vehicle is traveling at high speed. By changing the number of virtual projection spheres according to the vehicle information in this way, the coordinate conversion unit 6a can suppress the processing load of the image processing.
  • The display image generation unit 7a changes the image data of the display image it generates according to the vehicle information acquired by the vehicle information acquisition unit 9.
  • Here, the case where the display image generation unit 7a refers to the vehicle speed in the vehicle information will be described.
  • The amount of processing the display image generation unit 7a must perform is determined by the data amount (size) of one frame of the display image to be generated and by the number of frames displayed per second; that is, the achievable update speed of the display image follows from the definition of one frame of the display image.
  • When the processing capability of the display image generation unit 7a is constant and the vehicle is traveling at high speed, a high update speed matters more than the definition of each frame, so the display image generation unit 7a lowers the resolution of the display image it generates and raises the update speed instead. The display image generation unit 7a can thereby suppress the processing load of the image processing.
  • Conversely, when the processing capability of the display image generation unit 7a is constant and the vehicle is traveling at low speed, improving the definition of one frame matters more than the update speed, so the display image generation unit 7a lowers the update speed of the display image it generates and improves its definition. The display image generation unit 7a can thereby generate a display image with improved visibility.
  • The display image generation unit 7a may also be configured to change the number of colors of the display image according to the vehicle information acquired by the vehicle information acquisition unit 9, adjusting the processing amount and the image quality of the display image. The processing of the coordinate conversion unit 6a and the processing of the display image generation unit 7a described above may be performed simultaneously, or only one of them may be performed.
  • The vehicle information acquisition unit 9 of the display control device 10a is realized by the input device 11 of FIG. 2, which inputs information from the outside. The coordinate conversion unit 6a and the display image generation unit 7a of the display control device 10a are realized by a processing circuit.
  • As in Embodiment 1, the functions may be executed by dedicated hardware, or by software, in which case the processing circuit is the CPU 12 that executes a program stored in the memory 13 shown in FIG. 2.
  • FIG. 18 is a flowchart showing the operation of the coordinate conversion unit 6a of the display control apparatus 10a according to the second embodiment.
  • In FIG. 18, the same steps as those in the flowchart of Embodiment 1 shown in FIG. 12 are denoted by the same step numbers, and their description is omitted.
  • FIG. 19 is a diagram illustrating an example of data referred to by the coordinate conversion unit 6a of the display control apparatus 10a according to the second embodiment.
  • The coordinate conversion unit 6a refers to the vehicle information acquired by the vehicle information acquisition unit 9 and sets the number of virtual projection spheres according to the vehicle information (step ST41).
  • For example, the coordinate conversion unit 6a refers to a database (not shown) storing the conditions shown in FIG. 19 and sets the number of virtual projection spheres according to the vehicle speed in the vehicle information.
  • The coordinate conversion unit 6a then performs the processing of steps ST5 and ST6 so that the set number of virtual projection spheres, for example three, are set. Which distance values the coordinate conversion unit 6a uses to set the virtual projection spheres is determined, for example, so that distance values are used in descending order of appearance frequency, as described in Embodiment 1.
  • FIGS. 20A and 20B are diagrams illustrating an example of data referred to by the display image generation unit 7a of the display control apparatus 10a according to Embodiment 2.
  • The display image generation unit 7a refers to a database (not shown) storing the setting conditions shown in FIG. 20A or FIG. 20B and determines the setting conditions for the display image according to the vehicle information.
  • The setting conditions of FIGS. 20A and 20B divide the vehicle speed into three bands (low-speed, medium-speed, and high-speed traveling) and, for each band, give the resolution (definition), the frame rate (update speed), and the number of colors of the display image set by the display image generation unit 7a.
FIG. 20A shows a case where the number of colors of the display image is a fixed value. The display image generation unit 7a refers to the setting conditions of FIG. 20A. When the vehicle is traveling at a low speed, the display image generation unit 7a generates a display image under the conditions of a resolution of 1920 × 960, a frame rate of 30 fps, and 24-bit RGB color. When the vehicle is traveling at a medium speed, it generates a display image under the conditions of a resolution of 1280 × 720, a frame rate of 60 fps, and 24-bit RGB color. When the vehicle is traveling at a high speed, it generates a display image under the conditions of a resolution of 960 × 480, a frame rate of 120 fps, and 24-bit RGB color.
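It is worth noting, as an observation rather than a statement from the source, that the three conditions in FIG. 20A hold the pixel throughput constant, which is consistent with a constant processing capability of the display image generation unit 7a: 1920 × 960 × 30 = 1280 × 720 × 60 = 960 × 480 × 120 = 55,296,000 pixels per second.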
FIG. 20B shows a case where the resolution of the display image is a fixed value. The display image generation unit 7a refers to the database of FIG. 20B. When the vehicle is traveling at a low speed, it generates a display image under the conditions of a resolution of 1280 × 720, a frame rate of 30 fps, and 48-bit RGB color. When the vehicle is traveling at a medium speed, it generates a display image under the conditions of a resolution of 1280 × 720, a frame rate of 60 fps, and 24-bit RGB color. When the vehicle is traveling at a high speed, it generates a display image under the conditions of a resolution of 1280 × 720, a frame rate of 120 fps, and 16-bit YUV color.
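A minimal sketch, again not part of the patent text, of how such a database lookup might be realized; the speed thresholds separating the three stages are assumptions, while the table values are taken from FIGS. 20A and 20B:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DisplaySettings:
    width: int
    height: int
    fps: int
    color_format: str  # e.g. "RGB24", "RGB48", "YUV16"

# Values from FIG. 20A (fixed color depth, varying resolution).
FIG_20A = {
    "low":    DisplaySettings(1920, 960, 30, "RGB24"),
    "medium": DisplaySettings(1280, 720, 60, "RGB24"),
    "high":   DisplaySettings(960, 480, 120, "RGB24"),
}

# Values from FIG. 20B (fixed resolution, varying color depth).
FIG_20B = {
    "low":    DisplaySettings(1280, 720, 30, "RGB48"),
    "medium": DisplaySettings(1280, 720, 60, "RGB24"),
    "high":   DisplaySettings(1280, 720, 120, "YUV16"),
}

def speed_stage(speed_kmh: float) -> str:
    """Classify the vehicle speed; the thresholds are hypothetical."""
    if speed_kmh < 30:
        return "low"
    if speed_kmh < 80:
        return "medium"
    return "high"

def settings_for(speed_kmh: float, table: dict = FIG_20A) -> DisplaySettings:
    """Look up the display image setting conditions for the current speed."""
    return table[speed_stage(speed_kmh)]
```

For example, settings_for(100.0) would return the high-speed row of FIG. 20A (960 × 480 at 120 fps, 24-bit RGB), trading per-frame definition for update speed, while passing table=FIG_20B instead holds the resolution fixed and lowers the color depth.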
In the above description, the coordinate conversion unit 6a and the display image generation unit 7a refer to the vehicle speed in the vehicle information; based on it, the number of projection spheres can be set, and the definition of the display image, the update speed of the display image, and the like can be determined.
As described above, according to Embodiment 2, the vehicle information acquisition unit 9 that acquires the vehicle information indicating the traveling state of the vehicle is provided, and the coordinate conversion unit 6a refers to the vehicle information and sets the number of virtual projection spheres to be generated according to the vehicle information. The load of the coordinate conversion processing can therefore be suppressed.
Likewise, according to Embodiment 2, the vehicle information acquisition unit 9 acquires the vehicle information indicating the traveling state of the vehicle, and the display image generation unit 7a refers to the vehicle information and determines, according to the traveling state of the vehicle, at least one of the definition of the display image to be generated, the update speed of the display image, and the number of colors of the display image. The load when generating the display image can therefore be suppressed.
In the above description, it has been assumed that the distance from the distance measuring sensor 32a to the object in the area, the position information of the outside camera 31a, the position information of the distance measuring sensor 32a, the driver's viewpoint information, and the coordinates on the display 34a are information expressed using three-dimensional spatial coordinates.
Within the scope of the present invention, the embodiments may be freely combined, and any component of each embodiment may be modified or omitted.
Since the display control device according to the present invention can display an image of a region that is a blind spot for the driver continuously with the actual scene viewed from the driver's viewpoint position, it is suitable for displaying an image on a display provided in a structure of a vehicle and can be used to improve the driver's visibility.
1 image input unit, 2 distance information acquisition unit, 3 viewpoint information acquisition unit, 4, 4a image processing unit, 5 distance image generation unit, 6, 6a coordinate conversion unit, 7, 7a display image generation unit, 8 display control unit, 9 vehicle information acquisition unit, 10, 10a display control device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present invention comprises: a distance image generation unit (5) that uses a captured image, in which a blind spot created where the driver's field of view is obstructed by a structure of a vehicle is imaged, to generate a distance image whose pixel values represent the distance from the driver's viewpoint position to an object located in the blind spot; a coordinate conversion unit (6) that uses the generated distance image to set, in an area around the vehicle, a plurality of virtual projection spheres defined as the outer circumferential surfaces of globes centered on the driver's viewpoint position, and uses the plurality of virtual projection spheres to convert coordinates on a display screen for displaying the captured image into coordinates on the imaging surface of the captured image; and a display image generation unit (7) that uses the converted coordinates on the imaging surface to generate the image displayed on the display screen.
PCT/JP2016/058749 2016-03-18 2016-03-18 Dispositif de commande d'affichage et procédé de commande d'affichage Ceased WO2017158829A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/058749 WO2017158829A1 (fr) 2016-03-18 2016-03-18 Dispositif de commande d'affichage et procédé de commande d'affichage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/058749 WO2017158829A1 (fr) 2016-03-18 2016-03-18 Dispositif de commande d'affichage et procédé de commande d'affichage

Publications (1)

Publication Number Publication Date
WO2017158829A1 (fr) 2017-09-21

Family

ID=59851079

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/058749 Ceased WO2017158829A1 (fr) 2016-03-18 2016-03-18 Dispositif de commande d'affichage et procédé de commande d'affichage

Country Status (1)

Country Link
WO (1) WO2017158829A1 (fr)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005269010A (ja) * 2004-03-17 2005-09-29 Olympus Corp 画像生成装置、画像生成プログラム、及び画像生成方法
JP2006135797A (ja) * 2004-11-08 2006-05-25 Matsushita Electric Ind Co Ltd 車両用周囲状況表示装置
JP2006270175A (ja) * 2005-03-22 2006-10-05 Megachips System Solutions Inc 車載画像記録システム
JP2007015667A (ja) * 2005-07-11 2007-01-25 Denso Corp 道路撮像装置

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109552177A (zh) * 2017-09-26 2019-04-02 电装国际美国公司 用于环境动画和在界面上投射环境动画的系统和方法
CN109552177B (zh) * 2017-09-26 2022-02-18 电装国际美国公司 用于环境动画和在界面上投射环境动画的系统和方法
CN113342914A (zh) * 2021-06-17 2021-09-03 重庆大学 一种用于地球仪区域检测的数据集获取及自动标注的方法
CN113744353A (zh) * 2021-09-15 2021-12-03 合众新能源汽车有限公司 盲区图像生成方法、设备和计算机可读介质
CN114125378A (zh) * 2021-10-28 2022-03-01 重庆利龙科技产业(集团)有限公司 基于单目摄像头与距离传感器的透明a柱系统及实现方法

Similar Documents

Publication Publication Date Title
US10899277B2 (en) Vehicular vision system with reduced distortion display
JP7342197B2 (ja) 撮像装置及び撮像装置の制御方法
US11472338B2 (en) Method for displaying reduced distortion video images via a vehicular vision system
JP6724982B2 (ja) 信号処理装置および撮像装置
JP5953824B2 (ja) 車両用後方視界支援装置及び車両用後方視界支援方法
JP5321711B2 (ja) 車両用周辺監視装置および映像表示方法
JP2009081664A (ja) 車両用周辺監視装置および映像表示方法
US11273763B2 (en) Image processing apparatus, image processing method, and image processing program
US11827148B2 (en) Display control device, display control method, moving body, and storage medium
JPWO2018003532A1 (ja) 物体検出表示装置、移動体及び物体検出表示方法
US11438531B2 (en) Imaging apparatus and electronic equipment
US12417642B2 (en) System to integrate high distortion wide-angle camera recognition with low distortion normal-angle camera recognition
US12387455B2 (en) Image processing system, mobile object, image processing method, and storage medium, with output of image recognition result integrtated on basis of first result regarding image recognition on at least partial region and second result regarding image recognition on wider region
US10455159B2 (en) Imaging setting changing apparatus, imaging system, and imaging setting changing method
JP7756636B2 (ja) 撮像装置
TWI842952B (zh) 攝像裝置
JP7551699B2 (ja) カメラシステムおよびその制御方法、プログラム
US11375136B2 (en) Imaging device for high-speed read out, method of driving the same, and electronic instrument
WO2017158829A1 (fr) Dispositif de commande d'affichage et procédé de commande d'affichage
US11778316B2 (en) Imaging apparatus
WO2022219874A1 (fr) Dispositif et procédé de traitement de signaux, et programme
WO2020153262A1 (fr) Dispositif de mesure et dispositif de télémétrie

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16894447

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 16894447

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP