WO2024120665A1 - Surround view system for a vehicle
- Publication number
- WO2024120665A1 (application PCT/EP2023/073613)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optical sensor
- data
- vehicle
- angle
- view
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4038—Image mosaicing, e.g. composing plane images from plane sub-images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/10—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used
- B60R2300/105—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of camera system used using multiple cameras
- B60R2300/60—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
- B60R2300/607—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective from a bird's eye viewpoint
Definitions
- the invention relates to a surround view system for a vehicle, a usage of a processing unit in a surround view system for a vehicle, a method for rendering a surround view of a vehicle, a vehicle comprising such a surround view system, a computer program and a non-transitory computer readable medium.
- Vehicle surround view systems rely on a static installation including relative positions and angles with respect to the vehicle body of their cameras.
- the cameras are calibrated to these positions and angles.
- a change of a relative position or an angle might lead to a distorted view, blind spots and other effects, which disturb the surround view or make the view of a camera unusable and the surround view incomplete.
- the described embodiments similarly relate to the surround view system for a vehicle, the usage of a processing unit in a surround view system for a vehicle, the method for rendering a surround view of a vehicle, the vehicle comprising such a surround view system, the computer program and the non-transitory computer readable medium. Synergistic effects may result from various combinations of the embodiments, although they may not be described in detail.
- a surround view system for a vehicle comprises a processing unit, which is configured to receive first optical sensor data from an optical sensor.
- the optical sensor data contain first data where the optical sensor has a first angle with respect to the vehicle body.
- the processing unit is further configured to calibrate the optical sensor using the first optical sensor data, to construct a view using the first optical sensor data of the calibrated optical sensor, and to render a surround view using the view.
- the processing unit is further configured to receive second optical sensor data from the optical sensor.
- the second optical sensor data contain second data where the optical sensor has a second angle with respect to the vehicle body.
- the processing unit is further configured to estimate the difference of the first angle and the second angle, to reconstruct the view of the optical sensor using the second optical sensor data of the optical sensor having the second angle, and to render the surround view using the reconstructed view.
- first data relates to optical sensor data, i.e., image data that are captured when the camera is not pivoted.
- second data relates to optical sensor data, i.e., image data that are captured when the optical sensor is pivoted.
- the surround view is updated repeatedly a plurality of times with further first data, when the optical sensor is not pivoted, and updated repeatedly a plurality of times with further second data, i.e. further image data, when the optical sensor is pivoted.
- this means that the second angle can be determined, and that a view with the second angle can be transformed into a view with the first angle.
- transformation matrices or rotation matrices may be used to transform a view with a second angle to a view with a first angle.
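As a concrete illustration of such a rotation-based correction, the following Python sketch warps an image captured at the second angle back to the first angle via the homography H = K·R·K⁻¹. It assumes a pinhole camera with known intrinsics K and a pivot about the camera's vertical axis; the function names and the use of OpenCV/NumPy are illustrative choices, not taken from the patent.

```python
import numpy as np
import cv2  # OpenCV, assumed available for the image warp

def rotation_about_vertical(angle_deg: float) -> np.ndarray:
    """Rotation about the camera's vertical axis, i.e., a pure pan."""
    a = np.deg2rad(angle_deg)
    return np.array([[ np.cos(a), 0.0, np.sin(a)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

def rotate_view_back(image: np.ndarray, K: np.ndarray,
                     angle_diff_deg: float) -> np.ndarray:
    """Warp a view captured at the second angle back to the first angle.

    For a pure rotation R between the two poses, the image-to-image mapping
    is the homography H = K @ R @ inv(K). The translational part of a door
    movement is neglected here; this is a fair approximation for scene
    points far away from the camera.
    """
    R = rotation_about_vertical(angle_diff_deg)
    H = K @ R @ np.linalg.inv(K)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```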
- the first and second angles may relate to a common local vehicle coordinate system.
- a surround view system of a vehicle composes a surround view, for example a 360° view, using a plurality of single views.
- Each single view is obtained by a respective camera capturing a section of the total surround view.
- the sections of the plurality of cameras usually overlap and the processing unit recognizes overlapping areas and matches them to combine or merge adjacent views. Areas not covered by any of the views can be, for example, interpolated if the area is small, or drawn as a solid color area or pattern.
- the cameras are calibrated such that their relative position and, in particular, the orientation is exactly defined, so that the images of the different cameras can be merged seamlessly and the presented objects can be displayed at a correct angle and position on the surround view.
- the processing unit of the present disclosure is configured to overcome the problem that the image cannot be matched, such that the views cannot be combined anymore and a blind spot occurs. For that, the processing unit takes into account that the second data is data of a view at a different angle. The processing unit “corrects” the view by rotating it from the second angle back to a view at the first angle.
- the processing unit then can process the view and find matching adjacent areas so that the views of the plurality of cameras can be combined and used for rendering the surround view. Since the view of the pivoted or rotated camera might not cover the complete side of the vehicle, there might still remain a blind spot. However, this remaining blind spot is small compared to the blind spot caused by discarding the complete view or image of the pivoted camera.
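The matching of adjacent areas mentioned above could, for example, be done with feature correspondences. The sketch below uses ORB features and a RANSAC-estimated homography, which is a common stitching approach but only an assumption here; the patent does not prescribe a particular matching method.

```python
import numpy as np
import cv2

def merge_adjacent_views(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Match the overlap of two adjacent views and stitch them together."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(left, None)
    kp2, des2 = orb.detectAndCompute(right, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

    # Points in the right view (source) and their matches in the left view.
    src = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Warp the right view into the left view's frame and overlay the left
    # view; a production system would blend the seam instead of pasting.
    h, w = left.shape[:2]
    canvas = cv2.warpPerspective(right, H, (w * 2, h))
    canvas[:h, :w] = left
    return canvas
```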
- the camera may be a camera arranged, for example, at a door of a vehicle.
- An event related to pivoting the camera is, for example, an action leading to an opening of the door, a vehicle movement or the actuation of the door itself. Therefore, the “event” in this example is not necessarily the opening of the door itself but an event, which may be followed by the opening of the door.
- the processing unit may be part of a human-machine interface (HMI) or a driving assistance system.
- the processing unit may further comprise one or more processors capable of processing signals, images and videos and/or of providing data and signals to a display.
- the processing unit may further comprise interfaces to, for example, a data storage and to external units, such as other driving assistance systems and/or directly or indirectly accessible sensor units.
- when, for example, the door is opened, the camera follows a circular curve with the midpoint being the hinge of the door and undergoes a change in both position and orientation, which can be expressed as a combination of rotation and translation.
- the term “pivot” takes into account such a movement but may also include a rotation only.
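The combination of rotation and translation described for a door-mounted camera can be written down directly: the camera's mounting point travels on a circle around the hinge while its orientation turns by the door angle. A minimal sketch, assuming a vertical hinge axis (z-axis of the vehicle frame) and illustrative names:

```python
import numpy as np

def pivoted_camera_pose(cam_pos: np.ndarray, hinge_pos: np.ndarray,
                        door_angle_deg: float) -> tuple[np.ndarray, np.ndarray]:
    """Rotation matrix and new position of a door-mounted camera.

    The camera position is rotated about the hinge, so the pose change is a
    rotation (of the orientation) plus a translation (along the arc).
    """
    a = np.deg2rad(door_angle_deg)
    R = np.array([[np.cos(a), -np.sin(a), 0.0],
                  [np.sin(a),  np.cos(a), 0.0],
                  [0.0,        0.0,       1.0]])
    new_pos = hinge_pos + R @ (cam_pos - hinge_pos)
    return R, new_pos
```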
- the surround view system further comprises the optical sensor, wherein the optical sensor is mounted on a pivotable part of the vehicle body.
- Pivotable parts may be, for example, doors, a retractable mirror, or a trunk lid.
- the optical sensor may be a camera such as a digital or analog camera, or a device using, for example, CCD or CMOS chips.
- the event is a change of a vehicle state.
- the processing unit is further configured to monitor the vehicle state and, when it detects that the vehicle is in a first vehicle state, to perform the receiving of the first optical sensor data from the optical sensor, the calibrating of the optical sensor using the first optical sensor data, the constructing of a view using the first optical sensor data of the calibrated optical sensor, and the rendering of the surround view using the view.
- the processing unit is further configured, when it detects that the vehicle is in a second vehicle state, to perform the receiving of the second optical sensor data from the optical sensor, the estimating of the difference of the first angle and the second angle, the reconstructing of the view of the optical sensor, and the rendering of the surround view using the reconstructed view.
- the event is detected by monitoring the vehicle state.
- the surround view is composed of the images of the calibrated cameras without change regarding the orientation of the camera.
- the rotation angle is estimated and the surround view is rendered using the reconstructed view.
- the vehicle state is whether the electrical system is switched on or off, the motor is running or not running, and/or the vehicle is moving or not moving.
- a door may be opened when a vehicle stops or shortly after a vehicle has been stopped.
- the mirrors on the side of a vehicle may be retracted or extended when the electrical system is switched on, for example when the key is inserted into the ignition lock.
- the processing unit is further configured to detect the event, such as the change of the vehicle state, using the optical sensor data sensed by the optical sensor, which may pivot.
- the processing unit evaluates the images that show a movement relative to a road and which therefore indicate that the motor is on and that the vehicle is moving.
- the processing unit evaluates the images of an optical sensor mounted at a side mirror or a door and detects that the images show an increasing part of the vehicle body.
- the processing unit detects a speed of relative rotation of objects in the surrounding that is higher than that of the vehicle driving along a curve.
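One way to cast these image-based heuristics into code is to compare the apparent pan of the whole scene between consecutive frames with the largest pan that normal cornering could produce. The dense-optical-flow approach and the threshold below are illustrative assumptions, not details from the patent:

```python
import numpy as np
import cv2

MAX_CORNERING_PAN_DEG = 1.5  # assumed per-frame bound while driving a curve

def camera_pivot_detected(prev_gray: np.ndarray, curr_gray: np.ndarray,
                          focal_px: float) -> bool:
    """Flag a pivot when the whole view pans faster than cornering allows."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    # The median horizontal flow approximates a pan of the entire view,
    # as opposed to the mixed motion field of a normally driving vehicle.
    pan_px = float(np.median(flow[..., 0]))
    pan_deg = np.rad2deg(np.arctan2(pan_px, focal_px))
    return abs(pan_deg) > MAX_CORNERING_PAN_DEG
```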
- the processing unit is further configured to detect the event using external vehicle data or external sensor data.
- the external vehicle data may be received from devices, sensors or a control unit of the vehicle, which are external to the surround view system but inside or attached to the vehicle.
- the external data may further be received from auxiliary devices such as a navigation system or a driver assistance system.
- the data may contain, for example, information about the movement of the vehicle, whether the electrical system is switched on, or the motor is on.
- the movement information may be, for example, information about a speed, an angle of the wheels, or a path of the vehicle.
- the term “data” also includes digital or analog signals in this disclosure.
- the event is a pivoting of the pivotable part.
- the processing unit is further configured to detect the pivoting of the pivotable part by receiving external sensor data.
- the term “external” means external with respect to the surround view system.
- the external sensor data is provided, for example, by one or more sensors detecting whether a door is closed, such as a Hall sensor, a contact sensor or a proximity sensor, or by a control device for an actuator for, for example, retracting a side mirror.
- the control device may have a communication or signal link to the processing unit for indicating an actuation.
- the control device may also provide information about a rotation angle of the controlled device.
- the processing unit is further configured to estimate the difference between the first and the second angle using stored vehicle data and/or external sensor data.
- the difference between the first and the second angle may be, for example, binary, relating to a binary monitored vehicle state.
- the possible vehicle states that are detected by the processing unit may be “vehicle is moving” and “vehicle is not moving”. Since, in this example, no further state information is available, the difference between the first and the second angle is translated into a “door closed” angle, for example 0°, or a “door opened” angle, for example, 70°.
- This information may be contained in a data storage of a device of the vehicle, which is accessed by the processing unit, or may be contained in a data storage of the surround view system.
- the processing unit may also receive and use measurement data of a sensor that measures the current door opening angle. The processing unit then may calculate the view according to this angle information.
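Both estimation paths described above, the binary state fallback and the direct sensor reading, fit into a few lines. The 0° and 70° values echo the example given earlier in the text; everything else is an illustrative assumption:

```python
from typing import Optional

DOOR_CLOSED_DEG = 0.0   # stored "door closed" angle from the example above
DOOR_OPEN_DEG = 70.0    # stored "door opened" angle from the example above

def estimate_angle_difference(vehicle_moving: bool,
                              measured_door_angle_deg: Optional[float] = None
                              ) -> float:
    """Difference between the first and the second camera angle in degrees."""
    if measured_door_angle_deg is not None:
        # A door-angle sensor provides the current opening directly.
        return measured_door_angle_deg
    # Binary fallback: assume the door is closed while the vehicle moves
    # and fully opened once it has stopped.
    return DOOR_CLOSED_DEG if vehicle_moving else DOOR_OPEN_DEG
```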
- the pivotable part of the vehicle body is a door of the vehicle.
- the door may be, for example, the door next to the seat of the driver or of a passenger. If external sensor data are used, the sensors may provide information about which seat is occupied. Further, a seat belt sensor may provide information on which seat a release of the seat belt has occurred, so that an opening of the corresponding door can be assumed.
- the surround view system further comprises a plurality of optical sensors, and the processing unit is further configured to receive optical data from the plurality of optical sensors and to render the surround view using also the optical data of the plurality of optical sensors.
- the further optical sensors may be mounted, for example, on the front side, the rear side, the edges or anywhere else at the vehicle body.
- in a usage of a processing unit in a surround view system for a vehicle, the processing unit receives first optical sensor data from an optical sensor.
- the optical sensor data contain first data where the optical sensor has a first angle with respect to the vehicle body; the processing unit calibrates the optical sensor using the first optical sensor data and constructs a view using the optical sensor data of the calibrated optical sensor.
- the processing unit detects an event related to pivoting the optical sensor and receives upon detecting the event second optical sensor data from the optical sensor.
- the second optical sensor data contain second data where the optical sensor has a second angle with respect to the vehicle body.
- the processing unit estimates the difference of the first angle and the second angle, reconstructs the view of the optical sensor using the second optical sensor data of the optical sensor having the second angle, and renders the surround view using the reconstructed view.
- a method for rendering a surround view of a vehicle comprises the following steps.
- first optical sensor data are received by the processing unit from an optical sensor, the first optical sensor data containing first data where the optical sensor has a first angle with respect to the vehicle body.
- the optical sensor is calibrated by the processing unit using the first optical sensor data and a view is constructed using the first optical sensor data of the calibrated optical sensor.
- an event related to pivoting the optical sensor is detected.
- second optical sensor data are received from the optical sensor, the second optical sensor data containing second data where the optical sensor has a second angle with respect to the vehicle body.
- the difference of the first angle and the second angle is estimated.
- the view of the optical sensor is reconstructed using the second optical sensor data of the optical sensor having the second angle and the surround view is rendered using the reconstructed view. By reconstructing the view and using this view, the previous view is discarded or at least not used for rendering the surround view.
- if no event is detected, the surround view is rendered using the view, i.e., no angle has to be determined and the view does not have to be reconstructed.
- a vehicle comprising a surround view system as described herein is provided.
- a program is provided that, when executed by a processor, causes the processor to implement the method for rendering a surround view of a vehicle as described herein.
- a non-transitory computer readable medium having stored thereon a program that when executed by a processor causes the processor to implement the method for rendering a surround view of a vehicle is provided.
- the computer readable medium may be seen as a storage medium or memory device, such as for example, a USB stick, a CD, a DVD, a data storage device, a hard disk, or any other medium on which a program element as described above can be stored.
- a memory device may include, but is not limited to, a non-transitory computer-readable medium, such as flash memory, a random access memory (RAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), and non-volatile RAM (NVRAM).
- non-transitory computer-readable media is intended to be representative of any tangible, computer-readable media, including, without limitation, non-transitory computer storage devices, including, without limitation, volatile and non-volatile media, and removable and non-removable media such as firmware, physical and virtual storage, CD-ROMs, DVDs, and any other digital source such as a network or the Internet, as well as yet-to-be-developed digital means, with the sole exception being a transitory, propagating signal.
- a floppy disk, a compact disc read-only memory (CD-ROM), a magneto-optical disk (MOD), a digital versatile disc (DVD), or any other computer-based device implemented in any method or technology for short-term and long-term storage of information, such as computer-readable instructions, data structures, program modules and submodules, or other data.
- the methods described herein may be encoded as executable instructions, e.g., “software” and “firmware,” embodied in a non-transitory computer-readable medium.
- the terms “software” and “firmware” are interchangeable, and include any computer program stored in memory for execution by personal computers, workstations, clients and servers. Such instructions, when executed by a processor, cause the processor to perform at least a portion of the methods described herein.
- Fig. 1 shows a sketch of an ideal surround view
- Fig. 2a shows a sketch of the surround view where the camera is pivoted
- Fig. 2b shows a sketch of the surround view where the camera view of the pivoted camera is eliminated
- Fig. 3 shows a sketch of the surround view after correction
- Fig. 4 shows a flow diagram of a method for rendering a surround view of a vehicle
- Fig. 5 shows a block diagram of a surround view system
- Fig. 6 shows a block diagram of a method for rendering a surround view of a vehicle.
- Fig. 1 shows the surround view of a vehicle 600, where the vehicle 600 is in a state, where the doors are closed.
- the surround view is represented by a bowl view, in which images from several cameras 511, 512, 513, 514 are combined into a single view.
- the cameras 511, 512, 513, 514 are mounted on the vehicle body. In this disclosure, a distinction is made between the rigid, i.e., fixed, part of the vehicle body and movable or pivotable parts of the vehicle body such as doors and rotatable mirrors. Some of the cameras 511, 512, 513, 514 are mounted on such movable or pivotable parts.
- the cameras 511, 512, 513, 514 are extrinsically calibrated with respect to the vehicle body.
- the calibration may be performed under reference conditions.
- Reference conditions may be present in a defined surrounding with known objects and geometries with respect to these objects.
- the objects may include markings on walls or the floor or street.
- conditions for calibration may also be present, for example, when the vehicle 600 is in movement and all movable or rotatable parts are in a position corresponding to a driving scenario. That is, the doors are closed and the mirrors are adjusted. Therefore, the cameras 511, 512, 513, 514 may be re-calibrated also after the calibration under reference conditions.
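Extrinsic calibration against such known markings is classically a perspective-n-point problem. A minimal sketch, assuming the intrinsic matrix K and distortion coefficients are already known and that the 3D marker positions are given in the vehicle or world frame; the function name is an illustrative choice:

```python
import numpy as np
import cv2

def calibrate_extrinsics(marker_world: np.ndarray,   # (N, 3) known 3D points
                         marker_image: np.ndarray,   # (N, 2) detected pixels
                         K: np.ndarray,
                         dist: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Recover the camera rotation and translation from known markings."""
    ok, rvec, tvec = cv2.solvePnP(marker_world.astype(np.float32),
                                  marker_image.astype(np.float32), K, dist)
    if not ok:
        raise RuntimeError("extrinsic calibration failed")
    R, _ = cv2.Rodrigues(rvec)  # axis-angle vector to rotation matrix
    return R, tvec
```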
- Fig. 1 shows a surround view rendered from the views of the cameras 511, 512, 513, 514 under such calibrated or re-calibrated conditions.
- the squares in the figure may be interpreted as a common coordinate system of the calibrated cameras, which can be mapped to the vehicle body coordinate system. In the calibrated state, they are all oriented in exactly the same direction.
- the thick lines 108 show the orientation of the camera view of camera 511.
- Lines 112 represent a scale for distances. Some columns 106, 110 are shown, which provide a 3D impression. A column 110 is shown at the intersection between the views of two cameras 511, 512, where the views merge. Lines 104 show the field of view of camera 512.
- Fig. 2a shows a surround view when a door - in Fig. 2a, the door on the left side of the vehicle - is open.
- the orientation of the view of camera 511 differs as can be seen by the thick lines and the orientation of the squares.
- the cameras still use the extrinsics of the calibration as described above.
- the camera coordinate system of camera 511 is rotated and displaced with respect to the common surround view coordinate system. Lines 108 and 202 show the orientation of the view.
- the changed view is also recognizable in 3D by columns 106.
- the processing unit that processes the images of the camera and produces the surround view now detects that the vehicle is in a state where the door is open.
- the detection may be based on information from the cameras or on external information.
- the processing unit may, for example, evaluate the images of the surround view cameras. For example, it may detect that the same objects are visible to two cameras, which normally should not be the case.
- the processing unit may detect, by comparing an image with a previous image from this camera 511 and/or from the further cameras 512, 513, 514, that a change of vehicle movement from driving to standstill or vice versa has taken place.
- this information may further be combined with external information, for example that the motor has stopped.
- external information from other sensors, controllers or driver assistance systems may be used.
- Fig. 2b shows a sketch of the surround view where the camera is pivoted, and the angle is not taken into account.
- the view of the pivoted camera cannot be used for rendering the surround view and hence is eliminated. This results in a surround view where nearly the complete left side is a blind spot.
- Fig. 3 shows the processed and rendered surround view, where the angle of the pivoted camera 511 has been taken into account and the processing unit has corrected the view as shown in Fig. 2a by the current door angle.
- the views are aligned again and the surround view can be composed using all camera views.
- the remaining blind spot in the surround view is caused by the field of view of camera 511 , which may have an opening angle of 180° or less, and which therefore does not cover the complete left side when the door is open.
- Lines 108 and 202 again show the orientation of the view, which differs from the orientation of the pivoted camera. In addition to the angle, the distances may also be fitted.
- Fig. 4 shows a flow diagram representing the method 400 for rendering a surround view of a vehicle.
- the flow diagram serves as an overview. The steps have been described in detail above, so the detailed description is not repeated at this point.
- in step 402, the electrical system of the vehicle and the motor are switched on. The vehicle may start to move at this step, for example, or shortly after the next one or two steps.
- in step 404, the surround view system is switched on.
- the surround view system may be a human machine interface with a processing unit and a display to which optical sensors, which are in the following also referred to as “cameras”, are connected providing the optical sensor data, which are in the following also referred to as “images” or “image data”, for rendering the surround view.
- in step 406, the processing unit receives the first image data of a live camera feed. Each image represents a view of the corresponding camera.
- in step 408, the processing unit calibrates the cameras using the first image data, and in step 410, the processing unit constructs the view, also using the first image data.
- the first image data are captured when the camera is not pivoted, for example, when the vehicle is moving.
- in step 412, the processing unit monitors the vehicle state and/or the vehicle dynamics. The vehicle state or vehicle dynamics may be obtained by evaluating the images of the cameras or by receiving external device or sensor information. If the state of the vehicle has not changed, the flow returns to step 406; if a change is detected, the flow continues with step 414.
- in step 414, the processing unit receives second optical sensor data from the optical sensor.
- the second optical sensor data contain second data where the optical sensor has a second angle with respect to the vehicle body due to pivoting the optical sensor. It is assumed that, at the point in time when the vehicle stops, the door is not yet opened. The “first” image therefore might show a view where the door is still closed. The second image may thus be captured with a delay with respect to the capturing of the first image.
- in step 416, the angle of the open door with respect to the closed door is estimated. This may happen, for example, using external sensor data or data contained in a data storage.
- in step 418, the view is reconstructed by taking into account the estimated angle.
- in step 420, the surround view is rendered using the reconstructed view of the pivoted camera, and the flow jumps back to step 406.
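Taken together, steps 406 to 420 form a loop, sketched below in Python. Every attribute of the assumed `processing_unit` object is a placeholder named after the step it stands for; none of these names come from the patent:

```python
def surround_view_loop(cameras, processing_unit):
    """Structural sketch of method 400; all interfaces are assumptions."""
    first_images = [cam.capture() for cam in cameras]          # step 406
    processing_unit.calibrate(first_images)                    # step 408
    views = processing_unit.construct_views(first_images)      # step 410
    while True:
        if not processing_unit.vehicle_state_changed():        # step 412
            processing_unit.render_surround_view(views)
            continue
        second_images = [cam.capture() for cam in cameras]     # step 414
        angle = processing_unit.estimate_angle_difference()    # step 416
        views = processing_unit.reconstruct_views(second_images,
                                                  angle)       # step 418
        processing_unit.render_surround_view(views)            # step 420
```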
- Fig. 5 shows a block diagram of a surround view system 500 comprising the processing unit 502 to which optical sensors 511, 512, 513, and 514 are connected. Further, the processing unit has access to a data storage 510, where the first optical sensor data, the second optical sensor data, and further data such as collected sensor data are at least temporarily stored, and has an interface 520 to external devices. The processing unit may also access external data by accessing external data storages or by receiving data directly from external sensors or devices 522.
- Fig. 6 shows a block diagram of a vehicle 600 comprising such a surround view system 500 with a processing unit 502 and optical sensors 511...514.
- the processing unit 502 may receive data from external sensors or external devices 522.
- the terms “processor” and “computer” and related terms, e.g., “processing device,” “computing device,” and “controller,” are not limited to just those integrated circuits referred to in the art as a computer, but broadly refer to a processor, a processing device, a controller, a general purpose central processing unit (CPU), a graphics processing unit (GPU), a microcontroller, a microcomputer, a programmable logic controller (PLC), a reduced instruction set computer (RISC) processor, a field programmable gate array (FPGA), a digital signal processing (DSP) device, an application specific integrated circuit (ASIC), and other programmable circuits or processing devices capable of executing the functions described herein; these terms are used interchangeably herein.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP23762235.2A EP4631240A1 (en) | 2022-12-05 | 2023-08-29 | Surround view system for a vehicle |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2218221.6 | 2022-12-05 | ||
| GB2218221.6A GB2625252A (en) | 2022-12-05 | 2022-12-05 | Surround view system for a vehicle |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024120665A1 true WO2024120665A1 (en) | 2024-06-13 |
Family
ID=84926497
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/EP2023/073613 Ceased WO2024120665A1 (en) | 2022-12-05 | 2023-08-29 | Surround view system for a vehicle |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4631240A1 (en) |
| GB (1) | GB2625252A (en) |
| WO (1) | WO2024120665A1 (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170341583A1 (en) * | 2016-05-27 | 2017-11-30 | GM Global Technology Operations LLC | Systems and methods for towing vehicle and trailer with surround view imaging devices |
| US20180164831A1 (en) * | 2016-12-09 | 2018-06-14 | Lg Electronics Inc. | Around view monitoring apparatus for vehicle, driving control apparatus, and vehicle |
| US10654423B2 (en) * | 2011-04-25 | 2020-05-19 | Magna Electronics Inc. | Method and system for dynamically ascertaining alignment of vehicular cameras |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020145581A (en) * | 2019-03-06 | 2020-09-10 | パナソニックIpマネジメント株式会社 | Display control device, display system, and display control method |
- 2022-12-05: GB application GB2218221.6A published as GB2625252A (active, pending)
- 2023-08-29: PCT application PCT/EP2023/073613 published as WO2024120665A1 (not active, ceased)
- 2023-08-29: EP application EP23762235.2A published as EP4631240A1 (active, pending)
Also Published As
| Publication number | Publication date |
|---|---|
| GB202218221D0 (en) | 2023-01-18 |
| EP4631240A1 (en) | 2025-10-15 |
| GB2625252A (en) | 2024-06-19 |
Legal Events
- 121: the EPO has been informed by WIPO that EP was designated in this application (ref document 23762235, country EP, kind code A1)
- ENP: entry into the national phase (ref document 2025532089, country JP, kind code A)
- WWE: WIPO information, entry into national phase (ref document 2025532089, country JP)
- WWE: WIPO information, entry into national phase (ref document 2023762235, country EP)
- NENP: non-entry into the national phase (country DE)
- ENP: entry into the national phase (ref document 2023762235, country EP, effective date 2025-07-07)
- WWP: WIPO information, published in national office (ref document 2023762235, country EP)