WO2024155407A1 - Displays with exterior views - Google Patents
- Publication number
- WO2024155407A1 (PCT/US2023/084853)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- view
- vehicle
- window
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/20—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used
- B60R2300/205—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of display used using a head-up display
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0138—Head-up displays characterised by optical features comprising image capture systems, e.g. camera
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B2027/0192—Supplementary details
- G02B2027/0194—Supplementary details with combiner of laminated type, for optical or mechanical aspects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B2027/0192—Supplementary details
- G02B2027/0196—Supplementary details having transparent supporting structure for display mounting, e.g. to a window or a windshield
Definitions
- Cameras 34 may be placed in locations that would otherwise need mirrors to allow user 36 to view objects that are outside of the user’s forward-looking field of view.
- a first camera 34 may be located on left side L of body 12 and may be configured to capture a left side mirror view of region 58L to the left of body 12.
- a second camera 34 may be located on right side R of body 12 and may be configured to capture a right side mirror view of region 58R to the right of body 12.
- a third camera 34 may be located on back side B of body 12 and may be configured to capture a rear mirror view of rear region 58B behind vehicle 10.
- camera 34 on top surface T of body 12 may be configured to capture a three-hundred-and-sixty-degree view of the area surrounding vehicle 10, may be configured to capture a birdseye view of vehicle 10, and/or may be configured to capture other fields of view of vehicle 10 and/or its surroundings.
- One or more displays 30 on window 14 may be configured to display a live camera feed of the images captured by cameras 34.
- one or more of displays 30 may display live camera views of left side region 58L, right side region 58R, and rear region 58B behind vehicle 10.
- different displays 30 may be configured to display the different camera views.
- a first display 30 may display a live camera view of left side region 58L
- a second display 30 may display a live camera view of right side region 58R
- a third display 30 may display a live camera view of back side region 58B.
- the different displays 30 that display the different camera views may be located in different portions of window 14, if desired.
- a camera view of left side region 58L may be displayed on a display 30 located on a left portion of window 14
- a camera view of right side region 58R may be displayed on a display 30 located on a right portion of window 14
- a camera view of back region 58B may be displayed on a display 30 located in an upper central portion of window 14.
- the different camera views may all be displayed by the same display 30 (at the same time or at different times).
- the different camera views such as camera views 58L, 58R, 58B, and/or other camera views may be stitched together or tiled together to create a panoramic view or other integrated view of the environment surrounding vehicle 10.
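As a rough, editorial illustration of the tiling idea in the bullet above (not part of the patent disclosure; the frame shapes, NumPy representation, and helper names are assumptions), the following sketch resizes the left, rear, and right camera frames to a common height and places them side by side to form a single wrap-around strip:

```python
import numpy as np

def tile_mirror_views(left_frame: np.ndarray,
                      rear_frame: np.ndarray,
                      right_frame: np.ndarray,
                      strip_height: int = 240) -> np.ndarray:
    """Tile three camera frames into one panoramic strip (hypothetical helper).

    Each frame is an H x W x 3 array. All frames are resized to a common
    height and concatenated left-to-right so the strip reads as a
    wrap-around view of the vehicle's surroundings.
    """
    def resize_to_height(frame: np.ndarray, height: int) -> np.ndarray:
        # Nearest-neighbour resampling keeps the sketch dependency-free.
        h, w, _ = frame.shape
        new_w = max(1, round(w * height / h))
        rows = np.linspace(0, h - 1, height).astype(int)
        cols = np.linspace(0, w - 1, new_w).astype(int)
        return frame[rows][:, cols]

    tiles = [resize_to_height(f, strip_height)
             for f in (left_frame, rear_frame, right_frame)]
    return np.concatenate(tiles, axis=1)  # side-by-side panorama
```

A real implementation would more likely use calibrated warping and blending between overlapping fields of view; simple concatenation is only meant to show the tiling concept.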
- camera views may be positioned on window 14 based on where user 36 is looking (e.g., so that camera views can be displayed in a location that aligns with the user’s gaze on window 14).
- camera views may be positioned on window 14 based on the locations of real-world objects in front of vehicle 10. For example, if an object (e.g., a vehicle, a person, or other obstruction) is visible through window 14, the camera views may be positioned on window 14 so as to avoid obstructing the user’s field of view of the object.
- Gaze tracking sensors or other user-position-tracking sensors may be used to track the position of user 36 while forward-facing sensors such as LIDAR sensors, cameras, or other sensors may be used to track the position of objects in front of vehicle 10 so that camera views can be displayed on window 14 without obstructing the user’s field of view of objects in front of vehicle 10.
- control circuitry 26 may be configured to dynamically adjust the position of the camera views being displayed on window 14 based on sensor information, based on user input, based on vehicle settings, based on user position, based on the position of nearby vehicles, people, obstructions, road signs, etc., based on the urgency of the displayed information (e.g., whether the camera views should be urgently checked by the user to avoid a collision), and/or based on other information.
- user 36 may seamlessly view real-world content through window 14 while also viewing the captured camera views on window 14.
- camera views may be displayed on window 14 in positions that align with the real world being viewed through window 14 to provide an immersive experience in which the user can observe both the road ahead and the surrounding camera views captured by cameras 34 at a glance.
- a right edge of a left side camera view may be aligned with a left edge of the user’s forward-facing field of view
- a left edge of a right side camera view may be aligned with a right edge of the user’s forward-facing field of view. This allows the user to observe an integrated, wrap-around field of view of vehicle 10 and its surroundings on window 14.
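The placement behavior described in the preceding bullets can be pictured as a scoring problem: candidate on-window regions are penalized for overlapping the projected positions of detected real-world objects and rewarded for sitting near the tracked gaze point. The sketch below is an editorial illustration under those assumptions; the data structures, candidate regions, and weights are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple

# (x, y, width, height) in normalized window coordinates, 0..1
Rect = Tuple[float, float, float, float]

@dataclass
class PlacementInputs:
    gaze_point: Tuple[float, float]  # tracked gaze position on the window
    object_rects: List[Rect]         # window regions behind which objects are visible

def _overlap_area(a: Rect, b: Rect) -> float:
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    w = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    h = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    return w * h

def choose_view_position(candidates: List[Rect], inputs: PlacementInputs) -> Rect:
    """Pick the candidate region that least obstructs detected objects while
    staying close to the user's gaze (weights are arbitrary for illustration)."""
    def cost(rect: Rect) -> float:
        x, y, w, h = rect
        cx, cy = x + w / 2.0, y + h / 2.0
        gx, gy = inputs.gaze_point
        gaze_distance = ((cx - gx) ** 2 + (cy - gy) ** 2) ** 0.5
        obstruction = sum(_overlap_area(rect, obj) for obj in inputs.object_rects)
        return 10.0 * obstruction + gaze_distance  # lower cost is better
    return min(candidates, key=cost)
```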
- Displays 30 may be configured to continuously display the live video feeds from cameras 34, or displays 30 may only display the live video feeds from cameras 34 in response to user input and/or control signals from control circuitry 26.
- control circuitry 26 may determine when to display a camera view (and which camera view to display) on display 30 based on turn signals, vehicle speed, proximity sensor data or other sensor data, and/or other information.
- control circuitry 26 may use display 30 to automatically display a camera view of left side region 58L in response to a left turn signal being activated, in response to sensing lateral acceleration of the vehicle towards the left side, in response to a proximity sensor in vehicle 10 detecting an object or person approaching left side L of body 12, and/or in response to other information.
- Control circuitry 26 may use display 30 to automatically display a camera view of right side region 58R in response to a right turn signal being activated, in response to sensing lateral acceleration of the vehicle towards the right side, in response to a proximity sensor in vehicle 10 detecting an object or person approaching right side R of body 12, and/or in response to other information.
- Control circuitry 26 may use display 30 to automatically display a camera view of rear region 58B in response to vehicle 10 being placed in a reverse gear, in response to a rapid deceleration of vehicle 10, in response to a proximity sensor in vehicle 10 detecting an object or person approaching back side B of body 12, and/or in response to other information.
- the live camera feed may be activated when it appears that the user has not appropriately checked mirrors 56 and/or has not observed other blind spot warnings.
- Vehicle 10 may have blind spot indicator lights that turn on whenever another vehicle is located in the user’s blind spot (e.g., region 58L or region 58R). If user 36 begins to change lanes when the blind spot indicator light is turned on, control circuitry 26 may then activate the live video feed on window 14 as a last attempt to warn the user of a vehicle or other object in the user’s blind spot (e.g., left side view 58L may be activated and displayed on window 14 in response to a user moving to the left despite the left side blind spot indicator light being turned on).
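As an editorial sketch of the kind of rule-based selection the preceding bullets attribute to control circuitry 26 (the input names, thresholds, and structure below are assumptions, not the patent's implementation), the following routine maps turn-signal, acceleration, gear, proximity, and blind-spot inputs to the set of mirror views to show:

```python
from dataclasses import dataclass
from typing import Set

@dataclass
class VehicleState:
    left_turn_signal: bool = False
    right_turn_signal: bool = False
    lateral_accel: float = 0.0        # m/s^2; negative = leftward, positive = rightward
    in_reverse: bool = False
    deceleration: float = 0.0         # m/s^2; positive while braking
    left_proximity_alert: bool = False
    right_proximity_alert: bool = False
    rear_proximity_alert: bool = False
    left_blind_spot_occupied: bool = False
    right_blind_spot_occupied: bool = False

def select_camera_views(state: VehicleState,
                        lateral_threshold: float = 1.5,
                        decel_threshold: float = 4.0) -> Set[str]:
    """Return the set of mirror views ("left", "right", "rear") to show."""
    views: Set[str] = set()
    # Left view: left turn signal, leftward drift, an approaching object on the
    # left, or a leftward move while the left blind spot is occupied.
    if (state.left_turn_signal
            or state.lateral_accel < -lateral_threshold
            or state.left_proximity_alert
            or (state.left_blind_spot_occupied and state.lateral_accel < 0.0)):
        views.add("left")
    # Right view: mirror image of the left-side conditions.
    if (state.right_turn_signal
            or state.lateral_accel > lateral_threshold
            or state.right_proximity_alert
            or (state.right_blind_spot_occupied and state.lateral_accel > 0.0)):
        views.add("right")
    # Rear view: reverse gear, rapid deceleration, or an object approaching from behind.
    if (state.in_reverse
            or state.deceleration > decel_threshold
            or state.rear_proximity_alert):
        views.add("rear")
    return views
```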
- side view cameras 34 may be co-located with side view mirrors such as side view mirrors 56.
- a left side view camera 34 that captures a left side mirror view of region 58L may be co-located with a left side view mirror 56
- a right side view camera 34 that captures a right side mirror view of region 58R may be co-located with a right side view mirror 56.
- Rear view camera 34 may be placed on back side B of vehicle 10 or may be placed inside of vehicle 10 and co-located with a rear view mirror.
- the field of view of each side view camera 34 may be similar to the field of view of the co-located side view mirror 56, or cameras 34 may have different fields of view than mirrors 56.
- side view mirrors 56 and/or other mirrors such as a rear view mirror inside of vehicle 10 may be eliminated and replaced by cameras 34 (e.g., vehicle 10 may be free of side view mirrors and/or a rear view mirror, if desired).
- FIG. 4 is a front view of an illustrative window 14 (e.g., when viewed from the perspective of user 36 of FIG. 3) in an arrangement where multiple displays 30 in different portions of window 14 are used to display different mirror views.
- displays 30 may be located in different portions of window 14.
- Display 30L may be located in a lower left corner region of window 14 and may be configured to display images of left side region 58L captured by a first one of cameras 34.
- Display 30R may be located in a lower right corner region of window 14 and may be configured to display images of right side region 58R captured by a second one of cameras 34.
- Display 30B may be located in an upper central region of window 14 and may be configured to display images of rear region 58B captured by a third one of cameras 34. Displays 30 may be located in other portions of window 14 and/or may be configured to display different camera views.
- FIG. 4 is merely illustrative.
- FIG. 5 is a front view of an illustrative window 14 (e.g., when viewed from the perspective of user 36 of FIG. 3) in an arrangement where a single display 30 spans across most (e.g., more than half) or all of window 14.
- display 30 may be configured to display a live camera feed from a forward-facing camera (e.g., on front portion F of body 12) that captures images of the region in front of vehicle 10.
- the user may view the live camera feed of the environment in front of the vehicle instead of or in addition to looking through window 14 to view the environment directly.
- Control circuitry 26 may use different portions of display 30 to display the different camera views from cameras 34. Most of display 30 may be reserved for captured images of a forward-facing field of view (where vehicle 10 is headed), while captured images of left side mirror view 58L, right side mirror view 58R, and rear mirror view 58B may be displayed on smaller portions of display 30.
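One way to picture the allocation described above is as a fixed layout in normalized display coordinates, with most of the display reserved for the forward-facing feed and small tiles for the mirror views. The fractions and coordinate convention below are editorial, illustrative assumptions only, not details from the patent.

```python
# Normalized display coordinates: (x, y, width, height), with (0, 0) at the
# lower-left corner of the windshield display and (1, 1) at the upper-right.
WINDSHIELD_LAYOUT = {
    "forward": (0.00, 0.00, 1.00, 1.00),  # forward-facing feed fills the display
    "left":    (0.02, 0.02, 0.18, 0.18),  # left mirror view, lower-left tile
    "right":   (0.80, 0.02, 0.18, 0.18),  # right mirror view, lower-right tile
    "rear":    (0.40, 0.82, 0.20, 0.15),  # rear mirror view, upper-central tile
}

def region_in_pixels(view: str, display_width: int, display_height: int):
    """Convert a named layout region to pixel coordinates for compositing."""
    x, y, w, h = WINDSHIELD_LAYOUT[view]
    return (round(x * display_width), round(y * display_height),
            round(w * display_width), round(h * display_height))
```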
- FIG. 6 is a front view of an illustrative window 14 (e.g., when viewed from the perspective of user 36 of FIG. 3) in an arrangement where a single display 30 spans across less than half of window 14 and is used to display the one or more different mirror views captured by cameras 34. Control circuitry 26 may switch between different camera views (e.g., camera views 58L, 58R, 58B, etc.) or may display two or more of the different camera views at the same time.
- display 30 may display a live video feed such as live video feed 38.
- Live video feed 38 may include images captured by one or more of cameras 34.
- control circuitry 26 may overlay computer-generated display content such as computer-generated display content 40 onto live video feed 38.
- Computer-generated display content 40 may include symbols, text, alerts, warnings, virtual images, navigation-related information (e.g., navigation instructions, arrows, points of interest, etc.), highlighted, colored, or otherwise modified portions of real-world objects in live video feed 38, and/or other computer-generated display content.
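To make the overlay step concrete, the editorial sketch below alpha-blends an RGBA layer of computer-generated content (symbols, navigation arrows, warnings, highlights) onto a live video frame. The array-based representation is an assumption; the patent does not specify how compositing is performed.

```python
import numpy as np

def overlay_display_content(frame_rgb: np.ndarray,
                            overlay_rgba: np.ndarray) -> np.ndarray:
    """Alpha-blend an RGBA overlay (symbols, text, warnings, navigation
    arrows, highlights) onto a live camera frame of the same size."""
    if frame_rgb.shape[:2] != overlay_rgba.shape[:2]:
        raise ValueError("frame and overlay must have matching dimensions")
    alpha = overlay_rgba[..., 3:4].astype(np.float32) / 255.0
    base = frame_rgb.astype(np.float32)
    content = overlay_rgba[..., :3].astype(np.float32)
    blended = alpha * content + (1.0 - alpha) * base
    return blended.astype(np.uint8)
```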
- lower-resolution display output such as visual indicator 44 may be located adjacent to display 30 to highlight the content that is being displayed on display 30 and/or to convey other information.
- visual indicator 44 may be activated when control circuitry 26 switches display 30 from an informational display mode (e.g., displaying speed, mileage, heading information, etc.) to a left side mirror view of region 58L (e.g., when a left turn signal is activated).
- Visual indicator 44 may blink or provide other visual output when an object approaching vehicle 10 comes within the field of view of one of cameras 34.
- the color, frequency, and intensity of light emitted from visual indicator 44 may be adjusted in any suitable manner. These examples are merely illustrative.
- FIGS. 7, 8, and 9 show side views of different types of displays 30 that may be integrated into window 14 to display integrated mirror views on window 14. These examples are merely illustrative. If desired, other types of displays may be integrated into window 14 for displaying images captured by cameras 34 and/or for displaying other content.
- In the example of FIG. 7, display 30 is a head-up display.
- a user’s eye 42 may look in the positive Y-direction through window 14 (e.g., through the front windshield) while operating vehicle 10.
- head-up display 30 may include an optical combiner such as optical combiner 52 and a light source such as projector 32 or other light source.
- Projector 32 may emit light 48 (e.g., for forming images) towards optical combiner 52 on region 46 of window 14.
- the images emitted by head-up display 30 may be viewable on region 46 of window 14.
- Light 48 reflects off of combiner 52 of window 14 towards eye 42, as shown by reflected light 50.
- Optical combiner 52 may be used to combine image light from projector 32 and light from the real world that is passing through window 14. In this way, the user within vehicle 10 may view real-world objects through window 14 while also viewing images from projector 32 (e.g., images captured by cameras 34) on window 14.
- Combiner 52 may be a separate optical combiner formed on the surface of window 14, or combiner 52 may be formed as an integral part of window 14.
- optical combiner 52 may be omitted.
- projector 32 may project images onto window 14 and the images may reflect off of the window surface (e.g., a glass surface or other surface) towards eye 42.
- In the example of FIG. 8, display 30 is a surface-laminated display having display layer 54 laminated to a surface of window 14.
- Display layer 54 may include an array of pixels such as liquid crystal pixels, polymer-dispersed liquid crystal pixels, organic light-emitting diode pixels, and/or display pixels based on other display technologies.
- In the example of FIG. 9, display 30 is an embedded display that is embedded within window 14.
- Window 14 may include window layers such as structural window layers 14-1 and 14-2 (e.g., layers of glass or other transparent material).
- Adhesive layer 14-3 may be interposed between outer structural layers 14-1 and 14-2 and may attach the outer structural layers together.
- Display 30 may be located between outer structural layers 14-1 and 14-2.
- Display 30 may include an array of pixels such as liquid crystal pixels, polymer-dispersed liquid crystal pixels, organic light-emitting diode pixels, and/or display pixels based on other display technologies.
- a system in accordance with an embodiment includes a vehicle body having left and right sides, a front windshield coupled to the body, a first camera on the vehicle body that is configured to capture a left side view of a first region adjacent to the left side of the vehicle body, a second camera on the vehicle body that is configured to capture a right side view of a second region adjacent to the right side of the vehicle body, and a display on the front windshield that is configured to display the left side view captured by the first camera and the right side view captured by the second camera.
- the display includes a head-up display.
- the head-up display includes a projector configured to project the left side view and the right side view onto the front windshield.
- the display includes a surface-laminated display that is laminated to a surface of the front windshield.
- the front windshield includes first and second structural layers and the display includes an embedded display that is interposed between the first and second structural layers.
- the first and second structural layers include glass.
- the system includes a controller configured to display the left side view on the display in response to a left turn signal being activated and to display the right side view on the display in response to a right turn signal being activated.
- the left side view and the right side view are stitched together on the display to form an integrated view of an environment surrounding the vehicle body.
- the system includes a rear view camera that captures a rear view, and the display on the front windshield is configured to display the rear view.
- the system includes a controller that is configured to overlay computer-generated display content onto at least one of the left side view and the right side view on the display.
- a vehicle in accordance with an embodiment includes a vehicle body having an opening, a window in the opening, a camera mounted to the vehicle body and configured to capture images of an environment outside of the vehicle body, and a display integrated into the window that is configured to display a live video feed of the images captured by the camera.
- the display is selected from the group consisting of: a head-up display, a surface-laminated display, and an embedded display.
- the window includes a front windshield and the images of the environment include images of a left side region to the left of the vehicle body.
- the vehicle includes an additional camera mounted to a right side of the vehicle body that is configured to capture additional images of a right side region to the right of the vehicle body, and an additional display integrated into the window that is configured to display an additional live video feed of the additional images captured by the additional camera.
- the live video feed is positioned on the front windshield based on at least one of: a gaze position and a position of an external object in front of the vehicle.
- a vehicle in accordance with an embodiment includes a vehicle body having an opening, first and second cameras mounted to opposing sides of the vehicle body, a front windshield in the opening, and a display integrated into the front windshield and configured to display a live video feed of images captured by the first and second cameras.
- the vehicle includes a controller configured to determine when to display the live video feed of the images on the display based on at least one of: turn signal information, user input, and sensor data.
- the images include a left side view and right side view and the display is configured to display the left side view in response to sensing lateral acceleration to the left and to display the right side view in response to sensing lateral acceleration to the right.
- the controller is configured to overlay computer-generated display content onto the live video feed.
- real-world objects are viewable through the front windshield from a forward-facing field of view and the live video feed is positioned on the window to align with the forward-facing field of view.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Closed-Circuit Television Systems (AREA)
Abstract
Systems may include a body and windows. One or more cameras may be mounted to the body. One or more displays may be integrated into a window and may be configured to display a live video feed of images captured by the cameras. A left camera may be mounted to a left side of the body and may be configured to capture a left side view. A right camera may be mounted to a right side of the body and may be configured to capture a right side view. The display on the window may be used to display the left side view and/or the right side view.
Description
Displays with Exterior Views
This application claims priority to U.S. provisional patent application No. 63/480,089, filed January 16, 2023, which is hereby incorporated by reference herein in its entirety.
Field
[0001] This relates generally to systems, and, more particularly, to systems with displays.
Background
- [0002] Systems such as vehicles may include windows such as a front windshield and may also include mirrors such as side view mirrors and a rear view mirror. Mirrors such as these may be used to check for cars or other objects in the vicinity of the vehicle. In a conventional vehicle, mirrors are located separately from the front windshield, so a driver or other passenger in the vehicle must look away from the front windshield to check the mirrors before changing lanes, braking, or taking other actions. If care is not taken, the risk of collisions may be increased when the user is looking away from the front windshield in order to check the mirrors.
Summary
[0003] Systems such as vehicles may include a vehicle body with one or more openings. Windows may be mounted in the openings. The windows may include a front windshield through which a user views the environment in front of the vehicle.
[0004] One or more displays may be integrated into the vehicle windows such as the front windshield. The display may be a head-up display, a surface-laminated display, an embedded display, and/or any other suitable type of display.
- [0005] The display in the front windshield may be configured to display images captured by a camera in the vehicle. For example, a left camera may be mounted to a left side of the body and may be configured to capture a left side view (sometimes referred to as a left side mirror view). The left camera may be co-located with a left side view mirror or may replace a left side view mirror. A right camera may be mounted to a right side of the body and may be configured to capture a right side view (sometimes referred to as a right side mirror view). The right camera may be co-located with a right side view mirror or may replace a right side view mirror.
[0006] The one or more displays in the front windshield may be configured to display a live video feed of the left side view captured by the left camera and the right side view captured by the right camera. Control circuitry (e.g., a controller, microprocessor, etc.) in the vehicle may automatically display the live video feed of the left side view when a left turn signal is activated and may automatically display the live video feed of the right side view when a right turn signal is activated. The left and right side views may be displayed on separate respective displays on the front windshield or may be displayed by a single common display on the front windshield.
[0007] The display on the front windshield may allow the user to check for cars, people, or other obstructions around the vehicle (e.g., before changing lanes or taking other actions) without having to divert the user’s gaze away from the front windshield.
[0008] Additional cameras may be mounted to the vehicle body such as a rear view camera, a birdseye view camera, a three-hundred-and-sixty-degree camera, etc. The display on the front windshield may display live video feeds from the different cameras at the same time or at different times depending on turn signal information, speedometer information, user input, proximity sensor information or other sensor information, etc.
Brief Description of the Drawings
[0009] FIG. 1 is a top view of an illustrative system with a window in accordance with some embodiments.
[0010] FIG. 2 is a schematic diagram of an illustrative vehicle with a window and a display in accordance with some embodiments.
[0011] FIG. 3 is a top view of an illustrative vehicle with one or more cameras that capture mirror views and one or more windows with integrated displays for displaying the mirror views in accordance with some embodiments.
[0012] FIG. 4 is a front view of an illustrative window with discrete integrated displays for displaying one or more different mirror views on the window in accordance with some embodiments.
[0013] FIG. 5 is a front view of an illustrative window with an integrated display that extends across most or all of the window in accordance with some embodiments.
- [0014] FIG. 6 is a front view of an illustrative window with an integrated display that displays a live camera feed and computer-generated display content in accordance with some embodiments.
[0015] FIG. 7 is a side view of an illustrative window with a head-up display for displaying a mirror view on the window in accordance with some embodiments.
[0016] FIG. 8 is a side view of an illustrative window with a surface-laminated display for displaying a mirror view on the window in accordance with some embodiments.
[0017] FIG. 9 is a side view of an illustrative window with an embedded display for displaying a mirror view on the window in accordance with some embodiments.
Detailed Description
[0018] Systems may be provided with windows. For example, a vehicle or other system may have a window such as a front windshield that a user (e.g., a driver, a rider, or other passenger) in the vehicle looks through to see where the vehicle is headed.
[0019] One or more displays may be integrated into a vehicle window such as a front windshield or other window. The display may be a head-up display, a surface-laminated display, an embedded display, or other suitable display. The user may be able to view the display on the window without needing to shift the user’s gaze away from the window.
[0020] The vehicle may include cameras mounted to one or more different locations of the vehicle. The cameras may be configured to capture images and/or video of the vehicle’s surroundings and/or of the vehicle itself. Cameras may, for example, be co-located with mirrors and/or may replace mirrors entirely. Cameras may include a left side camera that captures a left side mirror view, a right side camera that captures a right side mirror view, a rear view camera that captures a rear mirror view, and/or other cameras that capture other views of the vehicle’s surroundings or the vehicle itself (e.g., a camera may capture a birdseye view of the vehicle, if desired).
[0021] The one or more different mirror views captured by the cameras may be displayed by the one or more displays that are integrated into the window. The window may contain a single display for displaying multiple different mirror views, or the window may contain multiple displays each configured to display a different respective one of the mirror views. This allows the user to check the left side mirror view, the right side mirror view, the rear mirror view, and/or other camera views without diverting the user’s gaze away from the window.
- [0022] Each mirror view on the window may be a live camera feed (e.g., live video feed) of the images captured by a respective one of the cameras. If desired, control circuitry (e.g., a controller, microprocessor, etc.) in the vehicle may be configured to overlay computer-generated display content onto the live camera feed and/or may be configured to supplement or replace the live camera feed with computer-generated display content. The computer-generated display content may include symbols, text, navigation-related display content, virtual images, warnings, alerts, highlighted or otherwise modified portions of the real-world content in the live camera feed, and/or other display content.
[0023] An illustrative system of the type that may include a window with an integrated mirror view is shown in FIG. 1. System 10 may be a vehicle, building, or other type of system. FIG. 1 is a top view of an illustrative vehicle 10. In the example of FIG. 1, vehicle 10 is the type of vehicle that may carry passengers (e.g., an automobile, truck, or other automotive vehicle). Configurations in which vehicle 10 is a robot (e.g., an autonomous robot) or other vehicle that does not carry human passengers may also be used. Vehicles such as automobiles may sometimes be described herein as an example.
[0024] Vehicle 10 may be manually driven (e.g., by a human driver), may be operated via remote control, and/or may be autonomously operated (e.g., by an autonomous driving system or other autonomous propulsion system). Using vehicle sensors such as lidar, radar, visible and/or infrared cameras (e.g., two-dimensional and/or three-dimensional cameras), proximity (distance) sensors, and/or other sensors, an autonomous driving system and/or driver-assistance system in vehicle 10 may perform automatic braking, steering, and/or other operations to help avoid pedestrians, inanimate objects, and/or other external structures on roadways.
- [0025] Vehicle 10 may include a body such as vehicle body 12. Body 12 may include vehicle structures such as body panels formed from metal and/or other materials, may include doors 18, a hood, side body panels, a trunk, fenders, a chassis to which wheels are mounted, a roof, etc. Windows (sometimes referred to as glazings) such as window 14 may be coupled to body 12 and may be configured to cover openings in body 12. Windows may be formed in doors 18 and other portions of vehicle body 12 (e.g., on the sides of vehicle body 12, on the roof of vehicle 10, and/or in other portions of vehicle 10). FIG. 1 shows a window 14 formed on a front portion F of vehicle 10 (e.g., a front windshield). Windows (e.g., window 14), doors 18, and other portions of body 12 may separate interior region 20 of vehicle 10 from the exterior environment (e.g., exterior region 16) that is surrounding vehicle 10. Doors 18 may be opened and closed to allow people to enter and exit vehicle 10. Seats and other structures may be formed in the interior of vehicle body 12.
- [0026] Motorized window positioners may be used to open and close windows 14, if desired. The windows in system 10 such as window 14 may include a front window mounted within an opening in body 12 at the front of a vehicle (e.g., a front windshield), a moon roof (sun roof) window or other window extending over some or all of the top of a vehicle, a rear window at the rear of a vehicle, and/or side windows on the sides of a vehicle. Window 14 may be flat or window 14 may have one or more curved portions. The area of each window 14 in system 10 may be at least 0.1 m², at least 0.5 m², at least 1 m², at least 5 m², at least 10 m², less than 20 m², less than 10 m², less than 5 m², or less than 1.5 m² (as examples).
[0027] Vehicle 10 may have automotive lighting such as one or more headlights (sometimes referred to as headlamps), driving lights, fog lights, daytime running lights, turn signals, brake lights, and/or other lights. In some cases, vehicle 10 may include an exterior display that is configured to display content at the exterior of the vehicle.
[0028] As shown in the schematic diagram of FIG. 2, vehicle 10 may also include components such as propulsion and steering systems 24 (e.g., wheels, an engine, a motor, brakes, a steering wheel, etc.). Propulsion and steering systems 24 may include manually adjustable driving systems and/or autonomous driving systems having wheels coupled to body 12, steering controls, one or more motors for driving the wheels, etc. In FIG. 1, the propulsion and steering systems 24 may propel the vehicle forward in the positive Y- direction.
- [0029] Vehicle 10 also includes control circuitry 26 and input-output devices 28. Control circuitry 26 (sometimes referred to as a controller, microprocessor, or central processing unit) may be configured to run an autonomous driving application, a navigation application (e.g., an application for displaying maps on a display), and software for controlling vehicle climate control devices, lighting, media playback, window movement, door operations, sensor operations, and/or other vehicle operations. For example, the control circuitry 26 may form part of an autonomous driving system that drives vehicle 10 on roadways autonomously using data such as sensor data. The control circuitry may include processing circuitry and storage and may be configured to perform operations in vehicle 10 using hardware (e.g., dedicated hardware or circuitry), firmware and/or software. Software code for performing operations in vehicle 10 and other data is stored on non-transitory computer readable storage media (e.g., tangible computer readable storage media) in the control circuitry. The software code may sometimes be referred to as software, data, program instructions, computer instructions, instructions, or code. The non-transitory computer readable storage media may include non-volatile memory such as non-volatile random-access memory, one or more hard drives (e.g., magnetic drives or solid-state drives), one or more removable flash drives or other removable media, or other storage. Software stored on the non-transitory computer readable storage media may be executed on the processing circuitry of control circuitry 26. The processing circuitry may include application-specific integrated circuits with processing circuitry, one or more microprocessors, a central processing unit (CPU) or other processing circuitry.
- [0030] Input-output devices 28 may include displays, light-emitting diodes and other light-emitting devices, haptic devices, speakers, and/or other devices for providing output. Output devices may, for example, be used to provide vehicle occupants and others with haptic output, audio output, visual output (e.g., displayed content, light, etc.), and/or other suitable output. The input-output devices 28 may also include input devices such as buttons, sensors, and other devices for gathering user input, for gathering environmental measurements, for gathering information on vehicle operations, and/or for gathering other information. The sensors may include ambient light sensors, touch sensors (e.g., on a touch-sensitive display or separate from a display), force sensors, proximity sensors, optical sensors such as cameras operating at visible, infrared, and/or ultraviolet wavelengths (e.g., fisheye cameras, two-dimensional cameras, three-dimensional cameras, and/or other cameras), capacitive sensors, resistive sensors, ultrasonic sensors (e.g., ultrasonic distance sensors), microphones, radiofrequency sensors such as radar sensors, lidar (light detection and ranging) sensors, door open/close sensors, seat pressure sensors and other vehicle occupant sensors, window sensors, position sensors for monitoring location, orientation, and movement, speedometers, and/or other sensors.
[0031] During operation, control circuitry 26 may gather information from sensors and/or other input-output devices 28 such as lidar data, camera data (e.g., two-dimensional images), radar data, and/or other sensor data. This information may be used by an autonomous driving system and/or driver’s assistance system in vehicle 10.
[0032] As shown in FIG. 2, input-output devices 28 may include a display such as display
30. Display 30 may be integrated into a window such as window 14 of FIG. 1, may be integrated into a mirror, and/or may be located in other portions of vehicle 10. Display 30 may include a pixel array with liquid crystal display pixels, organic light-emitting diode pixels, electrophoretic ink pixels, and/or pixels based on any other suitable display technology.
[0033] In arrangements where display 30 is integrated into a window such as window 14, display 30 may be a surface-laminated display that is laminated to an outer surface of window 14, may be an embedded display that is embedded within window 14, may be a head-up display in which images are projected onto window 14 and reflected towards an eye box for viewing by a user, and/or may be any suitable display that is viewable on window 14. [0034] In a head-up display, a projector unit or other light source may emit image light onto the window. Window 14 may serve as an optical combiner that combines the image light projected onto the window by the projector and the real-world light that passes through window 14 (in the negative Y-direction of FIG. 1). In this way, the users of the vehicle may observe the images being displayed on window 14 by display 30 while also viewing real-world objects through window 14. In some arrangements, a head-up display may include a projector unit that projects images onto a window that does not include any optical combiners. The images may reflect off of the window surface towards the user.
[0035] Input-output devices 28 may include one or more cameras such as cameras 34. Cameras 34 may be visible light cameras, two-dimensional cameras, three-dimensional cameras, and/or any other suitable type of camera. Cameras 34 may be configured to capture images of the environment surrounding vehicle 10 and/or images of vehicle 10 itself. Cameras 34 may be located in any suitable location on vehicle 10. Cameras 34 may be mounted on an interior (e.g., interior 20) or exterior portion (e.g., exterior 16) of body 12 of vehicle 10. During operation of vehicle 10, display 30 on window 14 may be configured to display a live camera feed of images captured with cameras 34.
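For illustration only, the following sketch shows one minimal way a live camera feed could be relayed to a window display. It uses OpenCV purely as a stand-in for the vehicle's camera and display interfaces; the camera index, window name, and the use of OpenCV itself are assumptions for this sketch and are not part of the specification.

```python
import cv2  # OpenCV, assumed available for this illustration


def show_live_feed(camera_index=0, window_name="window-display"):
    """Hypothetical sketch: relay frames from one camera 34 to a display 30."""
    capture = cv2.VideoCapture(camera_index)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break  # camera unavailable; a real system would fail over gracefully
            # imshow stands in for the window-integrated display on window 14.
            cv2.imshow(window_name, frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    finally:
        capture.release()
        cv2.destroyAllWindows()
```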
[0036] FIG. 3 is a top view of vehicle 10 showing illustrative locations for cameras 34 and displays 30 in vehicle 10. As shown in FIG. 3, vehicle 10 may include cameras 34. There may be one, two, three, four, five, ten, more than ten, or fewer than ten cameras 34. With one illustrative arrangement that is sometimes described herein as an example, cameras 34 are visible light cameras with image sensors formed from a two-dimensional array of pixels. [0037] Cameras 34 may be located in any suitable location on vehicle 10. Cameras 34 may
be mounted on an interior or exterior portion of body 12 of vehicle 10. In the example of FIG. 3, cameras 34 include a left side view camera 34 mounted on left side L of body 12, a right side view camera 34 mounted on right side R of body 12, a rear view camera 34 mounted on back side B of body 12, and a top view camera 34 mounted on top side T of body 12. There may be more than one camera mounted to a respective side of body 12. Cameras 34 may be mounted to front side F of body 12, inside of body 12, and/or in any other suitable location on vehicle 10. The example of FIG. 3 is merely illustrative.
[0038] One or more displays 30 on front windshield window 14 may be configured to display the images captured by cameras 34. User 36 (e.g., a driver, rider, passenger, etc.) may view the images captured by cameras 34 on displays 30 on window 14. This allows user 36 to view objects, obstructions, people, or other items in the vicinity of vehicle 10 without having to divert the user’s gaze away from window 14.
[0039] Cameras 34 may be placed in locations that would otherwise need mirrors to allow user 36 to view objects that are outside of the user’s forward-looking field of view. For example, a first camera 34 may be located on left side L of body 12 and may be configured to capture a left side mirror view of region 58L to the left of body 12. A second camera 34 may be located on right side R of body 12 and may be configured to capture a right side mirror view of region 58R to the right of body 12. A third camera 34 may be located on back side B of body 12 and may be configured to capture a rear mirror view of rear region 58B behind vehicle 10. If desired, camera 34 on top surface T of body 12 may be configured to capture a three-hundred-and-sixty-degree view of the area surrounding vehicle 10, may be configured to capture a bird's-eye view of vehicle 10, and/or may be configured to capture other fields of view of vehicle 10 and/or its surroundings.
[0040] One or more displays 30 on window 14 may be configured to display a live camera feed of the images captured by cameras 34. For example, one or more of displays 30 may display live camera views of left side region 58L, right side region 58R, and rear region 58B behind vehicle 10. If desired, different displays 30 may be configured to display the different camera views. For example, a first display 30 may display a live camera view of left side region 58L, a second display 30 may display a live camera view of right side region 58R, and a third display 30 may display a live camera view of back side region 58B. The different displays 30 that display the different camera views may be located in different portions of window 14, if desired. For example, a camera view of left side region 58L may be displayed
on a display 30 located on a left portion of window 14, a camera view of right side region 58R may be displayed on a display 30 located on a right portion of window 14, and a camera view of back region 58B may be displayed on a display 30 located in an upper central portion of window 14. In other arrangements, the different camera views may all be displayed by the same display 30 (at the same time or at different times). For example, the different camera views such as camera views 58L, 58R, 58B, and/or other camera views may be stitched together or tiled together to create a panoramic view or other integrated view of the environment surrounding vehicle 10.
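For illustration only, a minimal sketch of tiling several camera views into one integrated strip follows. Simple resizing and horizontal concatenation stand in for true panoramic stitching (which would blend overlapping fields of view); the function name, the fixed output height, and the use of OpenCV/NumPy are assumptions for this sketch, not part of the specification.

```python
import cv2
import numpy as np


def tile_views(left, rear, right, height=240):
    """Hypothetical sketch: tile left, rear, and right camera frames side by side."""

    def scale(frame):
        h, w = frame.shape[:2]
        return cv2.resize(frame, (int(w * height / h), height))

    # A production system would blend overlapping fields of view; plain tiling is
    # used here only to illustrate forming one integrated view of the surroundings.
    return np.hstack([scale(left), scale(rear), scale(right)])
```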
[0041] If desired, camera views may be positioned on window 14 based on where user 36 is looking (e.g., so that camera views can be displayed in a location that aligns with the user’s gaze on window 14). In other arrangements, camera views may be positioned on window 14 based on the locations of real-world objects in front of vehicle 10. For example, if an object (e.g., a vehicle, a person, or other obstruction) is visible through window 14, the camera views may be positioned on window 14 so as to avoid obstructing the user’s field of view of the object. Gaze tracking sensors or other user-position-tracking sensors may be used to track the position of user 36, while forward-facing sensors such as lidar sensors, cameras, or other sensors may be used to track the position of objects in front of vehicle 10 so that camera views can be displayed on window 14 without obstructing the user’s field of view of objects in front of vehicle 10. In general, control circuitry 26 may be configured to dynamically adjust the position of the camera views being displayed on window 14 based on sensor information, based on user input, based on vehicle settings, based on user position, based on the position of nearby vehicles, people, obstructions, road signs, etc., based on the urgency of the displayed information (e.g., whether the camera views should be urgently checked by the user to avoid a collision), and/or based on other information.
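For illustration only, the sketch below shows one way a camera view could be placed so that it avoids detected objects and stays near the user's gaze. The candidate-slot representation (x0, y0, x1, y1 rectangles in window coordinates), the function name, and the nearest-to-gaze tie-break are assumptions made for this sketch and are not taken from the specification.

```python
def choose_view_position(candidate_slots, object_boxes, gaze_point=None):
    """Hypothetical sketch: pick a windshield slot for a camera view that does not
    cover detected objects and, optionally, stays near the user's gaze point."""

    def overlaps(a, b):
        ax0, ay0, ax1, ay1 = a
        bx0, by0, bx1, by1 = b
        return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

    # Keep only slots that do not obstruct any detected object.
    clear = [s for s in candidate_slots
             if not any(overlaps(s, box) for box in object_boxes)]
    if not clear:
        return None  # no unobstructed slot; a real system might shrink or hide the view
    if gaze_point is None:
        return clear[0]
    gx, gy = gaze_point
    # Prefer the clear slot whose center is closest to where the user is looking.
    return min(clear, key=lambda s: (((s[0] + s[2]) / 2 - gx) ** 2 +
                                     ((s[1] + s[3]) / 2 - gy) ** 2))
```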
[0042] Because display 30 is integrated into window 14, user 36 may seamlessly view real-world content through window 14 while also viewing the captured camera views on window 14. If desired, camera views may be displayed on window 14 in positions that align with the real world being viewed through window 14 to provide an immersive experience in which the user can observe both the road ahead and the surrounding camera views captured by cameras 34 at a glance. For example, a right edge of a left side camera view may be aligned with a left edge of the user’s forward-facing field of view, and a left edge of a right side camera view may be aligned with a right edge of the user’s forward-facing field of view.
This allows the user to observe an integrated, wrap-around field of view of vehicle 10 and its surroundings on window 14.
[0043] Displays 30 may be configured to continuously display the live video feeds from cameras 34, or displays 30 may only display the live video feeds from cameras 34 in response to user input and/or control signals from control circuitry 26. For example, control circuitry 26 may determine when to display a camera view (and which camera view to display) on display 30 based on turn signals, vehicle speed, proximity sensor data or other sensor data, and/or other information. For example, control circuitry 26 may use display 30 to automatically display a camera view of left side region 58L in response to a left turn signal being activated, in response to sensing lateral acceleration of the vehicle towards the left side, in response to a proximity sensor in vehicle 10 detecting an object or person approaching left side L of body 12, and/or in response to other information. Control circuitry 26 may use display 30 to automatically display a camera view of right side region 58R in response to a right turn signal being activated, in response to sensing lateral acceleration of the vehicle towards the right side, in response to a proximity sensor in vehicle 10 detecting an object or person approaching right side R of body 12, and/or in response to other information. Control circuitry 26 may use display 30 to automatically display a camera view of rear region 58B in response to vehicle 10 being placed in a reverse gear, in response to a rapid deceleration of vehicle 10, in response to a proximity sensor in vehicle 10 detecting an object or person approaching back side B of body 12, and/or in response to other information.
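For illustration only, the following sketch expresses the kind of rule set described in this paragraph as a single selection function. The `state` dictionary, the signal names, and the numeric thresholds are all assumptions made for this sketch and are not taken from the specification.

```python
def select_camera_views(state):
    """Hypothetical sketch: decide which camera views (left/right/rear) to show,
    given a dict of vehicle signals."""
    views = set()
    # Left view: left turn signal, or lateral acceleration toward the left side.
    if state.get("left_turn_signal") or state.get("lateral_accel", 0.0) < -2.0:
        views.add("left")  # +/-2.0 m/s^2 thresholds are illustrative only
    # Right view: right turn signal, or lateral acceleration toward the right side.
    if state.get("right_turn_signal") or state.get("lateral_accel", 0.0) > 2.0:
        views.add("right")
    # Rear view: reverse gear, or rapid deceleration.
    if state.get("gear") == "reverse" or state.get("decel", 0.0) > 6.0:
        views.add("rear")
    # A proximity event can trigger the matching view regardless of driver input.
    for side in ("left", "right", "rear"):
        if state.get(f"proximity_{side}"):
            views.add(side)
    return views
```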
[0044] If desired, the live camera feed may be activated when it appears that the user has not appropriately checked mirrors 56 and/or has not observed other blind spot warnings. Vehicle 10 may have blind spot indicator lights that turn on whenever another vehicle is located in the user’s blind spot (e.g., region 58L or region 58R). If user 36 begins to change lanes when the blind spot indicator light is turned on, control circuitry 26 may then activate the live video feed on window 14 as a last attempt to warn the user of a vehicle or other object in the user’s blind spot (e.g., left side view 58L may be activated and displayed on window 14 in response to a user moving to the left despite the left side blind spot indicator light being turned on).
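For illustration only, a minimal sketch of this blind-spot escalation follows: the side camera view is pushed to the window only when a lane change begins while that side's blind-spot indicator is already lit. The function signature and argument shapes are assumptions for this sketch, not part of the specification.

```python
def should_escalate_to_window(blind_spot_lit, lane_change_direction):
    """Hypothetical sketch: return which side view (if any) to show on window 14
    as a last-chance warning.

    blind_spot_lit: dict like {"left": True, "right": False}
    lane_change_direction: "left", "right", or None
    """
    if lane_change_direction in ("left", "right") and blind_spot_lit.get(lane_change_direction):
        return lane_change_direction  # e.g. show the view of region 58L for a leftward move
    return None
```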
[0045] In some arrangements, side view cameras 34 may be co-located with side view mirrors such as side view mirrors 56. For example, a left side view camera 34 that captures a left side mirror view of region 58L may be co-located with a left side view mirror 56, and a
right side view camera 34 that captures a right side mirror view of region 58R may be co-located with a right side view mirror 56. Rear view camera 34 may be placed on a back side B of vehicle 10 or may be placed inside of vehicle 10 and co-located with a rear view mirror. The field of view of each side view camera 34 may be similar to the field of view of the co-located side view mirror 56, or cameras 34 may have different fields of view than mirrors 56. In other arrangements, side view mirrors 56 and/or other mirrors such as a rear view mirror inside of vehicle 10 may be eliminated and replaced by cameras 34 (e.g., vehicle 10 may be free of side view mirrors and/or a rear view mirror, if desired).
[0046] FIG. 4 is a front view of an illustrative window 14 (e.g., when viewed from the perspective of user 36 of FIG. 3) in an arrangement where multiple displays 30 in different portions of window 14 are used to display different mirror views. As shown in FIG. 4, displays 30 may be located in different portions of window 14. Display 30L may be located in a lower left corner region of window 14 and may be configured to display images of left side region 58L captured by a first one of cameras 34. Display 30R may be located in a lower right corner region of window 14 and may be configured to display images of right side region 58R captured by a second one of cameras 34. Display 30B may be located in an upper central region of window 14 and may be configured to display images of rear region 58B captured by a third one of cameras 34. Displays 30 may be located in other portions of window 14 and/or may be configured to display different camera views. The example of FIG. 4 is merely illustrative.
[0047] FIG. 5 is a front view of an illustrative window 14 (e.g., when viewed from the perspective of user 36 of FIG. 3) in an arrangement where a single display 30 spans across most (e.g., more than half) or all of window 14. In this type of arrangement, display 30 may be configured to display a live camera feed from a forward-facing camera (e.g., on front portion F of body 12) that captures images of the region in front of vehicle 10. The user may view the live camera feed of the environment in front of the vehicle instead of or in addition to looking through window 14 to view the environment directly.
[0048] Control circuitry 26 may use different portions of display 30 to display the different camera views from cameras 34. Most of display 30 may be reserved for captured images of a forward-facing field of view (where vehicle 10 is headed), while captured images of left side mirror view 58L, right side mirror view 58R, and rear mirror view 58B may be displayed on smaller portions of display 30.
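For illustration only, one possible way to carve a full-windshield display into a large forward-view region and smaller mirror-view regions is sketched below. The region names, the (x, y, width, height) convention, and the 20% size fraction are assumptions made for this sketch and are not taken from the specification.

```python
def layout_regions(display_w, display_h, mirror_fraction=0.2):
    """Hypothetical sketch: reserve most of a full-windshield display for the
    forward view and carve out small regions for the mirror views."""
    mw, mh = int(display_w * mirror_fraction), int(display_h * mirror_fraction)
    return {
        "forward": (0, 0, display_w, display_h),              # full-screen backdrop
        "left":    (0, display_h - mh, mw, mh),               # lower-left corner
        "right":   (display_w - mw, display_h - mh, mw, mh),  # lower-right corner
        "rear":    ((display_w - mw) // 2, 0, mw, mh),        # upper-center strip
    }
```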
[0049] FIG. 6 is a front view of an illustrative window 14 (e.g., when viewed from the perspective of user 36 of FIG. 3) in an arrangement where a single display 30 spans across less than half of window 14 and is used to display the one or more different mirror views captured by cameras 34. Control circuitry 26 may switch between different camera views (e.g., camera views 58L, 58R, 58B, etc.) or may display two or more of the different camera views at the same time.
[0050] As shown in FIG. 6, display 30 may display a live video feed such as live video feed 38. Live video feed 38 may include images captured by one or more of cameras 34. If desired, control circuitry 26 may overlay computer-generated display content such as computer-generated display content 40 onto live video feed 38. Computer-generated display content 40 may include symbols, text, alerts, warnings, virtual images, navigation-related information (e.g., navigation instructions, arrows, points of interest, etc.), highlighted, colored, or otherwise modified portions of real-world objects in live video feed 38, and/or other computer-generated display content.
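For illustration only, the sketch below overlays simple computer-generated content (highlight boxes and a line of navigation text) onto a live video frame. It uses OpenCV drawing calls as a stand-in for the vehicle's graphics pipeline; the function name, argument shapes, colors, and text placement are assumptions for this sketch, not part of the specification.

```python
import cv2


def overlay_content(frame, warnings=(), nav_text=None):
    """Hypothetical sketch: draw computer-generated content 40 over live video feed 38."""
    out = frame.copy()
    for (x0, y0, x1, y1) in warnings:
        # Highlight a region of interest (e.g., an approaching vehicle) with a red box.
        cv2.rectangle(out, (x0, y0), (x1, y1), (0, 0, 255), 2)
    if nav_text:
        # Draw a simple navigation instruction in the upper-left corner of the feed.
        cv2.putText(out, nav_text, (10, 30), cv2.FONT_HERSHEY_SIMPLEX,
                    1.0, (255, 255, 255), 2)
    return out
```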
[0051] If desired, lower-resolution display output such as visual indicator 44 may be located adjacent to display 30 to highlight the content that is being displayed on display 30 and/or to convey other information. For example, visual indicator 44 may be activated when control circuitry 26 switches display 30 from an informational display mode (e.g., displaying speed, mileage, heading information, etc.) to a left side mirror view of region 58L (e.g., when a left turn signal is activated). Visual indicator 44 may blink or provide other visual output when an object approaching vehicle 10 comes within the field of view of one of cameras 34. The color, frequency, and intensity of light emitted from visual indicator 44 may be adjusted in any suitable manner. These examples are merely illustrative.
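For illustration only, a minimal sketch of driving such an indicator follows: the light blinks while an object remains within a camera's field of view and turns off afterwards. The `set_led` and `object_in_view` callables and the blink period are assumptions made for this sketch and are not taken from the specification.

```python
import time


def drive_indicator(set_led, object_in_view, period_s=0.25):
    """Hypothetical sketch: blink a visual indicator 44 while an object is within a
    camera's field of view. `set_led(bool)` turns the light on or off;
    `object_in_view()` returns True while an object is detected."""
    on = False
    while object_in_view():
        on = not on
        set_led(on)
        time.sleep(period_s)
    set_led(False)  # stop blinking once the object leaves the field of view
```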
[0052] FIGS. 7, 8, and 9 show side views of different types of displays 30 that may be integrated into window 14 to display integrated mirror views on window 14. These examples are merely illustrative. If desired, other types of displays may be integrated into window 14 for displaying images captured by cameras 34 and/or for displaying other content.
[0053] In the example of FIG. 7, display 30 is a head-up display. A user’s eye 42 may look in the positive Y-direction through window 14 (e.g., through the front windshield) while operating vehicle 10.
[0054] In some arrangements, head-up display 30 may include an optical combiner such as optical combiner 52 and a light source such as projector 32 or other light source. Projector 32
may emit light 48 (e.g., for forming images) towards optical combiner 52 on region 46 of window 14. The images emitted by head-up display 30 may be viewable on region 46 of window 14. Light 48 reflects off of combiner 52 of window 14 towards eye 42, as shown by reflected light 50.
[0055] Optical combiner 52 may be used to combine image light from projector 32 and light from the real world that is passing through window 14. In this way, the user within vehicle 10 may view real-world objects through window 14 while also viewing images from projector 32 (e.g., images captured by cameras 34) on window 14. Combiner 52 may be a separate optical combiner formed on the surface of window 14, or combiner 52 may be formed as an integral part of window 14.
[0056] In other arrangements, optical combiner 52 may be omitted. With this type of configuration, projector 32 may project images onto window 14 and the images may reflect off of the window surface (e.g., a glass surface or other surface) towards eye 42.
[0057] In the example of FIG. 8, display 30 is a surface-laminated display having display layer 54 laminated to a surface of window 14. Display layer 54 may include an array of pixels such as liquid crystal pixels, polymer-dispersed liquid crystal pixels, organic light-emitting diode pixels, and/or display pixels based on other display technologies.
[0058] In the example of FIG. 9, display 30 is an embedded display that is embedded within window 14. Window 14 may include window layers such as structural window layers 14-1 and 14-2 (e.g., layers of glass or other transparent material). Adhesive layer 14-3 may be interposed between outer structural layers 14-1 and 14-2 and may attach the outer structural layers together. Display 30 may be located between outer structural layers 14-1 and 14-2. Display 30 may include an array of pixels such as liquid crystal pixels, polymer-dispersed liquid crystal pixels, organic light-emitting diode pixels, and/or display pixels based on other display technologies.
[0059] In accordance with an embodiment, a system is provided that includes a vehicle body having left and right sides, a front windshield coupled to the body, a first camera on the vehicle body that is configured to capture a left side view of a first region adjacent to the left side of the vehicle body, a second camera on the vehicle body that is configured to capture a right side view of a second region adjacent to the right side of the vehicle body, and a display on the front windshield that is configured to display the left side view captured by the first camera and the right side view captured by the second camera.
[0060] In accordance with another embodiment, the display includes a head-up display. [0061] In accordance with another embodiment, the head-up display includes a projector configured to project the left side view and the right side view onto the front windshield. [0062] In accordance with another embodiment, the display includes a surface-laminated display that is laminated to a surface of the front windshield.
[0063] In accordance with another embodiment, the front windshield includes first and second structural layers and the display includes an embedded display that is interposed between the first and second structural layers.
[0064] In accordance with another embodiment, the first and second structural layers include glass.
[0065] In accordance with another embodiment, the system includes a controller configured to display the left side view on the display in response to a left turn signal being activated and to display the right side view on the display in response to a right turn signal being activated. [0066] In accordance with another embodiment, the left side view and the right side view are stitched together on the display to form an integrated view of an environment surrounding the vehicle body.
[0067] In accordance with another embodiment, the system includes a rear view camera that captures a rear view, the display on the front windshield is configured to display the rear view.
[0068] In accordance with another embodiment, the system includes a controller that is configured to overlay computer-generated display content onto at least one of the left side view and the right side view on the display.
[0069] In accordance with an embodiment, a vehicle is provided that includes a vehicle body having an opening, a window in the opening, a camera mounted to the vehicle body, the camera is configured to capture images of an environment outside of the vehicle body, and a display integrated into the window that is configured to display a live video feed of the images captured by the camera.
[0070] In accordance with another embodiment, the display is selected from the group consisting of: a head-up display, a surface-laminated display, and an embedded display. [0071] In accordance with another embodiment, the window includes a front windshield and the images of the environment include images of a left side region to the left of the vehicle body.
[0072] In accordance with another embodiment, the vehicle includes an additional camera mounted to a right side of the vehicle body that is configured to capture additional images of a right side region to the right of the vehicle body, and an additional display integrated into the window that is configured to display an additional live video feed of the additional images captured by the additional camera.
[0073] In accordance with another embodiment, the live video feed is positioned on the front windshield based on at least one of: a gaze position and a position of an external object in front of the vehicle.
[0074] In accordance with an embodiment, a vehicle is provided that includes a vehicle body having an opening, first and second cameras mounted to opposing sides of the vehicle body, a front windshield in the opening, and a display integrated into the front windshield and configured to display a live video feed of images captured by the first and second cameras.
[0075] In accordance with another embodiment, the vehicle includes a controller configured to determine when to display the live video feed of the images on the display based on at least one of: turn signal information, user input, and sensor data.
[0076] In accordance with another embodiment, the images include a left side view and right side view and the display is configured to display the left side view in response to sensing lateral acceleration to the left and to display the right side view in response to sensing lateral acceleration to the right.
[0077] In accordance with another embodiment, the controller is configured to overlay computer-generated display content onto the live video feed.
[0078] In accordance with another embodiment, real-world objects are viewable through the front windshield from a forward-facing field of view and the live video feed is positioned on the window to align with the forward-facing field of view.
[0079] The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
Claims
1. A system comprising: a vehicle body having left and right sides; a front windshield coupled to the body; a first camera on the vehicle body that is configured to capture a left side view of a first region adjacent to the left side of the vehicle body; a second camera on the vehicle body that is configured to capture a right side view of a second region adjacent to the right side of the vehicle body; and a display on the front windshield that is configured to display the left side view captured by the first camera and the right side view captured by the second camera.
2. The system defined in claim 1 wherein the display comprises a head-up display.
3. The system defined in claim 2 wherein the head-up display comprises a projector configured to project the left side view and the right side view onto the front windshield.
4. The system defined in claim 1 wherein the display comprises a surface- laminated display that is laminated to a surface of the front windshield.
5. The system defined in claim 1 wherein the front windshield comprises first and second structural layers and wherein the display comprises an embedded display that is interposed between the first and second structural layers.
6. The system defined in claim 5 wherein the first and second structural layers comprise glass.
7. The system defined in claim 1 further comprising a controller configured to display the left side view on the display in response to a left turn signal being activated and to display the right side view on the display in response to a right turn signal being activated.
8. The system defined in claim 1 wherein the left side view and the right side view are stitched together on the display to form an integrated view of an environment surrounding the vehicle body.
9. The system defined in claim 1 further comprising a rear view camera that captures a rear view, wherein the display on the front windshield is configured to display the rear view.
10. The system defined in claim 1 further comprising a controller that is configured to overlay computer-generated display content onto at least one of the left side view and the right side view on the display.
11. A vehicle comprising: a vehicle body having an opening; a window in the opening; a camera mounted to the vehicle body, wherein the camera is configured to capture images of an environment outside of the vehicle body; and a display integrated into the window that is configured to display a live video feed of the images captured by the camera.
12. The vehicle defined in claim 11 wherein the display is selected from the group consisting of: a head-up display, a surface-laminated display, and an embedded display.
13. The vehicle defined in claim 11 wherein the window comprises a front windshield and wherein the images of the environment include images of a left side region to the left of the vehicle body.
14. The vehicle defined in claim 13 further comprising: an additional camera mounted to a right side of the vehicle body that is configured to capture additional images of a right side region to the right of the vehicle body; and
an additional display integrated into the window that is configured to display an additional live video feed of the additional images captured by the additional camera.
15. The vehicle defined in claim 11 wherein the live video feed is positioned on the front windshield based on at least one of: a gaze position and a position of an external object in front of the vehicle.
16. A vehicle, comprising: a vehicle body having an opening; first and second cameras mounted to opposing sides of the vehicle body; a front windshield in the opening; and a display integrated into the front windshield and configured to display a live video feed of images captured by the first and second cameras.
17. The vehicle defined in claim 16 further comprising a controller configured to determine when to display the live video feed of the images on the display based on at least one of: turn signal information, user input, and sensor data.
18. The vehicle defined in claim 17 wherein the images include a left side view and right side view and wherein the display is configured to display the left side view in response to sensing lateral acceleration to the left and to display the right side view in response to sensing lateral acceleration to the right.
19. The vehicle defined in claim 17 wherein the controller is configured to overlay computer-generated display content onto the live video feed.
20. The vehicle defined in claim 16 wherein real-world objects are viewable through the front windshield from a forward-facing field of view and wherein the live video feed is positioned on the window to align with the forward-facing field of view.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363480089P | 2023-01-16 | 2023-01-16 | |
| US63/480,089 | 2023-01-16 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024155407A1 (en) | 2024-07-25 |
Family
ID: 89768580
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2023/084853 (WO2024155407A1, ceased) | Displays with exterior views | 2023-01-16 | 2023-12-19 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2024155407A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5414439A (en) * | 1994-06-09 | 1995-05-09 | Delco Electronics Corporation | Head up display with night vision enhancement |
| US20050111698A1 (en) * | 2003-11-20 | 2005-05-26 | Nissan Motor Co., Ltd. | Apparatus for vehicle surroundings monitoring and method thereof |
| CN107662542A (en) * | 2016-07-27 | 2018-02-06 | 华东交通大学 | The head-up display device of road scene can be had an X-rayed under the conditions of low visibility |
| US20180065482A1 (en) * | 2016-09-06 | 2018-03-08 | Denso Korea Electronics Corporation | Hud integrated cluster system for vehicle camera |
| US20190315275A1 (en) * | 2016-11-21 | 2019-10-17 | Lg Electronics Inc. | Display device and operating method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23848045; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |