
WO2003001474A2 - Method and apparatus for detecting possible collisions and transferring information between vehicles - Google Patents


Info

Publication number
WO2003001474A2
WO2003001474A2 (PCT/US2002/020403)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
local
kinematic state
data
inter
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/US2002/020403
Other languages
French (fr)
Other versions
WO2003001474A3 (en)
Inventor
Robert Pierce Lutter
Dan Alan Preston
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Medius Inc
Original Assignee
Medius Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US09/892,293 external-priority patent/US20020140548A1/en
Priority claimed from US09/892,333 external-priority patent/US6615137B2/en
Application filed by Medius Inc filed Critical Medius Inc
Priority to AU2002349794A priority Critical patent/AU2002349794A1/en
Publication of WO2003001474A2 publication Critical patent/WO2003001474A2/en
Anticipated expiration legal-status Critical
Publication of WO2003001474A3 publication Critical patent/WO2003001474A3/en
Ceased legal-status Critical Current


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/03Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for
    • B60R16/0315Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for supply of electrical power to vehicle subsystems or for using multiplexing techniques
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/09Arrangements for giving variable traffic instructions
    • G08G1/0962Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
    • G08G1/0965Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages responding to signals from another vehicle, e.g. emergency vehicle
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/161Decentralised systems, e.g. inter-vehicle communication
    • G08G1/162Decentralised systems, e.g. inter-vehicle communication event-triggered
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/164Centralised systems, e.g. external to vehicles

Definitions

  • Vehicle collisions are often caused when a driver cannot see or is unaware of an oncoming object.
  • For example, a tree may obstruct a driver's view of oncoming traffic at an intersection. The driver has to enter the intersection with no knowledge of whether another vehicle may be entering the same intersection. After entering the intersection, it is often too late for the driver to avoid an oncoming car that has failed to properly yield.
  • Sensor data is generated for areas around a vehicle. Any objects detected in the sensor data are identified and a kinematic state for the object determined. The kinematic states for the detected objects are compared with the kinematic state of the vehicle. If it is likely that a collision will occur between the detected objects and the local vehicle, a warning is automatically generated to notify the vehicle operator of the impending collision. The sensor data and kinematic state of the vehicle can be transmitted to other vehicles so that the other vehicles are also notified of possible collision conditions. Assorted vehicle subsystems, sensors, communication devices, and other electronic devices are connected to a processing unit by using a plurality of wireless links.
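The kinematic-state comparison described in this summary can be sketched as a short routine. All names, units, and thresholds below are illustrative assumptions; the patent does not specify a particular algorithm or data format:

```python
from dataclasses import dataclass

@dataclass
class KinematicState:
    x: float   # position east (m)
    y: float   # position north (m)
    vx: float  # velocity east (m/s)
    vy: float  # velocity north (m/s)

def closing_distance(local: KinematicState, obj: KinematicState,
                     horizon: float = 3.0) -> float:
    """Smallest separation between the two tracks over the next `horizon` seconds."""
    rx, ry = obj.x - local.x, obj.y - local.y
    vx, vy = obj.vx - local.vx, obj.vy - local.vy
    v2 = vx * vx + vy * vy
    # time of closest approach, clamped to [0, horizon]
    t = 0.0 if v2 == 0 else max(0.0, min(horizon, -(rx * vx + ry * vy) / v2))
    return ((rx + vx * t) ** 2 + (ry + vy * t) ** 2) ** 0.5

def collision_warning(local: KinematicState, obj: KinematicState,
                      threshold: float = 5.0) -> bool:
    """Warn the operator when the predicted separation falls below `threshold` metres."""
    return closing_distance(local, obj) < threshold
```

The closest-approach test is one simple way to decide that "a collision is likely"; the disclosure leaves the exact criterion open.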
  • FIG. 1 is a diagram of an inter-vehicle communication system.
  • FIG. 2 is a block diagram showing how the inter-vehicle communication system of FIG. 1 operates.
  • FIG. 3 is a diagram showing how sensor data can be exchanged between different vehicles.
  • FIG. 4 is a diagram showing how Graphical User Interfaces (GUIs) are used for different vehicles that share sensor data.
  • FIG. 5 is a diagram showing how collision information can be exchanged between different vehicles.
  • FIGS. 6 and 7 are diagrams showing how kinematic state information for multiple vehicles can be used to identify road direction.
  • FIGS. 8 and 9 are diagrams showing how the inter-vehicle communication system is used to help avoid collisions.
  • FIG. 10 is a diagram showing how an emergency signal is broadcast to multiple vehicles from a police vehicle.
  • FIGS. 11 and 12 are diagrams showing how sensors are used to indicate proximity of a local vehicle to other objects.
  • FIGS. 13 and 14 show different sensor and communication envelopes that are used by the inter-vehicle communication system.
  • FIG. 15 is a block diagram showing the different data inputs and outputs that are coupled to an inter-vehicle communication processor.
  • FIG. 16 is a block diagram showing how the processor in FIG. 15 operates.
  • FIG. 17 is a block diagram illustrating a first embodiment of the present invention.
  • FIG. 18 is a block diagram illustrating a second embodiment of the present invention.
  • FIG. 19 is a block diagram of a specific instance of a vehicular wireless network according to the first embodiment of the present invention disclosed in Fig. 17.
  • FIG. 20 is a block diagram of a specific instance of a vehicular wireless network according to the second embodiment of the present invention disclosed in Fig. 18.
  • FIG. 21 is a stylized profile of an automobile illustrating the physical location of some of the components described in Fig. 20.
  • FIG. 1 shows a multi-vehicle communication system 12 that allows different vehicles to exchange kinematic state data.
  • Each vehicle 14 may include one or more sensors 18 that gather sensor information around the associated vehicle 14.
  • a transmitter/receiver (transceiver) in the vehicle 14 transmits to other vehicles kinematic state data 19 for objects detected by the sensors 18 and kinematic state data 17 for the vehicle itself.
  • a Central Processing Unit (CPU) 20 in the vehicle 14 is coupled between the sensors 18 and transceivers 16.
  • the CPUs 20 display the sensor information acquired from the local sensors 18 in the same vehicle and also displays, if appropriate, the kinematic state data 17 and 19 received from the other vehicles 14.
  • the CPU 20 for one of the vehicles may identify an object 22 that is detected by the sensor 18A.
  • the CPU 20A identifies how far the object 22 is from the vehicle 14A.
  • the CPU 20A may also generate a warning signal if the object 22 comes within a specific distance of the vehicle 14A.
  • the CPU 20A then transmits the kinematic state data for object 22 to the other vehicles 14B and 14C that are within some range of vehicle 14A.
  • the CPU 20B from vehicle 14B establishes communication with the transmitting vehicle 14A in box 24.
  • a navigation grid is established in box 26 that determines where the vehicle 14A is in relationship to vehicle 14B.
  • vehicle 14A sending its kinematic state data 17 such as location, speed, acceleration, and direction to vehicle 14B.
  • the vehicle 14B receives the kinematic state data for object 22 from vehicle 14A in box 28.
  • the CPU 20B determines the position of object 22 relative to vehicle 14B.
  • the CPU 20B displays the object on a digital map in vehicle 14B in box 32.
  • vehicle 14B receives the position of vehicle 14A and the information regarding object 22 through an intermediary vehicle 14C.
  • the transceiver 16A in vehicle 14A transmits the kinematic state of vehicle 14A and the information regarding object 22 to vehicle 14C.
  • the transceiver 16C in vehicle 14C then relays its own kinematic state data along with the kinematic state data of vehicle 14A and object 22 to vehicle 14B.
  • the CPU 20B determines, from the kinematic state of vehicle 14A and the kinematic state of object 22, the position of object 22 in relation to vehicle 14B. If the position of object 22 is within some range of vehicle 14B, the object 22 is displayed on a Graphical User Interface (GUI) inside of vehicle 14B (not shown).
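The range test applied to a relayed object could look like the following sketch; the flat-grid coordinate frame, helper names, and 150-metre display range are assumptions for illustration only:

```python
import math

def relative_range(local_pos, object_pos):
    """Straight-line distance from the local vehicle to an object
    reported by another vehicle (positions in a shared flat-grid
    navigation frame, in metres)."""
    dx = object_pos[0] - local_pos[0]
    dy = object_pos[1] - local_pos[1]
    return math.hypot(dx, dy)

def should_display(local_pos, object_pos, display_range_m=150.0):
    """Show a relayed object on the local GUI only when it is in range."""
    return relative_range(local_pos, object_pos) <= display_range_m
```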
  • FIG. 3 shows an example of how the Inter-vehicle communication system 12 shown in FIG. 1 can be used to identify different objects that may not be detectable from a local vehicle. There are five vehicles shown in FIG. 3. Vehicle D is in an intersection 40.
  • a vehicle A is heading into the intersection 40 from the east and another vehicle B is heading into the intersection 40 coming from the west.
  • Vehicle E or vehicle F may not be able to see either vehicle A or vehicle B.
  • a building 44 obstructs easterly views by vehicles E and F and a tree 46 obstructs westerly views by vehicles E and F.
  • Vehicle A or vehicle B may be entering the intersection 40 at a speed and distance that make a collision with vehicle E or vehicle F likely. Vehicle E or vehicle F could avoid the potential collision if notified in sufficient time. However, the tree 46 and building 44 prevent vehicles E and F from seeing either vehicle A or vehicle B until they have already entered the intersection 40.
  • Vehicle D includes multiple sensors 42 that sense objects in front, such as vehicle C, in the rear, such as vehicle E, or on the sides, such as vehicles A and B.
  • a processor in vehicle D (not shown) processes the sensor data and identifies the speed, direction and position of vehicles A and B.
  • a transceiver 48 in vehicle D transmits the data identifying vehicles A and B to vehicle E.
  • a transceiver 48 in vehicle E then relays the sensor data to vehicle F.
  • both vehicles E and F are notified about oncoming vehicles A and B even when vehicles A and B cannot be seen visually by the operators of vehicles E and F or detected electronically by sensors on vehicles E and F.
  • the sensing ranges for vehicles E and F are extended by receiving the sensing information from vehicle D.
  • FIG. 4 shows three different screens 50, 52, and 54 that are displayed by vehicles D, E, and F, respectively.
  • Each of screens 50, 52, and 54 are Graphical User Interfaces or other display systems that display sensor data and vehicle information from one or more different vehicles.
  • vehicle D shows different motion vectors that represent objects detected by sensors 42 (FIG. 3).
  • a motion vector 56 shows vehicle B approaching from the west
  • a motion vector 58 shows vehicle C moving in front of vehicle D in a northern direction
  • a motion vector 60 shows vehicle A approaching from the east
  • a motion vector 62 shows vehicle E approaching the back of vehicle D from a southern direction.
  • Screen 52 shows objects displayed by the GUI in vehicle E.
  • Motion vector 64 shows vehicle D moving in front of vehicle E and motion vectors 60 and 56 show vehicles A and B coming toward vehicle D from the east and the west, respectively. Even if vehicles A and B cannot be detected by sensors in vehicle E, the vehicles are detected by sensors in vehicle D and that data is then transmitted to vehicle E.
  • Screen 54 shows the motion vectors displayed to an operator of vehicle F.
  • the motion vectors 64 and 66 show vehicles D and E traveling north in front of vehicle F.
  • the vehicles A and B are shown approaching vehicle D from the east and west, respectively.
  • the inter-vehicle communication system allows vehicles to effectively see around corners and other obstructions by sharing sensor information between different vehicles. This allows any of the vehicles to anticipate and avoid potential accidents.
  • the operator of vehicle E can see by the displayed motion vector 60 that vehicle A is traveling at 40 MPH. This provides the operator of vehicle E a warning that vehicle A may not be stopping at intersection 40 (FIG. 3). Even if vehicle E has the right of way, vehicle E can avoid a collision by slowing down or stopping while vehicle A passes through intersection 40.
  • the motion vector 56 for vehicle B indicates deceleration and a current velocity of only 5 MPH. Deceleration may be indicated by a shorter motion vector 56 or by an alphanumeric display around the motion vector 56.
  • the motion vector 56 indicates that vehicle B is slowing down or stopping at intersection 40. Thus, if vehicle B were the only other vehicle entering intersection 40, the operator of vehicle E is more confident about entering intersection 40 without colliding into another vehicle.
  • vehicle F may not be close enough to intersection 40 to worry about colliding with vehicle A.
  • screen 54 shows that vehicle D may be on a collision track with vehicle A. If vehicle E were following too close to vehicle D, then vehicle E could possibly run into the pileup that may occur between vehicle D and vehicle A.
  • the operator of vehicle F seeing the possible collision between vehicles D and A in screen 54 can anticipate and avoid the accident by slowing down or stopping before entering the intersection 40.
  • the operator of vehicle F may also try to prevent the collision by honking a horn.
  • FIG. 5 shows another example of how sensor data and other vehicle kinematic state data can be transmitted between different vehicles.
  • Vehicles 70, 72, and 74 are all involved in an accident. At least one of the vehicles, in this case vehicle 70, broadcasts a collision indication message 76.
  • the accident indication message 76 can be triggered by any one of multiple detected events. For example, the collision indication message 76 may be generated whenever an airbag is deployed in vehicle 70. Alternatively, sensors 78 in the vehicle 70 detect the collision. The detected collision causes a processor in vehicle 70 to broadcast the collision indication message 76.
  • the collision indication message 76 is received by a vehicle 80 that is traveling in the opposite traffic lane.
  • the vehicle 80 includes a transceiver 81 that in this example relays the collision indication message 76 to another vehicle 84 that is traveling in the same direction.
  • Vehicle 84 relays the message to other vehicles 82 and 86 that are traveling toward the collision.
  • Processors 83 and 87 in the vehicles 82 and 86 receive the collision indication message 76 and generate a warning message that may either be annunciated or displayed to drivers of vehicles 82 and 86.
  • the collision indication message 76 is received by vehicle 82 directly from vehicle 70.
  • the processor 83 in vehicle 82 generates a warning indication and also relays the collision indication message 76 to vehicle 86.
  • the collision indication message 76 and other sensor data and messages can be relayed by any vehicle traveling in any direction.
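When any vehicle can relay a message in any direction, each vehicle needs to avoid rebroadcasting the same message forever. One minimal way to do that is to track message identifiers; the dict-based message format and names here are illustrative, since the patent does not define a message structure:

```python
def relay(message, seen_ids, broadcast):
    """Rebroadcast a collision indication message at most once per vehicle.

    `message` is a dict carrying a unique 'id'; `broadcast` is whatever
    transmit callback the vehicle's transceiver provides. Both are
    assumptions for this sketch.
    """
    if message["id"] in seen_ids:
        return False          # already forwarded: drop to avoid relay loops
    seen_ids.add(message["id"])
    broadcast(message)        # relay to vehicles within radio range
    return True
```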
  • FIGS. 6 and 7 show an example of how the inter-vehicle communication system can be utilized to identify road direction.
  • FIG. 6 shows three vehicles A, B, and C traveling along the same stretch of highway 88.
  • Each vehicle includes a Global Positioning System (GPS) that periodically identifies a current longitude and latitude.
  • GPS Global Positioning System
  • Each vehicle A, B, and C generates kinematic state data 92 that includes position, velocity, acceleration or deceleration, and/or direction.
  • the kinematic state data 92 for each vehicle A, B, and C is broadcast to the other vehicles in the same vicinity.
  • the vehicles A, B, and C receive the kinematic state data from the other vehicles and display the information to the vehicle driver.
  • FIG. 7 shows a GUI 94 in vehicle A (FIG. 6).
  • the GUI 94 shows any combination of the position, driving direction, speed, distance, and acceleration for the other vehicles B and C.
  • Vectors 96 and 98 can visually represent this kinematic state data.
  • the position of vector 98 represents the longitude and latitude of vehicle B and the direction of vector 98 represents the direction that vehicle B is traveling.
  • the length of vector 98 represents the current speed and acceleration of vehicle B. Displaying the kinematic state of other vehicles B and C allows the driver of vehicle A to anticipate curves and other turns in highway 88 (FIG. 6) regardless of the weather conditions.
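The mapping from kinematic state to a GUI vector could be sketched as follows, with the tail at the reported position and the arrow length growing with speed. The heading convention and scale factor are illustrative assumptions:

```python
import math

def motion_vector(x, y, speed, heading_deg, scale=1.0):
    """Map a vehicle's kinematic state to a GUI arrow: the tail sits at
    the reported position, the arrow points along the heading, and the
    length is proportional to speed (so a longer arrow reads as a
    faster vehicle). Heading: 0 degrees = north, clockwise."""
    rad = math.radians(heading_deg)
    tip_x = x + scale * speed * math.sin(rad)
    tip_y = y + scale * speed * math.cos(rad)
    return (x, y), (tip_x, tip_y)
```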
  • the kinematic state data 92 for the vehicles A, B and C does not have to always be relayed by other vehicles.
  • the kinematic state data 92 can be relayed by a repeater located on a stationary tower 90. This may be desirable for roads with little traffic where there are generally long distances between vehicles on the same highway 88.
  • the transmitters 91 may also send along with the location data 93 some indication that the data is being transmitted from a stationary reference post.
  • the transmitters 91 can also include temperature sensors that detect different road conditions, such as ice. An ice warning is then generated along with the location data.
  • the processors in the vehicles A, B and C then display the transmitters 91 as nonmoving objects 100 along with any road condition information in the GUI 94.
  • FIGS. 8 and 9 show in more detail how collision information is exchanged and used by different vehicles.
  • vehicle A has collided with a tree 102. Upon impact with tree 102, the vehicle A deploys one or more airbags.
  • a processor 104 in vehicle A detects the airbag deployment and automatically sends out an air bag deployment message 106 over a cellular telephone network to an emergency vehicle service such as AAA.
  • the processor 104 broadcasts the kinematic state data 108 of vehicle A.
  • the kinematic state data 108 indicates a rapid deceleration of vehicle A.
  • the processor 104 may send a warning indication.
  • Another vehicle B receives GPS location data 112 from one or more GPS satellites 110.
  • Onboard sensor data 114 is also monitored by processor 116 to determine the speed, direction, etc. of vehicle B.
  • the onboard sensor data 114 may also include data from one or more sensors that are detecting objects within the vicinity of vehicle B.
  • the processor 116 in vehicle B determines a current location of vehicle B based on the GPS data 112 and the onboard sensor data 114. The processor 116 then determines if a danger condition exists by comparing the kinematic state of vehicle A with the kinematic state of vehicle B. For example, if vehicle A is within 50 feet of vehicle B, and vehicle B is traveling at 60 MPH, then processor 116 may determine that vehicle B is in danger of colliding with vehicle A. In this situation, a warning signal may be generated by processor 116. Alternatively, if vehicle A is 100 feet in front of vehicle B, and vehicle B is only traveling at 5 MPH, processor 116 may determine that no danger condition currently exists for vehicle B and no warning signal is generated.
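The two worked examples above (50 feet at 60 MPH is dangerous; 100 feet at 5 MPH is not) are consistent with a simple reaction-window rule; the 2-second window below is an assumed figure chosen only to reproduce both examples, not a value from the patent:

```python
def danger_condition(distance_ft, speed_mph, reaction_window_s=2.0):
    """Flag danger when the local vehicle would cover the gap to the
    hazard within the reaction window (assumed threshold)."""
    speed_fps = speed_mph * 5280 / 3600   # mph -> feet per second
    return distance_ft < speed_fps * reaction_window_s
```

At 60 MPH the vehicle covers 88 feet per second, so a 50-foot gap closes well inside the window; at 5 MPH a 100-foot gap does not.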
  • FIG. 9 shows one example of how a GUI 105 in vehicle B displays information received from vehicle A and from local sensors.
  • the processor 116 displays vehicle A directly in front of vehicle B. Either from sensor data transmitted from vehicle A or from local sensors, the processor 116 generates a motion vector 113 that identifies another vehicle C approaching from the left.
  • the local sensors in vehicle B also detect another object 107 off to the left of vehicle B.
  • the processor 116 receives all of this sensor data and generates a steering cue 109 that indicates the best path for avoiding vehicle A, vehicle C, and object 107. In this example, it is determined that vehicle B should move in a northeasterly direction to avoid colliding with all of the detected objects.
  • the processor 116 can also calculate a time to impact 111 with the closest detected object by comparing the kinematic state of the vehicle B with the kinematic states of the detected objects.
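A time-to-impact figure of this kind can be derived from the relative position and relative velocity of the closest object, assuming both parties hold their current kinematic state. The coordinate conventions below are illustrative:

```python
def time_to_impact(rel_pos, rel_vel):
    """Seconds until the object reaches the local vehicle, assuming
    constant kinematic states; returns None when the range is opening.
    `rel_pos`/`rel_vel` are (x, y) tuples in metres and m/s."""
    rx, ry = rel_pos
    vx, vy = rel_vel
    rng = (rx ** 2 + ry ** 2) ** 0.5
    # component of relative velocity along the line of sight (closing > 0)
    closing_speed = -(rx * vx + ry * vy) / max(rng, 1e-9)
    if closing_speed <= 0:
        return None  # object is not closing on the local vehicle
    return rng / closing_speed
```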
  • FIG. 10 shows another example of how vehicle information may be exchanged between different vehicles.
  • a police vehicle 120 is in pursuit of a chase vehicle 126.
  • police vehicle 120 may be entering an intersection 128.
  • the police vehicle 120 broadcasts an emergency warning signal 124.
  • the emergency warning signal 124 notifies all of the vehicles 122 that an emergency vehicle 120 is nearby and that the vehicles 122 should slow down or stop.
  • Processors 130 in the vehicles 122 can generate an audible signal to the vehicle operator, display a warning icon on a GUI, and/or show the location of police vehicle 120 on the GUI.
  • the processor 130 in each vehicle 122 receives the kinematic state of police vehicle 120 and determines a relative position of the local vehicle 122 in relation to the police vehicle 120. If the police vehicle 120 is within a particular range, the processor 130 generates a warning signal and may also automatically slow or stop the vehicle 122.
  • the police vehicle 120 sends a disable signal 132 to a processor (not shown) in the chase vehicle 126.
  • the disable signal 132 causes the processor in chase vehicle 126 to automatically slow down the chase vehicle 126 and then eventually stop the chase vehicle 126.
  • FIGS. 11 and 12 show another application for the sensors 136 that are located around vehicle A. Vehicles A and B are parked in parking slots 138 and 140, respectively. Vehicle A has pulled out of parking slot 138 and is attempting to negotiate around vehicle B. The operator of vehicle A cannot see how far vehicle A is from vehicle B.
  • the sensors 136 detect objects that come within a certain distance of vehicle A.
  • sensors 136 may be activated only when the vehicle A is traveling below a certain speed, or may be activated at any speed, or may be manually activated by the vehicle operator. In any case, the sensors 136 detect vehicle B and display vehicle B on a GUI 144 shown in FIG. 12. The processor in vehicle A may also determine the closest distance between vehicle A and vehicle B and also identify the distance to impact and the particular area of impact 145 on vehicle A.
  • FIG. 13 shows an example of sensor and communication envelopes that are generated by sensors and transceivers in vehicle A.
  • a first local sensor envelope 150 is created around the vehicle A by multiple local sensors 158.
  • the sensor data from the local sensor envelope 150 is used by a processor to detect objects located anywhere around vehicle A.
  • Transceivers 156 are used to generate communication envelopes 152.
  • the transceivers 156 allow communications between vehicles that are located generally in front of and in back of vehicle A. However, it should be understood that any variety of communication and sensor envelopes can be generated by transceivers and sensors in vehicle A.
  • FIG. 14 shows another example of different sensor envelopes that can be generated around vehicle A.
  • a first type of sensor such as an infrared sensor, may be located around vehicle A to generate close proximity sensor envelopes 160 and 162.
  • a second type of sensor and antenna configuration such as radar antennas, may be used to generate larger sensor envelopes 164, 166, and 168.
  • the local sensor envelopes 160 and 162 may be used to detect objects in close proximity to vehicle A, such as parked cars, pedestrians, etc.
  • the larger radar envelopes 164, 166 and 168 may be used for detecting objects that are further away from vehicle A.
  • envelopes 164, 166, and 168 may be used for detecting other vehicles that are longer distances from vehicle A.
  • the different sensor envelopes may dynamically change according to how fast the vehicle A is moving. For example, envelope 164 may be used when vehicle A is moving at a relatively low speed. When vehicle A accelerates to a higher speed, object detection will be needed for longer distances. Thus, the sensors may dynamically change to larger sensor envelopes 166 and 168 when vehicle A is moving at higher speeds. Any combination of local sensor envelopes 160 and 162 and larger envelopes 164, 166, and 168 may be used.
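Speed-dependent envelope switching could be as simple as a tiered lookup. The three tiers below stand in for envelopes 164, 166, and 168, but the speed breakpoints and radii are purely illustrative assumptions:

```python
def select_envelope(speed_mph):
    """Pick a sensor envelope radius (metres) from the vehicle's speed:
    faster travel needs object detection at longer range."""
    if speed_mph < 25:
        return 50      # short-range envelope for low-speed driving
    if speed_mph < 55:
        return 120     # mid-range envelope
    return 250         # long-range envelope for highway speeds
```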
  • FIG. 15 is a detailed diagram of the components in one of the vehicles used for gathering local sensor data and receiving external sensor data from other vehicles.
  • a processor 170 receives sensor data from one or more local object detection sensors 172.
  • the sensors may be infrared sensors, radar sensors, or any other type of sensing device that can detect objects.
  • Communication transceivers 174 exchange sensor data, kinematic state data, and other notification messages with other vehicles. Any wireless communication device can be used for communicating information between the different vehicles including microwave, cellular, Citizen Band, two-way radio, etc.
  • a GPS receiver 176 periodically reads location data from GPS satellites.
  • Vehicle sensors 178 include any of the sensors or monitoring devices in the vehicle that detect vehicle direction, speed, temperature, collision conditions, braking state, airbag deployment, etc.
  • Operator inputs 180 include any monitoring or selection parameter that may be input by the vehicle operator. For example, the operator may wish to view all objects within a 100 foot radius. In another situation, the operator may wish to view all objects within a one mile radius.
  • the processor displays the objects within the range selected by the operator on GUI 182.
  • the speed of the vehicle identified by vehicle sensors 178 may determine what data from sensors 172 or from transceivers 174 is used to display on the GUI 182. For example, at higher speeds, the processor may want to display objects that are further distances from the local vehicle.
  • the processor receives sensor data from sensors on the local vehicle.
  • the processor performs image recognition algorithms on the sensor data in block 192. If an object is detected in block 194, kinematic state data for the object is determined in block 200. If the detected object is within a specified range in block 196, then the object is displayed on the GUI in block 198. For example, the current display range for the vehicle may only be for objects detected within 200 feet. If the detected object is outside of 200 feet, it will not be displayed on the GUI.
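The range gate of blocks 196 and 198 amounts to a simple filter before drawing. The (object_id, range) detection format here is an assumption for illustration:

```python
def objects_to_display(detections, display_range_ft=200.0):
    """Keep only detections inside the current display range before
    they are drawn on the GUI. Each detection is an
    (object_id, range_ft) pair; objects beyond the range are dropped."""
    return [obj_id for obj_id, rng in detections if rng <= display_range_ft]
```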
  • the processor receives kinematic state data for other vehicles and object detection data from the other vehicles in block 202. Voice data from the other vehicles can also be transmitted along with the kinematic state data. In a similar manner as blocks 196 and 198, if any object detected by another vehicle is within a current display range in block 206, then the other object is displayed on the GUI in block 208. At the same time, the processor determines the current kinematic state of its own local vehicle in block 205.
  • the processor in block 210 compares the kinematic state information of the local vehicle with all of the other objects and vehicles that are detected. If a collision condition is imminent based on the comparison, then the processor generates a collision warning in block 212.
  • a collision condition is determined in one example by comparing the current kinematic state of the local vehicle with the kinematic state of the detected objects. If the velocity vector (current speed and direction) of the local vehicle is about to intersect with the velocity vector for another detected object, then a collision condition is indicated and a warning signal generated. Collision conditions are determined by analyzing the bearing rate of change of the detected object with respect to the local vehicle.
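The bearing-rate criterion is the classic constant-bearing, decreasing-range test: if successive sightings of an object keep nearly the same bearing while the range shrinks, the two tracks intersect. The sketch below assumes a time-ordered list of (x, y) offsets from the local vehicle and an illustrative bearing tolerance:

```python
import math

def on_collision_course(rel_positions, bearing_tol_deg=2.0):
    """Constant-bearing, decreasing-range check over a sequence of
    relative sightings. Returns True only when the bearing is steady
    (within tolerance) AND the range shrinks at every step."""
    bearings = [math.degrees(math.atan2(x, y)) for x, y in rel_positions]
    ranges = [math.hypot(x, y) for x, y in rel_positions]
    bearing_steady = all(abs(b - bearings[0]) < bearing_tol_deg for b in bearings)
    range_closing = all(r2 < r1 for r1, r2 in zip(ranges, ranges[1:]))
    return bearing_steady and range_closing
```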
  • the processor identifies a possible collision condition.
  • a first warning signal is generated.
  • a second collision signal is generated.
  • Fig. 17 depicts a plurality of sensor devices 1011 located aboard an automobile. Though each of the sensors 1011 is annotated with the same number, this is merely to indicate that the sensors perform similar functions, and not to suggest that each of the sensor devices 1011 is exactly the same. Rather, each of the devices 1011 could be an IR sensor, a radar sensor, or another variety of sensor placed to monitor any condition within the automobile or exterior to the automobile that may be of use when implementing collision avoidance, situational awareness, navigation, or system diagnostic functions.
  • Each of the sensor devices 1011 is linked to a processing unit 1041 located within the automobile by a plurality of wireless links 1051.
  • the wireless links 1051 are uni-directional in nature, because the sensor devices 1011 typically transmit only raw data to processing unit 1041.
  • Though the devices 1021 are likewise annotated with the same number, the similar numbering system merely indicates that they are a class of devices that both transmit and receive data from the processing unit, and does not imply that each of the devices 1021 is exactly the same.
  • the class of devices 1021 might include a security system, an environmental control system, a number of audio and video entertainment devices, a cellular phone, a GPS receiver and antenna, or personal digital assistants (PDA).
  • the devices 1021 will be located within the automobile. However, some of the devices may be located outside the automobile, as in the case of a cellular phone or PDA.
  • a GUI 1031 is located in the automobile and linked to the processing unit 1041 by a bi-directional wireless or hardwired link 1061.
  • the GUI is the means by which the driver of the automobile can input commands to control a variety of the devices 1021.
  • the driver also receives system status data at GUI 1031 from the processing unit 1041.
  • GUI 1031 may take several forms, including a touch-screen display or a heads-up display similar to those typically found in military aircraft.
  • Processing unit 1041 may transmit data directly to GUI 1031 from sensor devices 1011 or may first perform a sensor fusion operation when multiple sensors are monitoring the same condition.
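The patent does not specify what the "sensor fusion operation" looks like when multiple sensors monitor the same condition; an inverse-variance weighted average is one simple form it could take. The (value, variance) reading format below is an assumption for this sketch:

```python
def fuse(readings):
    """Inverse-variance weighted average of several sensor readings of
    the same condition: more precise sensors (lower variance) get
    proportionally more weight. Each reading is a (value, variance) pair."""
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    return sum(v * w for (v, _), w in zip(readings, weights)) / total
```

With equal variances this reduces to a plain average; a noisier sensor is simply discounted.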
  • Processing unit 1041 may also transmit data received from one or more of the devices 1021 to GUI 1031.
  • the uni-directional wireless links 1051 and the bi-directional wireless links 1061 may be one of several types, depending upon the specific sensor or system that is wirelessly linked to the processing unit.
  • one of the sensor devices 1011 might require an IEEE 802.11 protocol, while one of the devices 1021 utilizes a Motorola Bluetooth link.
  • the processing unit 1041 has the capability of interfacing with sensor devices 1021 or devices 1031 using an analog cellular link, a Cellular Digital Packet Data (CDPD) link, a Satcom link, or a hardwired link.
  • CDPD Cellular Digital Packet Data
  • the number of sensor devices 1011 and 1021 or the pattern in which they are depicted in Fig. 17 should not be considered a limitation.
  • the number of devices 1021 and 1031 and the physical location of 1011, 1021, 1031, and 1041 within the automobile will vary depending on the specific design.
  • Fig. 18 is an illustration of a second embodiment of the present invention. Like the first embodiment depicted in Fig. 17, there are a plurality of sensor devices 1011, a plurality of devices 1021, a GUI 1031, and a plurality of one-directional wireless links 1051 and bi-directional links 1061.
  • the dashed lines divide the interior of an automobile into separate zones, with the engine, passenger, and trunk compartments represented by zone 2021, 2041, and 2061, respectively.
  • Zone 2081 represents the area outside the automobile.
  • the number of devices and wireless links located in each zone is arbitrary, there may be more or less depending on the specific design.
  • Each of the devices 1011, 1021, and 1031 is wirelessly linked with a signal interface unit 2031 that is located in the same zone.
  • the signal interface units 2031 are coupled to a bus 2051 that is installed to run throughout all zones of the automobile.
  • the processing unit 1041 is also coupled to the bus 2051. Once signals are received by the signal interface units 2041, they may be placed on bus 2051 and transmitted to the processing unit 1041. Similarly, signals are transmitted from processing unit 1041 to devices 1021 and GUI 1031 via the bus/signal interface route.
  • Device 1021 is located outside of the automobile in zone 2081 to indicate that there may be devices such as PDAs or cellular phones that receive or transmit data to the processing unit 1041 via a bi-directional wireless link.
  • This zone bus structure takes advantage of the natural shielding offered by the different structural compartments of an automobile.
  • Each zone contains a single signal interface unit that serves as the point where wireless signals are received and transmitted in each zone.
  • the number of zones may vary depending on the type of automobile that the invention is installed in. For example, a sport-utility vehicle would require only two signal interface units 2031 because it effectively has only two zones, the engine and passenger/cargo compartment.
  • the processing unit 1041 is shown located in zone 2041, but it might be moved to any zone depending on the space requirements of specific designs.
  • processing unit 1041 Both embodiments of the present invention described above will facilitate detection of people within the automobile, and based upon detection various functions may be implemented by processing unit 1041. For example, if a subset of the sensor devices 1011 happened to be IR sensors installed in the passenger compartment of an automobile, the sensors can indicate when a person is within the vehicle. Based upon this occupancy data, the processing unit could operate the lighting system more efficiently by turning off the dome light when the vehicle is parked and the last occupant leaves the vehicle, rather than the usual automatic shut off. As another example, typically keys must be in the ignition to operate the car radio and environmental controls. These systems could be enabled merely by a person's presence in the vehicle. The invention could also prevent airbags from being deployed in an accident for passenger seats where no passenger is sitting. An alarm system could be configured to disable the ignition when an unauthorized occupant is detected or to call 911 with the current location of the vehicle taken from the GPS system.
  • Fig. 19 is a specific instance of a vehicular wireless network according to the first embodiment of the present invention disclosed in Fig. 17.
  • the dashed lines in Fig. 19 indicate an engine compartment region 300, a passenger compartment region 310, a trunk compartment region 320, and a region 330 that represents the area external to the automobile.
  • Engine compartment 300 contains two IR sensors 302 that face forward to pick up heat signatures emanating from other automobiles.
  • Sensor 304 is a RF transmitter, receiver, and antenna that detects other automobiles.
  • Sensor 306 is a thermal sensor to monitor engine temperature.
  • Each of the sensors 302, 304, and 306 wirelessly transmits data to the processing unit 318 located in the passenger compartment 310 of the automobile with an IEEE 802.11 wireless link 340.
  • Passenger compartment 310 contains a touch screen display 312 which allows the driver to see the status of various vehicle subsystems along with providing a means to input commands.
  • Car audio components 314 are also located within the passenger compartment. Touch-screen display 312 and car audio components 314 are linked to the processing unit 318 by bi-directional wireless Bluetooth links 350.
  • two IR sensors 316 are installed to monitor the occupancy state of the automobile. The two sensors 316 are linked to processing unit 318 by wireless IEEE 802.11 links 340.
  • Trunk compartment contains GPS receiver and antenna 322 and multiband cellular receiver/transmitter/antenna 324.
  • the GPS subsystem 322 and cellular subsystem 324 are linked to processing unit 318 in the passenger compartment via bidirectional wireless Bluetooth links 350.
  • a mobile PDA unit 332 is located outside of the automobile in region 330, transmitting data to and receiving data from processing unit 318 via bi-directional Bluetooth link 350.
  • Fig. 20 is a specific instance of a vehicular wireless network according to the second embodiment of the present invention disclosed in Fig. 18. The dashed lines in Fig. 20 indicate an engine compartment region 400, a passenger compartment region 410, a trunk compartment region 420, and a region 430 that represents the area external to the automobile.
  • Engine compartment 400 contains two IR sensors 402 that face forward to pick up heat signatures emanating from other automobiles.
  • Sensor 404 is a RF transmitter, receiver, and antenna that detects other automobiles.
  • Sensor 406 is a thermal sensor to monitor engine temperature.
  • Each of the sensors 402, 404, and 406 wirelessly transmits data to the signal interface unit 440 located in the engine compartment 400 with IEEE 802.11 wireless links 460.
  • Passenger compartment 410 contains a touch screen display 412 which allows the driver to see the status of various vehicle subsystems along with providing a means to input commands.
  • Car audio components 414 are also located within the passenger compartment.
  • Touch-screen display 412 and car audio components 414 are linked to a second signal interface unit 440 by bi-directional wireless Bluetooth links 470.
  • two IR sensors 416 are installed to monitor the occupancy state of the automobile. The two sensors 416 are linked to the second signal interface unit 440 by wireless IEEE 802.1 1 links 460.
  • Trunk compartment 420 contains GPS receiver and antenna 424 and multiband cellular receiver/transmitter/antenna 426.
  • the GPS subsystem 424 and cellular subsystem 426 are linked to a third signal interface unit 440 located in the trunk compartment via bi-directional wireless Bluetooth links 470.
  • a mobile PDA unit 432 is located outside of the automobile in region 430, transmitting data to and receiving data from the second signal interface unit 440 via bi-directional Bluetooth link 470.
  • the mobile PDA unit 432 can link to any of the signal interface units 440 within the automobile, it is merely shown connected to the second unit in the passenger compartment by way of example.
  • Each of the signal interface units 440 is coupled to a fiber-optic bus 450 installed to extend into all zones 400, 410, and 420 of the automobile.
  • the processing unit 422 is also located in the truck compartment 420 and is coupled to fiber-optic bus 450. However, processing unit 422 could be coupled to the fiber-optic bus at any location in any region 400, 410, or 420 depending on space requirements.
  • Fig. 21 is a stylized profile of an automobile illustrating the physical location of some of the components described in Fig. 20.
  • three zones 500, 510, and 520 represent the engine compartment, passenger compartment, and trunk compartment, respectively, of the automobile.
  • the signal interface units 540 are installed underneath the hood in the engine compartment 500, underneath the dome in the passenger compartment 510, and underneath the trunk lid in the trunk compartment 520.
  • the signal interface unit 540 in the passenger compartment 510 may even share a physical location with the dome light of the automobile.
  • the fiberoptic bus 550 runs from the engine compartment 500 to the trunk compartment 520 and the signal interface units 540 are coupled to it.
  • the processing unit 522 is installed on the floor of the trunk section 520 and is also coupled to the fiber-optic bus 550.
  • the sensor devices and other automobile system devices that are linked to the signal interfaces by wireless connections are not shown, but their physical locations would be optimized in the various zones of the automobile depending upon their functionality and purpose.
  • the system described above can use dedicated processor systems, micro controllers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software and other operations may be implemented in hardware.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

Sensor data is generated for areas around a vehicle. Any objects detected in the sensor data are identified and a kinematic state for the object determined. The kinematic states for the detected objects are compared with the kinematic state of the vehicle. If it is likely that a collision will occur between the detected objects and the local vehicle, a warning is automatically generated to notify the vehicle operator of the impending collision. The sensor data and kinematic state of the vehicle can be transmitted to other vehicles so that the other vehicles are also notified of possible collision conditions.

Description

METHOD AND APPARATUS FOR TRANSFERRING INFORMATION BETWEEN VEHICLES
BACKGROUND
Vehicle collisions are often caused when a driver can not see or is unaware of an oncoming object. For example, a tree may obstruct a drivers view of oncoming traffic at an intersection. The driver has to enter the intersection with no knowledge whether another vehicle may be entering the same intersection. After entering the intersection, it is often too late for the driver to avoid an oncoming car that has failed to properly yield.
There are other situations where a vehicle is at risk of a collision. For example, a pileup may occur on a busy freeway. A vehicle traveling at 60 miles per hour, or faster, may come upon the pileup with only have a few seconds to react. These few seconds are often too short an amount of time to avoid crashing into the other vehicles. Because the driver is suddenly forced to slam on the brakes, other vehicles in back of the driver's vehicle may possibly crash into the rear end of the driver's vehicle.
It is sometimes difficult to see curves in roads. For example, at night or in rainy, snowy or foggy weather it can be difficult to see when a road curves to the left of right. The driver may then focus on the lines in the road or on the lights of a car traveling up ahead. These driving practices are dangerous, since sudden turns, or other obstructions in the road, may not be seen by the driver. In the aircraft industry, there is a well-known phenomena where aircraft fresh off the production line typically incorporate avionics systems that were state of the art a decade ago or more. This gap exists because designs must be finalized well in advance of the production stage, as a practical matter there must be some point where a transition is made from a "paper" airplane to a functional one. It is no coincidence that roughly 70% of the lifecycle cost of a military aircraft system is composed of maintenance and subsequent upgrades necessary to keep the system viable.
The same effect is seen in the automobile industry to a certain degree. The electronics begin to age as soon as the design is finalized, but since wiring harnesses cannot easily be modified after installation, after-market modifications are prohibitively expensive. Even simple GPS navigation systems exist only in top-of- the-line automobiles, and if other options or safety features are later made available the consumer must typically purchase a brand new vehicle to enjoy those benefits. The present invention addresses this and other problems associated with the prior art.
SUMMARY OF THE INVENTION Sensor data is generated for areas around a vehicle. Any objects detected in the sensor data are identified and a kinematic state for the object determined. The kinematic states for the detected objects are compared with the kinematic state of the vehicle. If it is likely that a collision will occur between the detected objects and the local vehicle, a warning is automatically generated to notify the vehicle operator of the impending collision. The sensor data and kinematic state of the vehicle can be transmitted to other vehicles so that the other vehicles are also notified of possible collision conditions. Assorted vehicle subsystems, sensors, communication devices, and other electronic devices are connected to a processing unit by using a plurality of wireless links.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a diagram of an inter-vehicle communication system. FIG. 2 is a block diagram showing how the inter-vehicle communication system of FIG. 1 operates.
FIG. 3 is a diagram showing how sensor data can be exchanged between different vehicles.
FIG. 4 is a diagram showing Graphical User Interfaces (GUIs) are used for different vehicles that share sensor data.
FIG. 5 is a diagram showing how collision information can be exchanged between different vehicles.
FIGS. 6 and 7 are diagrams showing how kinetic state information for multiple vehicles can be used to identify road direction. FIGS. 8 and 9 are diagrams showing how the inter- vehicle communication system is used to help avoid collisions.
FIG. 10 is a diagram showing how an emergency signal is broadcast to multiple vehicles from a police vehicle.
FIGS. 11 and 12 are diagrams showing sensors are used to indicate proximity of a local vehicle to other objects.
FIGS. 13 and 14 show different sensor and communication envelopes that are used by the inter-vehicle communication system.
FIG. 15 is a block diagram showing the different data inputs and outputs that are coupled to an inter-vehicle communication processor. FIG. 16 is a block diagram showing how the processor in FIG. 16 operates. FIG. 17 is a block diagram illustrating a first embodiment of the present invention.
FIG. 18 is a block diagram illustrating a second embodiment of the present invention.
FIG. 19 is a block diagram of a specific instance of a vehicular wireless network according to the first embodiment of the present invention disclosed in Fig. 17.
FIG. 20 is block diagram of a specific instance of a vehicular wireless network according to the second embodiment of the present invention disclosed in Fig. 18.
FIG. 21 is a stylized profile of an automobile illustrating the physical location of some of the components described in Fig. 20.
DETAILED DESCRIPTION FIG. 1 shows a multi-vehicle communication system 12 that allows different vehicles to exchange kinematic state data. Each vehicle 14 may include one or more sensors 18 that gather sensor information around the associated vehicle 14. A transmitter/receiver (transceiver) in the vehicle 14 transmits to other vehicles kinematic state data 19 for objects detected by the sensors 18 and kinematic state data 17 for the vehicle itself. A Central Processing Unit (CPU) 20 in the vehicle 14 is coupled between the sensors 18 and transceivers 16. The CPUs 20 display the sensor information acquired from the local sensors 18 in the same vehicle and also displays, if appropriate, the kinematic state data 17 and 19 received from the other vehicles 14.
The CPU 20 for one of the vehicles, such as vehicle 14A, may identify an object 22 that is detected by the sensor 18 A. The CPU 20A identifies how far the object 22 is away from the vehicle 14A. The CPU 20 A may also generate a warning signal if the object 22 comes within a specific distance of the vehicle 14A. The CPU 20A then transmits the kinematic state data for object 22 to the other vehicles 14B and 14C that are within some range of vehicle 14 A. Referring to FIGS. 1 and 2, the CPU 20B from vehicle 14B establishes communication with the transmitting vehicle 14A in box 24. A navigation grid is established in box 26 that determines where the vehicle 14A is in relationship to vehicle 14B. This is accomplished by the vehicle 14A sending its kinematic state data 17 such as location, speed, acceleration, and direction to vehicle 14B. The vehicle 14B receives the kinematic state data for object 22 from vehicle 14A in box 28. The CPU 20B then determines the position of object 22 relative to vehicle 14B. The CPU 20B then displays the object on a digital map in vehicle 14B in box 32. Thus, the operator of vehicle 14B can be notified of the object 22 earlier than what would be typically possible using only the local sensors 14B. In another application, vehicle 14B receives the position of vehicle 14A and the information regarding object 22 through an intermediary vehicle 14C. The transceiver 16A in vehicle 14A transmits the kinematic state of vehicle 14A and the information regarding object 22 to vehicle 14C. The transceiver 16C in vehicle 14C then relays its own kinematic state data along with the kinematic state data of vehicle 14A and object 22 to vehicle 14B. The CPU 20B then determines from the kinematic state of vehicle 14A and the kinematic state of object 22, the position of object 22 is in relation to vehicle 14B. 
If the position of object 22 is within some range of vehicle 14B, the object 22 is displayed on a Graphical User Interface (GUI) inside of vehicle 14B (not shown). FIG. 3 shows an example of how the Inter-vehicle communication system 12 shown in FIG. 1 can be used to identify different objects that may not be detectable from a local vehicle. There are five vehicles shown in FIG. 3. Vehicle D is in an intersection 40. A vehicle A is heading into the intersection 40 from the east and another vehicle B is heading into the intersection 40 coming from the west. Vehicle E or vehicle F may not be able to see either vehicle A or vehicle B. For example, a building 44 obstructs easterly views by vehicles E and F and a tree 46 obstructs a westerly view by vehicle E and F.
Vehicle A or vehicle B may be entering the intersection 40 at a particular speed and distance that is likely to collide with vehicle E or vehicle F. Vehicle E or vehicle F could avoid the potential collision if notified in sufficient time. However, the tree 46 and building 44 prevent vehicles E and F from seeing either vehicle A or vehicle B until they have already entered the intersection 40.
The inter-vehicle communication system warns both vehicle E and vehicle F of the oncoming vehicles B and A. Vehicle D includes multiple sensors 42 that sense objects in front, such as vehicle C, in the rear, such as vehicle E, or on the sides, such as vehicles A and B. A processor in vehicle D (not shown) processes the sensor data and identifies the speed, direction and position of vehicles A and B. A transceiver 48 in vehicle D transmits the data identifying vehicles A and B to vehicle E. A transceiver 48 in vehicle E then relays the sensor data to vehicle F.
Thus, both vehicles E and F are notified about oncoming vehicles A and B even when vehicles A and B cannot be seen visually by the operators of vehicles E and F or detected electronically by sensors on vehicle E and F. Thus the sensing ranges for vehicles E and F are extended by receiving the sensing information from vehicle D.
FIG. 4 shows three different screens 50, 52, and 54 that are displayed by vehicles D, E, and F, respectively. Each of screens 50, 52, and 54 are Graphical User Interfaces or other display systems that display sensor data and vehicle information from one or more different vehicles. Referring to screen 50, vehicle D shows different motion vectors that represent objects detected by sensors 42 (FIG. 3). A motion vector 56 shows vehicle B approaching from the west, a motion vector 58 shows vehicle C moving in front of vehicle D in a northern direction, a motion vector 60 shows vehicle A approaching from the east and a motion vector 62 shows vehicle E approaching the back of vehicle D from a southern direction.
Screen 52 shows objects displayed by the GUI in vehicle E. Motion vector 64 shows vehicle D moving in front of vehicle E and motion vectors 60 and 56 show vehicles A and B coming toward vehicle D from the east and the west, respectively. Even if the vehicles A and B can not be detected by sensors in vehicle E, the vehicles are detected by sensors in vehicle D and then transmitted to vehicle E. Screen 54 shows the motion vectors displayed to an operator of vehicle F. The motion vectors 64 and 66 shows vehicles D and E traveling north in front of vehicle F. The vehicles A and B are shown approaching vehicle D from the east and west, respectively. The inter-vehicle communication system allows vehicles to effectively see around corners and other obstructions by sharing sensor information between different vehicles. This allows any of the vehicles to anticipate and avoid potential accidents. For example, the operator of vehicle E can see by the displayed motion vector 60 that vehicle A is traveling at 40 MPH. This provides the operator of vehicle E a warning that vehicle A may not be stopping at intersection 40 (FIG. 3). Even if vehicle E has the right of way, vehicle E can avoid a collision by slowing down or stopping while vehicle A passes through intersection 40.
In a similar manner, the motion vector 56 for vehicle B indicates deceleration and a current velocity of only 5 MPH. Deceleration may be indicated by a shorter motion vector 56 or by an alphanumeric display around the motion vector 56. The motion vector 56 indicates that vehicle B is slowing down or stopping at intersection 40. Thus, if vehicle B were the only other vehicle entering intersection 40, the operator of vehicle E is more confident about entering intersection 50 without colliding into another vehicle.
Referring to screen 54, vehicle F may not be close enough to intersection 40 to worry about colliding with vehicle A. However, screen 54 shows that vehicle E may be on a collision track with vehicle A. If vehicle E were following too close to vehicle D, then vehicle E could possibly run into the pileup that may occur between vehicle D and vehicle A. The operator of vehicle F seeing the possible collision between vehicles D and A in screen 54 can anticipate and avoid the accident by slowing down or stopping before entering the intersection 40. The operator of vehicle F may also try and prevent the collision by honk a horn.
FIG. 5 shows another example of how sensor data and other vehicle kinematic state data can be transmitted between different vehicles. Vehicles 70, 72, and 74 are all involved in an accident. At least one of the vehicles, in this case vehicle 70, broadcasts a collision indication message 76. The accident indication message 76 can be triggered by anyone of multiple detected events. For example, the collision indication message 76 may be generated whenever an airbag is deployed in vehicle 76. Alternatively, sensors 78 in the vehicle 70 detect the collision. The detected collision causes a processor in vehicle 70 to broadcast the collision indication message 76.
In one example, the collision indication message 76 is received by a vehicle 80 that is traveling in the opposite traffic lane. The vehicle 80 includes a transceiver 81 that in this example relays the collision indication message 76 to another vehicle 84 that is traveling in the same direction. Vehicle 84 relays the message to other vehicles 82 and 86 that are traveling in the direction of the on coming collision.
Processors 83 and 87 in the vehicles 82 and 86, respectively, receive the collision indication message 76 and generate a warning message that may either be annunciated or displayed to drivers of vehicles 82 and 86. In another example, the collision indication message 76 is received by vehicle 82 directly from vehicle 70. The processor 83 in vehicle 82 generates a warning indication and also relays the collision indication message 76 to vehicle 86. The collision indication message 76 and other sensor data and messages can be relayed by any vehicle traveling in any direction.
FIGS. 6 and 7 show an example of how the inter-vehicle communication system can be utilized to identify road direction. FIG. 6 shows three vehicles A, B, and C traveling along the same stretch of highway 88. Each vehicle includes a Global Positioning System (GPS) that periodically identifies a current longitude and latitude. Each vehicle A, B, and C generates kinematic state data 92 that includes position, velocity, acceleration or deceleration, and/or direction.
The kinematic state data 92 for each vehicle A, B, and C is broadcast to the other vehicles in the same vicinity. The vehicles A, B, and C receive the kinematic state data from the other vehicles and display the information to the vehicle driver. For example, in FIG. 7 shows a GUI 94 in vehicle A (FIG. 6). The GUI 94 shows any combination of the position, driving direction, speed, distance, and acceleration for the other vehicles B and C. Vectors 96 and 98 can visually represent this kinematic state data.
For example, the position of vector 98 represents the longitude and latitude of vehicle B and the direction of vector 98 represents the direction that vehicle B is traveling. The length of vector 98 represents the current speed and acceleration of vehicle 98. Displaying the kinematic state of other vehicles B and C allows the driver of vehicle A to anticipate curves and other turns in highway 88 (FIG. 6) regardless of the weather conditions.
Referring back to FIG. 6, the kinematic state data 92 for the vehicles A, B and C does not have to always be relayed by other vehicles. For example, the kinematic state data 92 can be relayed by a repeater located on a stationary tower 90. This may be desirable for roads with little traffic where there are generally long distances between vehicles on the same highway 88. There also may be transmitters 91 located on the sides of highway 88 that transmit location data 93. The transmitters may be located intermittently along different stretches of highway 88 to provide location references and to also identify dangerous curves in certain stretches of the highway 88.
The transmitters 91 may also send along with the location data 93 some indication that the data is being transmitted from a stationary reference post. The transmitters 91 can also include temperature sensors that detect different road conditions, such as ice. An ice warning is then generated along with the location data. The processors in the vehicles A, B and C then display the transmitters 91 as nonmoving objects 100 along with any road condition information in the GUI 94.
FIGS. 8 and 9 show in more detail how collision information is exchanged and used by different vehicles. In FIG. 8, vehicle A has collided with a tree 102. Upon impact with tree 102, the vehicle A deploys one or more airbags. A processor 104 in vehicle A detects the airbag deployment and automatically sends out an air bag deployment message 106 over a cellular telephone network to an emergency vehicle service such as AAA. At the same time, the processor 104 broadcasts the kinematic state data 108 of vehicle A. The kinematic state data 108 indicates a rapid deceleration of vehicle A. Along with the kinematic state data 108 the processor 104 may send a warning indication.
Another vehicle B receives GPS location data 112 from one or more GPS satellites 110. Onboard sensor data 114 is also monitored by processor 116 to determine the speed, direction, etc. of vehicle B. The onboard sensor data 114 may also include data from one or more sensors that are detecting objects within the vicinity of vehicle B.
The processor 116 in vehicle B determines a current location of vehicle B based on the GPS data 112 and the onboard sensor data 114. The processor 116 then determines if a danger condition exists by comparing the kinematic state of vehicle A with the kinematic state of vehicle B. For example, if vehicle A is within 50 feet of vehicle B, and vehicle B is traveling at 60 MPH, then processor 116 may determine that vehicle B is in danger of colliding with vehicle A. In this situation, a warning signal may be generated by processor 116. Alternatively, if vehicle A is 100 feet in front of vehicle B, and vehicle B is only traveling at 5 MPH, processor 116 may determine that no danger condition currently exists for vehicle B and no warning signal is generated.
FIG. 9 shows one example of how a GUI 105 in vehicle B displays information received from vehicle A and from local sensors. The processor 116 displays vehicle A directly in front of vehicle B. Either from sensor data transmitted from vehicle A or from local sensors, the processor 116 generates a motion vector 113 that identifies another vehicle C approaching from the left. The local sensors in vehicle B also detect another object 107 off to the left of vehicle B.
The processor 116 receives all of this sensor data information and generates a steering queue 109 that determines the best path for avoiding vehicle A, vehicle C and object 107. In this example, it is determined that vehicle B should move in a northeasterly direction to avoid colliding with all of the detected objects. The processor 116 can also calculate a time to impact 111 with the closest detected object by comparing the kinematic state of the vehicle B with the kinematic states of the detected objects.
FIG. 10 shows another example of how vehicle information may be exchanged between different vehicles. In this example, a police vehicle 120 is in pursuit of a chase vehicle 126. Police vehicle 120 may be entering an intersection 128. In order to avoid colliding with other vehicles that may be entering intersection 128, the police vehicle 120 broadcasts an emergency warning signal 124. The emergency warning signal 124 notifies all of the vehicles 122 that an emergency vehicle 120 is nearby and that the vehicles 122 should slowdown or stop.
Processors 130 in the vehicles 122 can generate an audible signal to the vehicle operator, display a warning icon on a GUI, and/or show the location of police vehicle 120 on the GUI. In another implementation, the processor 130 in each vehicle 122 receives the kinematic state of police vehicle 120 and determines a relative position of the local vehicle 122 in relation to the police vehicle 120. If the police vehicle 120 is within a particular range, the processor 130 generates a warning signal and may also automatically slow or stop the vehicle 122.
In another implementation, the police vehicle 120 sends a disable signal 132 to a processor (not shown) in the chase vehicle 126. The disable signal 132 causes the processor in chase vehicle 126 to automatically slow down the chase vehicle 126 and then eventually stop the chase vehicle 126. FIGS. 11 and 12 show another application for the sensors 136 that are located around vehicle A. Vehicles A and B are parked in parking slots 138 and 140, respectively. Vehicle A has pulled out of parking slot 138 and is attempting to negotiate around vehicle B. The operator of vehicle A cannot see how far vehicle A is from vehicle B. The sensors 136 detect objects that come within a certain distance of vehicle
A. These sensors 136 may be activated only when the vehicle A is traveling below a certain speed, or may be activated at any speed, or may be manually activated by the vehicle operator. In any case, the sensors 136 detect vehicle B and display vehicle B on a GUI 144 shown in FIG. 12. The processor in vehicle A may also determine the closest distance between vehicle A and vehicle B and also identify the distance to impact and the particular area of impact 145 on vehicle A.
As vehicle A moves within some specified distance of vehicle B, the processor 146 may generate a warning signal that is either annunciated or displayed to the vehicle operator on the GUI 144. This sensor system allows the vehicle operator to avoid a slow-speed collision caused by the vehicle operator not being able to see the sides of the vehicle A. In another example, sensors on vehicle B (not shown) may generate a warning signal to processor 146 when vehicle A moves too close to vehicle B.

FIG. 13 shows an example of sensor and communication envelopes that are generated by sensors and transceivers in vehicle A. A first local sensor envelope 150 is created around the vehicle A by multiple local sensors 158. The sensor data from the local sensor envelope 150 is used by a processor to detect objects located anywhere around vehicle A. Transceivers 156 are used to generate communication envelopes 152. The transceivers 156 allow communications between vehicles that are located generally in front of and behind vehicle A. However, it should be understood that any variety of communication and sensor envelopes can be generated by transceivers and sensors in vehicle A.
FIG. 14 shows another example of different sensor envelopes that can be generated around vehicle A. A first type of sensor, such as an infrared sensor, may be located around vehicle A to generate close proximity sensor envelopes 160 and 162. A second type of sensor and antenna configuration, such as radar antennas, may be used to generate larger sensor envelopes 164, 166, and 168.
The local sensor envelopes 160 and 162 may be used to detect objects in close proximity to vehicle A, such as parked cars and pedestrians. The larger radar envelopes 164, 166, and 168 may be used for detecting objects that are further away from vehicle A, such as other vehicles at longer distances. The different sensor envelopes may dynamically change according to how fast the vehicle A is moving. For example, envelope 164 may be used when vehicle A is moving at a relatively low speed. When vehicle A accelerates to a higher speed, object detection is needed at longer distances. Thus, the sensors may dynamically change to larger sensor envelopes 166 and 168 when vehicle A is moving at higher speeds. Any combination of local sensor envelopes 160 and 162 and larger envelopes 164, 166, and 168 may be used.
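The speed-dependent envelope selection can be sketched as a simple lookup. The speed thresholds and the envelope names (keyed to reference numerals 160 through 168) are illustrative assumptions:

```python
def active_envelopes(speed_kmh):
    """Select which sensor envelopes to use at a given vehicle speed.

    Hypothetical thresholds: the specification only states that the
    envelopes grow as vehicle speed increases.
    """
    envelopes = ["proximity_160", "proximity_162"]  # close-in IR, always on
    if speed_kmh < 40:
        envelopes.append("radar_164")   # short-range radar at low speed
    elif speed_kmh < 90:
        envelopes.append("radar_166")   # medium-range radar
    else:
        envelopes.append("radar_168")   # longest range at highway speed
    return envelopes
```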
FIG. 15 is a detailed diagram of the components in one of the vehicles used for gathering local sensor data and receiving external sensor data from other vehicles. A processor 170 receives sensor data from one or more local object detection sensors 172. The sensors may be infrared sensors, radar sensors, or any other type of sensing device that can detect objects. Communication transceivers 174 exchange sensor data, kinematic state data, and other notification messages with other vehicles. Any wireless communication device can be used for communicating information between the different vehicles, including microwave, cellular, Citizens Band, two-way radio, etc.
A GPS receiver 176 periodically reads location data from GPS satellites. Vehicle sensors 178 include any of the sensors or monitoring devices in the vehicle that detect vehicle direction, speed, temperature, collision conditions, braking state, airbag deployment, etc. Operator inputs 180 include any monitoring or selection parameter that may be input by the vehicle operator. For example, the operator may wish to view all objects within a 100 foot radius. In another situation, the operator may wish to view all objects within a one mile radius. The processor displays the objects within the range selected by the operator on GUI 182. In another situation, the speed of the vehicle identified by vehicle sensors 178 may determine what data from sensors 172 or from transceivers 174 is displayed on the GUI 182. For example, at higher speeds, the processor may want to display objects that are at further distances from the local vehicle.

FIG. 16 is a block diagram showing how the processor in one of the vehicles operates. In block 190, the processor receives sensor data from sensors on the local vehicle. The processor performs image recognition algorithms on the sensor data in block 192. If an object is detected in block 194, kinematic state data for the object is determined in block 200. If the detected object is within a specified range in block 196, then the object is displayed on the GUI in block 198. For example, the current display range for the vehicle may only be for objects detected within 200 feet. If the detected object is outside of 200 feet, it will not be displayed on the GUI.
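The display-range check of blocks 196 and 198 amounts to filtering the tracked objects by distance from the local vehicle. A minimal sketch follows; the data structure and names are hypothetical:

```python
import math

def objects_in_display_range(tracked, own_pos, range_ft):
    """Filter tracked objects down to those inside the operator-selected
    display radius.

    `tracked` maps an object identifier to its (x, y) position in feet;
    these names are illustrative, not from the specification.
    """
    return {
        obj_id: pos
        for obj_id, pos in tracked.items()
        if math.hypot(pos[0] - own_pos[0], pos[1] - own_pos[1]) <= range_ft
    }
```

With a 100 foot display range, an object 50 feet ahead is kept for display while one 500 feet ahead is suppressed.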
At the same time, the processor receives kinematic state data and object detection data from the other vehicles in block 202. Voice data from the other vehicles can also be transmitted along with the kinematic state data. In a similar manner as blocks 196 and 198, if any object detected by another vehicle is within a current display range in block 206, then the other object is displayed on the GUI in block 208. At the same time, the processor determines the current kinematic state of its own local vehicle in block 205.
The processor in block 210 compares the kinematic state information of the local vehicle with all of the other objects and vehicles that are detected. If a collision condition is imminent based on the comparison, then the processor generates a collision warning in block 212. A collision condition is determined in one example by comparing the current kinematic state of the local vehicle with the kinematic states of the detected objects. If the velocity vector (current speed and direction) of the local vehicle is about to intersect with the velocity vector for another detected object, then a collision condition is indicated and a warning signal generated. Collision conditions are determined by analyzing the bearing rate of change of the detected object with respect to the local vehicle. For example, if the bearing of the detected object continues to change, it is not likely that a collision condition will occur and no warning signal is generated. However, if the bearing remains constant for the detected object with respect to the local vehicle, the processor identifies a possible collision condition. When the range and speed between the detected object and the local vehicle are within a first probability-of-avoidance range, a first warning signal is generated. At a second probability-of-impact range, a second collision signal is generated.
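The two-stage, constant-bearing collision test in blocks 210 and 212 can be sketched as follows. The bearing tolerance and the two time thresholds standing in for the probability-of-avoidance and probability-of-impact ranges are assumptions for illustration:

```python
def collision_risk(bearings_deg, range_m, closing_speed_ms,
                   bearing_tol_deg=1.0, warn_time_s=10.0, alarm_time_s=4.0):
    """Grade the collision risk from successive bearing measurements of one
    detected object plus its current range and closing speed.

    A nearly constant bearing with a positive closing speed indicates a
    collision course; two time-to-range thresholds then select the warning
    stage. All threshold values are hypothetical.
    """
    # Bearing rate of change across the observation window.
    rates = [abs(b2 - b1) for b1, b2 in zip(bearings_deg, bearings_deg[1:])]
    constant_bearing = all(r <= bearing_tol_deg for r in rates)
    if not constant_bearing or closing_speed_ms <= 0:
        return "none"            # bearing drifting, or object opening range
    time_to_range = range_m / closing_speed_ms
    if time_to_range <= alarm_time_s:
        return "second_warning"  # within the probability-of-impact range
    if time_to_range <= warn_time_s:
        return "first_warning"   # within the probability-of-avoidance range
    return "monitor"
```

A steady 45 degree bearing at 60 m and 10 m/s closing speed grades as a first warning; the same geometry at 30 m escalates to the second warning.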
Fig. 17 depicts a plurality of sensor devices 1011 located aboard an automobile. Though each of the sensors 1011 is annotated with the same number, this is merely to indicate that the sensors perform similar functions, and not to suggest that each of the sensor devices 1011 is exactly the same. Rather, each of the devices 1011 could be an IR sensor, a radar sensor, or another variety of sensor placed to monitor any condition within the automobile or exterior to the automobile that may be of use when implementing collision avoidance, situational awareness, navigation, or system diagnostic functions.
Each of the sensor devices 1011 is linked to a processing unit 1041 located within the automobile by a plurality of wireless links 1051. The wireless links 1051 are uni-directional in nature, because the sensor devices 1011 typically transmit only raw data to processing unit 1041.
There is also a plurality of devices 1021 linked to the processing unit 1041 by a plurality of bi-directional wireless links 1061. Again, the similar numbering system merely indicates that they are a class of devices that both transmit and receive data from the processing unit, and does not imply that each of the devices 1021 is exactly the same. For example, it is anticipated that the class of devices 1021 might include a security system, an environmental control system, a number of audio and video entertainment devices, a cellular phone, a GPS receiver and antenna, or personal digital assistants (PDA). In general the devices 1021 will be located within the automobile. However, some of the devices may be located outside the automobile, as in the case of a cellular phone or PDA.
There is also a graphic user interface (GUI) 1031 located in the automobile and linked to the processing unit 1041 by a bi-directional wireless or hardwired link 1061. The GUI is the means by which the driver of the automobile can input commands to control a variety of the devices 1021. The driver also receives system status data at GUI 1031 from the processing unit 1041. There are a variety of forms that GUI 1031 may take, including a touch-screen display or heads-up display similar to those typically found in military aircraft. Processing unit 1041 may transmit data directly to GUI 1031 from sensor devices 1011 or may first perform a sensor fusion operation when multiple sensors are monitoring the same condition. Processing unit 1041 may also transmit data received from one or more of the devices 1021 to GUI 1031. The uni-directional wireless links 1051 and the bi-directional wireless links 1061 may be one of several types, depending upon the specific sensor or system that is wirelessly linked to the processing unit. For example, one of the sensor devices 1011 might require an IEEE 802.11 protocol, while one of the devices 1021 utilizes a Motorola Bluetooth link. In addition to the 802.11 and Bluetooth links mentioned, the processing unit 1041 has the capability of interfacing with sensor devices 1011 or devices 1021 using an analog cellular link, a Cellular Digital Packet Data (CDPD) link, a Satcom link, or a hardwired link.
Finally, the number of sensor devices 1011 and devices 1021, or the pattern in which they are depicted in Fig. 17, should not be considered a limitation. The number of devices and the physical locations of 1011, 1021, 1031, and 1041 within the automobile will vary depending on the specific design.
Fig. 18 is an illustration of a second embodiment of the present invention. Like the first embodiment depicted in Fig. 17, there are a plurality of sensor devices 1011, a plurality of devices 1021, a GUI 1031, and a plurality of uni-directional wireless links 1051 and bi-directional links 1061. The dashed lines divide the interior of an automobile into separate zones, with the engine, passenger, and trunk compartments represented by zones 2021, 2041, and 2061, respectively. Zone 2081 represents the area outside the automobile. The number of devices and wireless links located in each zone is arbitrary; there may be more or fewer depending on the specific design.
Each of the devices 1011, 1021, and 1031 is wirelessly linked with a signal interface unit 2031 that is located in the same zone. The signal interface units 2031 are coupled to a bus 2051 that is installed to run throughout all zones of the automobile. The processing unit 1041 is also coupled to the bus 2051. Once signals are received by the signal interface units 2031, they may be placed on bus 2051 and transmitted to the processing unit 1041. Similarly, signals are transmitted from processing unit 1041 to devices 1021 and GUI 1031 via the bus/signal interface route. Device 1021 is located outside of the automobile in zone 2081 to indicate that there may be devices such as PDAs or cellular phones that receive or transmit data to the processing unit 1041 via a bi-directional wireless link.
This zone bus structure takes advantage of the natural shielding offered by the different structural compartments of an automobile. Each zone contains a single signal interface unit that serves as the point where wireless signals are received and transmitted in each zone. The number of zones may vary depending on the type of automobile in which the invention is installed. For example, a sport-utility vehicle would require only two signal interface units 2031 because it effectively has only two zones, the engine compartment and the passenger/cargo compartment. The processing unit 1041 is shown located in zone 2041, but it might be moved to any zone depending on the space requirements of specific designs.
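The per-zone forwarding role of the signal interface units can be modeled minimally as follows. The class and field names are illustrative, and the shared bus is reduced to a plain list:

```python
class SignalInterfaceUnit:
    """Minimal model of a per-zone signal interface unit: it receives a
    wireless frame from a device in its zone and places the frame on the
    shared bus addressed to the processing unit.

    Hypothetical sketch; names and frame fields are not from the
    specification.
    """
    def __init__(self, zone, bus):
        self.zone = zone
        self.bus = bus

    def receive_wireless(self, device_id, payload):
        # Tag the frame with its zone of origin and forward it on the bus.
        self.bus.append({"zone": self.zone, "device": device_id,
                         "dest": "processing_unit", "payload": payload})

bus = []
engine_siu = SignalInterfaceUnit("engine", bus)
cabin_siu = SignalInterfaceUnit("passenger", bus)
engine_siu.receive_wireless("thermal_306", {"temp_c": 92})
cabin_siu.receive_wireless("ir_316", {"occupied": True})
```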
Both embodiments of the present invention described above will facilitate detection of people within the automobile, and, based upon that detection, various functions may be implemented by processing unit 1041. For example, if a subset of the sensor devices 1011 happened to be IR sensors installed in the passenger compartment of an automobile, the sensors can indicate when a person is within the vehicle. Based upon this occupancy data, the processing unit could operate the lighting system more efficiently by turning off the dome light when the vehicle is parked and the last occupant leaves the vehicle, rather than the usual automatic shut-off. As another example, typically keys must be in the ignition to operate the car radio and environmental controls. These systems could be enabled merely by a person's presence in the vehicle. The invention could also prevent airbags from being deployed in an accident for passenger seats where no passenger is sitting. An alarm system could be configured to disable the ignition when an unauthorized occupant is detected or to call 911 with the current location of the vehicle taken from the GPS system.
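The occupancy-driven behaviors listed above can be expressed as a small rule table. The action names and decision conditions are assumptions made for illustration:

```python
def occupancy_actions(seat_occupancy, ignition_on, authorized):
    """Map IR occupancy data to vehicle subsystem actions.

    A hypothetical rule sketch of the examples in the text: dome light
    follows actual occupancy, convenience systems enable on presence,
    airbags are suppressed for empty seats, and an unauthorized occupant
    triggers the alarm response.
    """
    actions = []
    anyone_in = any(seat_occupancy.values())
    # Dome light follows occupancy instead of a fixed shut-off timer.
    actions.append("dome_light_on" if anyone_in and not ignition_on
                   else "dome_light_off")
    if anyone_in and not ignition_on:
        actions.append("enable_radio_and_climate")  # no key required
    # Suppress airbag deployment for unoccupied seats.
    for seat, occupied in seat_occupancy.items():
        if not occupied:
            actions.append(f"disable_airbag_{seat}")
    if anyone_in and not authorized:
        actions.append("disable_ignition_and_call_911")
    return actions
```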
Fig. 19 is a specific instance of a vehicular wireless network according to the first embodiment of the present invention disclosed in Fig. 17. The dashed lines in Fig. 19 indicate an engine compartment region 300, a passenger compartment region 310, a trunk compartment region 320, and a region 330 that represents the area external to the automobile.
Engine compartment 300 contains two IR sensors 302 that face forward to pick up heat signatures emanating from other automobiles. Sensor 304 is a RF transmitter, receiver, and antenna that detects other automobiles. Sensor 306 is a thermal sensor to monitor engine temperature. Each of the sensors 302, 304, and 306 wirelessly transmits data to the processing unit 318 located in the passenger compartment 310 of the automobile with an IEEE 802.11 wireless link 340.
Passenger compartment 310 contains a touch screen display 312 which allows the driver to see the status of various vehicle subsystems along with providing a means to input commands. Car audio components 314 are also located within the passenger compartment. Touch-screen display 312 and car audio components 314 are linked to the processing unit 318 by bi-directional wireless Bluetooth links 350. Additionally, two IR sensors 316 are installed to monitor the occupancy state of the automobile. The two sensors 316 are linked to processing unit 318 by wireless IEEE 802.11 links 340.
Trunk compartment 320 contains GPS receiver and antenna 322 and multiband cellular receiver/transmitter/antenna 324. The GPS subsystem 322 and cellular subsystem 324 are linked to processing unit 318 in the passenger compartment via bi-directional wireless Bluetooth links 350.
A mobile PDA unit 332 is located outside of the automobile in region 330, transmitting data to and receiving data from processing unit 318 via bi-directional Bluetooth link 350. Fig. 20 is a specific instance of a vehicular wireless network according to the second embodiment of the present invention disclosed in Fig. 18. The dashed lines in Fig. 20 indicate an engine compartment region 400, a passenger compartment region 410, a trunk compartment region 420, and a region 430 that represents the area external to the automobile. Engine compartment 400 contains two IR sensors 402 that face forward to pick up heat signatures emanating from other automobiles. Sensor 404 is a RF transmitter, receiver, and antenna that detects other automobiles. Sensor 406 is a thermal sensor to monitor engine temperature. Each of the sensors 402, 404, and 406 wirelessly transmits data to the signal interface unit 440 located in the engine compartment 400 with IEEE 802.11 wireless links 460.
Passenger compartment 410 contains a touch screen display 412 which allows the driver to see the status of various vehicle subsystems along with providing a means to input commands. Car audio components 414 are also located within the passenger compartment. Touch-screen display 412 and car audio components 414 are linked to a second signal interface unit 440 by bi-directional wireless Bluetooth links 470. Additionally, two IR sensors 416 are installed to monitor the occupancy state of the automobile. The two sensors 416 are linked to the second signal interface unit 440 by wireless IEEE 802.11 links 460.

Trunk compartment 420 contains GPS receiver and antenna 424 and multiband cellular receiver/transmitter/antenna 426. The GPS subsystem 424 and cellular subsystem 426 are linked to a third signal interface unit 440 located in the trunk compartment via bi-directional wireless Bluetooth links 470.
A mobile PDA unit 432 is located outside of the automobile in region 430, transmitting data to and receiving data from the second signal interface unit 440 via bi-directional Bluetooth link 470. The mobile PDA unit 432 can link to any of the signal interface units 440 within the automobile; it is merely shown connected to the second unit in the passenger compartment by way of example. Each of the signal interface units 440 is coupled to a fiber-optic bus 450 installed to extend into all zones 400, 410, and 420 of the automobile. The processing unit 422 is also located in the trunk compartment 420 and is coupled to fiber-optic bus 450. However, processing unit 422 could be coupled to the fiber-optic bus at any location in any region 400, 410, or 420 depending on space requirements.
Fig. 21 is a stylized profile of an automobile illustrating the physical location of some of the components described in Fig. 20. Again, three zones 500, 510, and 520 represent the engine compartment, passenger compartment, and trunk compartment, respectively, of the automobile. The signal interface units 540 are installed underneath the hood in the engine compartment 500, underneath the dome in the passenger compartment 510, and underneath the trunk lid in the trunk compartment 520. The signal interface unit 540 in the passenger compartment 510 may even share a physical location with the dome light of the automobile. The fiber-optic bus 550 runs from the engine compartment 500 to the trunk compartment 520 and the signal interface units 540 are coupled to it. The processing unit 522 is installed on the floor of the trunk section 520 and is also coupled to the fiber-optic bus 550. The sensor devices and other automobile system devices that are linked to the signal interfaces by wireless connections are not shown, but their physical locations would be optimized in the various zones of the automobile depending upon their functionality and purpose.

The system described above can use dedicated processor systems, micro controllers, programmable logic devices, or microprocessors that perform some or all of the operations. Some of the operations described above may be implemented in software and other operations may be implemented in hardware.
For the sake of convenience, the operations are described as various interconnected functional blocks or distinct software modules. This is not necessary, however, and there may be cases where these functional blocks or modules are equivalently aggregated into a single logic device, program or operation with unclear boundaries. In any event, the functional blocks and software modules or described features can be implemented by themselves, or in combination with other operations in either hardware or software.

Claims

1. An inter-vehicle communication system, comprising: a local sensor in a local vehicle for gathering sensor data around the local vehicle; a receiver in the local vehicle for receiving sensor data from another vehicle; and a processor for identifying objects according to the sensor data gathered from both the local sensor and from the other vehicle.
2. An inter-vehicle communication system according to claim 1 including a transmitter for transmitting the sensor data from the local sensor to other vehicles.
3. An inter-vehicle communication system according to claim 1 wherein the processor generates a warning signal according to how close the detected objects are from the local vehicle.
4. An inter-vehicle communication system according to claim 1 wherein the processor generates a steering queue showing what direction the local vehicle should travel to avoid any identified objects.
5. An inter-vehicle communication system according to claim 1 wherein the processor identifies kinematic states for objects detected in the sensor data.
6. An inter-vehicle communication system according to claim 1 including a GPS receiver that receives location data for the local vehicle, the processor using the location data to determine a kinematic state for the local vehicle.
7. An inter-vehicle communication system according to claim 6 wherein the processor compares the kinematic state of the local vehicle with the kinematic states of the detected objects and generates a collision warning signal according to the comparison.
8. An inter-vehicle communication system according to claim 1 wherein the processor transmits kinematic state data for both the local vehicle and for any objects detected in the sensor data from the other vehicle.
9. An inter-vehicle communication system according to claim 8 wherein the kinematic state data includes both a direction and speed of both the local vehicle and any objects identified in the sensor data.
10. An inter-vehicle communication system according to claim 1 wherein the receiver gathers sensor data from the local vehicle and then relays that sensor data to a second vehicle.
11. An inter-vehicle communication system according to claim 1 wherein the processor broadcasts an emergency notification signal to the other vehicles.
12. An inter-vehicle communication system according to claim 11 wherein the emergency notification signal includes an airbag deployment indication.
13. An inter-vehicle communication system according to claim 1 including multiple sensors for sensing objects both on the sides and in front of the local vehicle.
14. An inter-vehicle communication system according to claim 13 including infrared sensors for generating sensor information around a local perimeter of the local vehicle and a radar sensor for generating sensor data outside of the local perimeter.
15. A method for detecting objects, comprising: generating sensor data for areas around a local vehicle; identifying an object in the sensor data; determining a kinematic state for the object identified in the sensor data; determining a kinematic state for the local vehicle; comparing the kinematic state of the object with the kinematic state of the vehicle; and generating a warning indication when the comparison indicates a possible collision between the identified object and the local vehicle.
16. A method according to claim 15 including generating sensor data in front, in back and on sides of the vehicle and identifying any objects that may be approaching the local vehicle from the front, back, or the sides.
17. A method according to claim 15 including displaying identified objects that come within a preselected perimeter of the local vehicle.
18. A method according to claim 17 including identifying a distance to impact between the identified objects and the local vehicle.
19. A method according to claim 17 including identifying where the identified objects are located in relationship to the local vehicle.
20. A method according to claim 15 including receiving the kinematic state of another vehicle and displaying the kinematic state of the local vehicle in relation to the other vehicle.
21. A method according to claim 15 including: generating sensing data in an area around a first vehicle; detecting an object in the sensing data; determining kinematic state for the detected object; determining kinematic state for the first vehicle; transmitting the kinematic state for the first vehicle and the object to an intermediary vehicle; determining kinematic state for the intermediary vehicle; transmitting the kinematic state for the object, the first vehicle and the intermediary vehicle from the intermediary vehicle to the local vehicle; and displaying the kinematic state for the object, the first vehicle and the intermediary vehicle in relation to the kinematic state of the local vehicle.
22. A method according to claim 15 including receiving an emergency signal from a first vehicle that includes a kinematic state of the first vehicle and a danger indication signal and displaying the kinematic state and danger indication signal in the local vehicle.
23. A method according to claim 22 including automatically slowing down or stopping the local vehicle according to the emergency signal.
24. A method according to claim 15 including automatically transmitting a warning signal to other vehicles when an emergency condition occurs.
25. A method according to claim 24 wherein the emergency condition comprises activation of a collision air bag.
26. A method according to claim 15 including: receiving road condition data and an identifier identifying where the road condition is located; and displaying the location of the road condition on an electronic map.
27. A method according to claim 26 including transmitting the road condition data from the location where the road condition is located.
28. A method according to claim 27 including locating road condition transmitters along sides of the road that identify a geographical location and detect icy road conditions and transmitting geographical location and the icy road conditions in the road condition data.
29. A method according to claim 15 including generating a steering queue that provides a direction for the local vehicle to move to avoid the detected object.
30. A method according to claim 15 including identifying a distance to impact of the local vehicle with the detected object.
31. An apparatus, comprising: a plurality of devices located inside or upon an automobile, with each device possessing a wireless link; a graphic user interface located within said automobile, accessible to the driver of said automobile; and a processing unit located within said automobile that exchanges signals with the devices and the graphic user interface via a plurality of wireless links.
32. The apparatus of claim 31 wherein said devices comprise: a plurality of sensors, including IR and radar sensors; a plurality of audio and video entertainment devices; a plurality of automobile subsystems including environmental controls, security systems, and lighting; a plurality of transmitters, receivers, and associated antennas; and a plurality of Personal Digital Assistants.
33. The apparatus of claim 31 wherein said wireless links comprise analog cellular links, Cellular Digital Packet Data (CDPD) links, Satcom links, IEEE 802.11 links, and Motorola Bluetooth links.
34. The apparatus of claim 31 wherein said graphic user interface comprises a heads-up display.
35. The apparatus of claim 31 wherein said processing unit contains a plurality of inertial reference sensors.
PCT/US2002/020403 2001-06-26 2002-06-26 Method and apparatus for detecting possible collisions and transferring information between vehicles Ceased WO2003001474A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2002349794A AU2002349794A1 (en) 2001-06-26 2002-06-26 Method and apparatus for detecting possible collisions and transferring information between vehicles

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US09/892,293 US20020140548A1 (en) 2001-03-30 2001-06-26 Method and apparatus for a vehicular wireless network
US09/892,293 2001-06-26
US09/892,333 US6615137B2 (en) 2001-06-26 2001-06-26 Method and apparatus for transferring information between vehicles
US09/892,333 2001-06-26

Publications (2)

Publication Number Publication Date
WO2003001474A2 true WO2003001474A2 (en) 2003-01-03
WO2003001474A3 WO2003001474A3 (en) 2008-01-03

Family

ID=27129011

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2002/020403 Ceased WO2003001474A2 (en) 2001-06-26 2002-06-26 Method and apparatus for detecting possible collisions and transferring information between vehicles

Country Status (2)

Country Link
AU (1) AU2002349794A1 (en)
WO (1) WO2003001474A2 (en)

WO2020086127A1 (en) * 2018-10-22 2020-04-30 Ebay Inc. Intervehicle communication and notification
WO2020229077A1 (en) * 2019-05-13 2020-11-19 Volkswagen Aktiengesellschaft Warning about a hazard situation in road traffic

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69322349T3 (en) * 1992-09-30 2004-06-09 Hitachi, Ltd. Information system for the driver of a vehicle
DE4237987B4 (en) * 1992-11-11 2004-07-22 Adam Opel Ag Electronic device
US5983161A (en) * 1993-08-11 1999-11-09 Lemelson; Jerome H. GPS vehicle collision avoidance warning and control system and method
US5572201A (en) * 1994-08-05 1996-11-05 Federal Signal Corporation Alerting device and system for abnormal situations
US5907293A (en) * 1996-05-30 1999-05-25 Sun Microsystems, Inc. System for displaying the characteristics, position, velocity and acceleration of nearby vehicles on a moving-map
DE19922608A1 (en) * 1999-05-17 2000-11-23 Media Praesent Ursula Nitzsche Wireless emergency signal transmission, especially to or between vehicles, involves using predefined coded RDS format message at low power in VHF radio band, preferably using free frequency

Cited By (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1441321A3 (en) * 2003-01-21 2009-01-14 Robert Bosch Gmbh Method for information transmission between mobile stations
DE10326648A1 (en) * 2003-06-11 2005-01-13 Daimlerchrysler Ag Object detection using vehicle radar, classifying object data from several cooperating vehicles according to set criteria
FR2896594A1 (en) * 2006-01-24 2007-07-27 Renault Sas METHOD FOR PERCEPTION BY A VEHICLE OF ITS ENVIRONMENT
US8056857B2 (en) 2006-02-15 2011-11-15 Be Aerospace, Inc. Aircraft seat with upright seat back position indicator
WO2008022817A1 (en) * 2006-08-21 2008-02-28 Continental Automotive Gmbh Driver assistance system for local and time assessment and prediction of the driving dynamics of a vehicle
WO2008061890A1 (en) * 2006-11-23 2008-05-29 Continental Automotive Gmbh Method for wireless communication between vehicles
US8886386B2 (en) 2006-11-23 2014-11-11 Continental Automotive Gmbh Method for wireless communication between vehicles
WO2008084280A1 (en) * 2007-01-08 2008-07-17 Sony Ericsson Mobile Communications Ab System and method for interactive broadcasting
US7826789B2 (en) 2007-01-08 2010-11-02 Sony Ericsson Mobile Communications Ab System and method for interactive broadcasting
WO2008110926A3 (en) * 2007-03-12 2008-11-27 Toyota Motor Co Ltd Road condition detecting system
US8362889B2 (en) 2007-03-12 2013-01-29 Toyota Jidosha Kabushiki Kaisha Road condition detecting system
JP2008225786A (en) * 2007-03-12 2008-09-25 Toyota Motor Corp Road condition detection system
WO2008110926A2 (en) 2007-03-12 2008-09-18 Toyota Jidosha Kabushiki Kaisha Road condition detecting system
WO2009121738A3 (en) * 2008-04-03 2010-02-25 Siemens Aktiengesellschaft Method and device for recognizing a risk of collision in mobile units inside an industrial area
DE102008041749A1 (en) 2008-09-01 2010-03-04 Robert Bosch Gmbh Method for initializing communication of e.g. data between cars, involves transmitting data between vehicles, where transmitted data are related to standard vehicle and are dependent on vehicle parameters and characteristics
US8779934B2 (en) 2009-06-12 2014-07-15 Safemine Ag Movable object proximity warning system
US9129509B2 (en) 2009-06-12 2015-09-08 Safemine Ag Movable object proximity warning system
US8994557B2 (en) 2009-12-11 2015-03-31 Safemine Ag Modular collision warning apparatus and method for operating the same
US8886394B2 (en) 2009-12-17 2014-11-11 Bae Systems Plc Producing data describing states of a plurality of targets
WO2011090417A1 (en) * 2010-01-19 2011-07-28 Volvo Technology Corporation Blind spot warning device and blind spot warning system
WO2011130861A1 (en) * 2010-04-19 2011-10-27 Safemine Ag Object proximity warning system and method
AU2010351500B2 (en) * 2010-04-19 2014-09-11 Safemine Ag Object proximity warning system and method
WO2011161176A1 (en) * 2010-06-23 2011-12-29 Continental Teves Ag & Co. Ohg Method and system for accelerated object recognition and/or accelerated object attribute recognition and use of said method
CN103080953A (en) * 2010-06-23 2013-05-01 Continental Teves Ag & Co. Ohg Method and system for accelerated object recognition and/or accelerated object attribute recognition and use of said method
CN102947870A (en) * 2010-06-23 2013-02-27 Continental Teves Ag & Co. Ohg Method and system for validating information
KR101942109B1 (en) * 2010-06-23 2019-04-11 Continental Teves Ag & Co. Ohg Method and system for validating information
US9393958B2 (en) 2010-06-23 2016-07-19 Continental Teves Ag & Co. Ohg Method and system for validating information
WO2011161177A1 (en) * 2010-06-23 2011-12-29 Continental Teves Ag & Co. Ohg Method and system for validating information
KR20130121816A (en) * 2010-06-23 2013-11-06 Continental Teves Ag & Co. Ohg Method and system for validating information
US9096228B2 (en) 2010-06-23 2015-08-04 Continental Teves Ag & Co. Ohg Method and system for accelerated object recognition and/or accelerated object attribute recognition and use of said method
CN104137164A (en) * 2012-02-25 2014-11-05 Audi Ag Method for identifying a vehicle in vehicle-to-vehicle communication
CN104137164B (en) * 2012-02-25 2016-03-02 Audi Ag Method for identifying a vehicle in vehicle-to-vehicle communication
US9165198B2 (en) 2012-02-25 2015-10-20 Audi Ag Method for identifying a vehicle during vehicle-to-vehicle communication
US8595037B1 (en) 2012-05-08 2013-11-26 Elwha Llc Systems and methods for insurance based on monitored characteristics of an autonomous drive mode selection system
US9000903B2 (en) 2012-07-09 2015-04-07 Elwha Llc Systems and methods for vehicle monitoring
US9558667B2 (en) 2012-07-09 2017-01-31 Elwha Llc Systems and methods for cooperative collision detection
WO2014011545A1 (en) * 2012-07-09 2014-01-16 Elwha Llc Systems and methods for cooperative collision detection
US9165469B2 (en) 2012-07-09 2015-10-20 Elwha Llc Systems and methods for coordinating sensor operation for collision detection
CN105210129A (en) * 2013-04-19 2015-12-30 Continental Teves Ag & Co. Ohg Method and system for preventing a following vehicle from driving up on a vehicle driving directly in front and use of the system
US9230442B2 (en) 2013-07-31 2016-01-05 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9269268B2 (en) 2013-07-31 2016-02-23 Elwha Llc Systems and methods for adaptive vehicle sensing systems
US9776632B2 (en) 2013-07-31 2017-10-03 Elwha Llc Systems and methods for adaptive vehicle sensing systems
EP2846172A1 (en) * 2013-09-09 2015-03-11 Nxp B.V. Warning system and method
CN104637344A (en) * 2013-11-11 2015-05-20 Wistron Corp Vehicle early warning system and vehicle early warning method
ITVR20130267A1 (en) * 2013-12-03 2015-06-04 Emanuele Donatelli Traffic accident prevention and control system
US10295662B2 (en) 2014-03-17 2019-05-21 Bae Systems Plc Producing data describing target measurements
FR3033539A1 (en) * 2015-03-09 2016-09-16 Peugeot Citroen Automobiles Sa Method and device for assisting with overtaking a vehicle in the presence of another, non-visible, oncoming vehicle
WO2016142603A1 (en) * 2015-03-09 2016-09-15 Peugeot Citroen Automobiles Sa Method and device for assisting with overtaking a vehicle in the presence of another, non-visible, oncoming vehicle
CN107769897A (en) * 2016-08-23 2018-03-06 Renesas Electronics Corp Communication device and retransmission control method
CN107769897B (en) * 2016-08-23 2022-02-08 Renesas Electronics Corp Communication device and retransmission control method
US10723366B2 (en) 2018-10-22 2020-07-28 Ebay Inc. Intervehicle communication and notification
US10703386B2 (en) 2018-10-22 2020-07-07 Ebay Inc. Intervehicle communication and notification
KR20210047333A (en) * 2018-10-22 2021-04-29 Ebay Inc. Vehicle-to-vehicle communication and notification
CN112955353A (en) * 2018-10-22 2021-06-11 Ebay Inc. Inter-vehicle communication and notification
JP2022502792A (en) * 2018-10-22 2022-01-11 Ebay Inc. Vehicle-to-vehicle communication and notification
WO2020086127A1 (en) * 2018-10-22 2020-04-30 Ebay Inc. Intervehicle communication and notification
JP7326438B2 (en) 2018-10-22 2023-08-15 Ebay Inc. Vehicle-to-vehicle communication and notifications
KR102621430B1 (en) * 2018-10-22 2024-01-09 Ebay Inc. Vehicle-to-vehicle communication and notification
CN112955353B (en) * 2018-10-22 2024-07-23 Ebay Inc. Inter-vehicle communication and notification
WO2020229077A1 (en) * 2019-05-13 2020-11-19 Volkswagen Aktiengesellschaft Warning about a hazard situation in road traffic
CN113785339A (en) * 2019-05-13 2021-12-10 Volkswagen Aktiengesellschaft Warning of dangerous situations in road traffic
US11790782B2 (en) 2019-05-13 2023-10-17 Volkswagen Aktiengesellschaft Warning about a hazardous situation in road traffic

Also Published As

Publication number Publication date
AU2002349794A8 (en) 2008-02-28
WO2003001474A3 (en) 2008-01-03
AU2002349794A1 (en) 2003-01-08

Similar Documents

Publication Publication Date Title
WO2003001474A2 (en) Method and apparatus for detecting possible collisions and transferring information between vehicles
US6615137B2 (en) Method and apparatus for transferring information between vehicles
EP3856595B1 (en) Automotive driver assistance
US11414073B2 (en) Automotive driver assistance
EP3856593B1 (en) Automotive driver assistance
US8941510B2 (en) Hazard warning system for vehicles
US6791471B2 (en) Communicating position information between vehicles
US11375351B2 (en) Method and system for communicating vehicle position information to an intelligent transportation system
US10462225B2 (en) Method and system for autonomously interfacing a vehicle electrical system of a legacy vehicle to an intelligent transportation system and vehicle diagnostic resources
US20150042491A1 (en) Hazard warning system for vehicles
WO2004047047A1 (en) Method and system for avoiding traffic collisions
CN108297880A (en) Distracted driver notification system
JP2022544533A (en) Systems for communicating dangerous vehicles and road conditions
CN103318086A (en) Automobile tailgating prevention and safe driving information exchange caution control system
JP2002123896A (en) Vehicle collision warning device
US10685563B2 (en) Apparatus, systems, and methods for detecting, alerting, and responding to an emergency vehicle
JPH1173595A (en) Method for forming traffic information and telematic device for vehicle
JP5025623B2 (en) Information providing apparatus and information providing method
JP4478330B2 (en) Equipment to improve traffic safety
US7407028B2 (en) Navigation-based safety restraint system and method
KR102742458B1 (en) On-vehicle device for driving guide
JP4748121B2 (en) Transportation support system, vehicle-mounted device, portable device and base station
US20090105901A1 (en) System for utilizing vehicle data and method of utilizing vehicle data
US20230422012A1 (en) Transmission of ecall information using intelligent infrastructure
JP2013050834A (en) Mobile communication device and travel support method

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ OM PH PL PT RO RU SD SE SG SI SK SL TJ TM TN TR TT TZ UA UG US UZ VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP

WWW Wipo information: withdrawn in national office

Country of ref document: JP