WO2025099513A1 - A system and method of providing traffic visualization and customized driving support - Google Patents
A system and method of providing traffic visualization and customized driving support
- Publication number
- WO2025099513A1 (PCT/IB2024/060037)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- location
- elements
- processing unit
- integrated processing
- features
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/0104—Measuring and analyzing of parameters relative to traffic conditions
- G08G1/0108—Measuring and analyzing of parameters relative to traffic conditions based on the source of data
- G08G1/0112—Measuring and analyzing of parameters relative to traffic conditions based on the source of data from the vehicle, e.g. floating car data [FCD]
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G08G1/04—Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096708—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control
- G08G1/096716—Systems involving transmission of highway information, e.g. weather, speed limits where the received information might be used to generate an automatic action on the vehicle control where the received information does not generate an automatic action on the vehicle control
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0967—Systems involving transmission of highway information, e.g. weather, speed limits
- G08G1/096766—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission
- G08G1/096783—Systems involving transmission of highway information, e.g. weather, speed limits where the system is characterised by the origin of the information transmission where the origin of the information is a roadside individual element
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/0968—Systems involving transmission of navigation instructions to the vehicle
- G08G1/0969—Systems involving transmission of navigation instructions to the vehicle having a display in the form of a map
Definitions
- Fig. 2a illustrates the system of providing traffic visualization and customized driving support connected to a cloud platform.
- the visualization system 103 including the vision sensor 105, the environmental sensor 107 and the GPS unit 108 can be connected through the long range communication control 115 to a cloud platform 201.
- the cloud platform is a virtual on demand system for processing and analyzing data received from the visualization system 103.
- the cloud platform 201 can perform functionalities of the analysis system 109 and the integrated processing unit 117.
- the cloud platform 201 can perform identifying the features and parameters of environmental conditions impacting driving at the location and generate the elements based on the identified features and parameters.
- the cloud platform 201 can generate the first set of elements and the second set of elements in the same manner as the real-time monitoring unit 111 and the vehicle monitoring unit 113.
- the cloud platform 201 can create an audio and visual representation of the region by combining the first set of elements, and second set of elements and the visual data, similar to the integrated processing unit 117.
- the onboard user device 127 can be connected to the cloud platform 201 for accessing the audio representation and the visual representation.
- the cloud platform 201 can be connected to the analysis system 109 and the integrated processing unit 117 (not shown in figure) through the long-range communication control 115 for performing real-time monitoring and controlling of the analysis system 109 and the integrated processing unit 117.
- the cloud platform 201 is provided in addition to the analysis system 109 and the integrated processing unit 117 for improving scalability and efficiency.
- the cloud platform 201 can be updated with long term developments remotely for updating functioning of the system.
- the cloud platform 201 can store information including GPS locations, and can constitute a real-time geospatial database of the locations of the visualization systems 103, as illustrated in the sketch following this list.
- Fig. 2b illustrates the network of system of providing traffic visualization and customized driving support connected to a cloud platform.
- the plurality of visualization systems 103 provided at multiple locations in the region can be connected to the cloud platform 201 through the long-range communication control 115.
- the plurality of user devices 127 can be connected to the cloud platform 201 through the long-range communication control 115.
- Fig. 3 illustrates an onboard user device of the system of providing traffic visualization and customized driving support.
- the onboard user device 127 can include a controller 301, a communication interface 303, a display interface 305, and the GPS unit 131.
- the communication interface 303 can be provided for communicating with the integrated processing units 117 provided in the region through the long-range communication control 115.
- the communication interface 303 can provide compatibility of communication with the long-range communication control 115 including, but not limited to, LoRa, Bluetooth and Wi-Fi.
- the controller 301 can receive current location from the GPS unit 131 and transmit the location of the onboard user device 127 to the integrated processing unit 117 for processing customized graphical representation.
- the display interface 305 can be provided for displaying data received from the integrated processing unit 117 and transmitting feedback from user.
- the display interface 305 can be provided for connecting to the driver support system 129 through the communication interface 303.
- the driver support system 129 can include, but is not limited to, an application or a website.
- a live scenario 307 option, an upcoming scenario 309 option and a user interactive control 311 can be accessed in the driver support system 129.
- the live scenario 307 can provide a real-time customized graphical representation at the location of the onboard user device 127.
- the upcoming scenario 309 can provide a real-time customized graphical representation at a location ahead of the onboard user device 127.
- the user interactive control 311 can facilitate feedback from the user, wherein the feedback can include but not limited to alerts at the location.
- the onboard user device can project the graphical representation on a window panel for displaying.
- the onboard user device can include a HUD display with audio navigation prompts.
- Fig. 4 illustrates a flow chart of a method of providing traffic visualization and customized driving support.
- the method of providing traffic visualization and customized driving support includes the following steps. Initially, extracting 401 features from visual data of a location in a region, by a vision sensor. The visual data of the location is recorded by the vision sensor, and a plurality of vision sensors is provided at multiple locations in the region. Then, detecting parameters of environmental conditions at the location, by an environmental sensor.
- the environmental conditions can include but not limited to rain, temperature, fog, humidity, and wind.
- generating 403 elements capable of impacting driving at the location based on the features, by an artificial intelligence unit.
- generating the elements includes identifying features having an impact on driving at the location for generating the first set of elements based on the identified features, by a real-time monitoring unit, and identifying features of other vehicles having an impact on driving at the location for generating the second set of elements based on the identified features, by a vehicle monitoring unit.
- the method can include generating the elements capable of impacting driving at the location based on the features, by a cloud platform, with the first set of elements and the second set of elements generated in the same manner as by the real-time monitoring unit and the vehicle monitoring unit described above.
- the cloud platform can create a graphical representation using the visual data and the elements at the location.
- the graphical representation can include an audio representation and a visual representation. Thereby, the cloud platform can facilitate improving the scalability and efficiency of the system.
- a main advantage of the present disclosure is that the system and method provide traffic visualization and customized driving support.
- the system and method provide traffic visualization and customized driving support based on both real-time factors and predictive factors.
- the system and method generate a graphical representation for providing traffic visualization and customized driving support.
- Yet another advantage of the present disclosure is that the system and method provide customization of assistance in driving for each vehicle in a region.
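By way of a non-limiting illustration of the cloud-side processing and the real-time geospatial database referred to above, the Python sketch below ingests payloads from visualization systems into a simple in-memory, location-keyed store; the class names, field names and query shape are assumptions of this sketch, not the claimed implementation.

```python
import time

# Illustrative cloud-side sketch: payloads from visualization systems are ingested
# and kept in a simple in-memory, location-keyed store standing in for the
# "real-time geospatial database" described above. Names are assumptions.
class GeoStore:
    def __init__(self):
        self.stations = {}            # station_id -> latest record

    def ingest(self, payload: dict):
        self.stations[payload["station"]] = {
            "location": tuple(payload["location"]),
            "first_set": payload.get("first_set", []),
            "second_set": payload.get("second_set", []),
            "received_at": time.time(),
        }

    def stations_in_bbox(self, lat_min, lat_max, lon_min, lon_max):
        # Crude bounding-box query a cloud platform might run before building
        # a region-wide representation for a connected onboard device.
        return {sid: rec for sid, rec in self.stations.items()
                if lat_min <= rec["location"][0] <= lat_max
                and lon_min <= rec["location"][1] <= lon_max}

store = GeoStore()
store.ingest({"station": "VS-07", "location": [12.9716, 77.5946],
              "first_set": [{"kind": "pothole"}]})
print(store.stations_in_bbox(12.9, 13.0, 77.5, 77.7))
```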
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Atmospheric Sciences (AREA)
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- Traffic Control Systems (AREA)
Abstract
The present disclosure explains a system of providing traffic visualization and customized driving support. The system includes the following. A vision sensor (105) is provided to extract features from visual data, and an environmental sensor (107) is provided to detect parameters of environmental conditions at a location. An analysis system (109) generates elements, wherein a real-time monitoring unit (111) identifies features and parameters capable of causing an impact on driving for the first set of elements and a vehicle monitoring unit (113) identifies features of other vehicles capable of causing an impact on driving for the second set of elements. An integrated processing unit (117) is provided for creating a representation of the region by combining the elements and the visual data. An onboard user device (127) is connected to the integrated processing unit (117) for receiving a customized graphical representation of the region based on a location of the onboard user device (127).
Description
A SYSTEM AND METHOD OF PROVIDING TRAFFIC
VISUALIZATION AND CUSTOMIZED DRIVING SUPPORT
FIELD
[0001] The embodiments herein generally relate to providing visualization of traffic for providing driving support. More particularly, the disclosure relates to a system and method providing traffic visualization for customized driving support.
BACKGROUND AND PRIOR ART
[0002] Roadways are one of the major and convenient modes of facilitating transportation between any given geographical location, terrain and distance. Roads have been extensively constructed across difficult topographies enabling vehicles to travel and transport goods to any given remote location. Consequently, the number of vehicles plying on these roads at any point of time has increased significantly.
[0003] However, the increasing traffic on the road has resulted in a rise in the frequency of accidents. Difficult terrains, inexperienced drivers, driving without compliance to rules and other factors are few of the main causes of such incidents. Further, owing to the remote location, topography and terrain, it is not possible to manually supervise the traffic at all given times.
[0004] Conventionally, to prevent accidents and facilitate smooth driving on the roads, warning signs are put up in accident-prone areas, and traffic signs are installed to alert the driver of the roads ahead. However, these measures provide limited assistance in unpredictable situations such as increased traffic on narrow roads, haphazard driving of neighboring vehicles, environmental conditions, and other sudden changes in the terrain of the road. Hence, these disadvantages have constrained drivers from travelling across the existing network of roadways to its full extent.
[0005] Therefore, there is a need for a system to provide customized driving support to each vehicle for preventing accidents on roads. Moreover, there is a need for a system and method of providing visualization of traffic along with customized driving support.
OBJECTS
[0006] Some of the objects of the present disclosure are described herein below: [0007] The main objective of the present disclosure is to provide a system and method of providing traffic visualization and customized driving support.
[0008] Another objective of the present disclosure is to provide a system and method of providing both real-time driving support and predictive driving support for assisting driving of the vehicles and preventing accidents.
[0009] Still another objective of the present disclosure is to provide a system and method of providing traffic visualization through a graphical representation based on factors impacting driving at the location.
[00010] Yet another objective of the present disclosure is to provide a system and method of customizing driving support for each vehicle at a location based on relative factors.
[00011] The other objectives and advantages of the present disclosure will be apparent from the following description when read in conjunction with the accompanying drawings, which are incorporated for illustration of preferred embodiments of the present disclosure and are not intended to limit the scope thereof.
SUMMARY
[00012] In view of the foregoing, an embodiment herein provides a system and method of providing traffic visualization and customized driving support.
[00013] In accordance with an embodiment, the system of providing traffic visualization and customized driving support comprises the following. A vision sensor is provided for extracting features from visual data detected at a location in a region, wherein the features are capable of impacting driving at the location. An environmental sensor is provided for detecting parameters of environmental conditions at a location in a region. An analysis system is connected to the vision sensor and the environmental sensor for generating a first set of elements and a second set of elements based on the features and parameters. A real-time monitoring unit is provided in the analysis system for identifying features and parameters capable of causing an impact on driving at the location for generating the first set of elements. A vehicle monitoring unit is provided in the analysis system for identifying features of other vehicles capable of causing an impact on driving at the location for generating the second set of elements. An integrated processing unit is connected to a plurality of the analysis systems for creating an audio and visual representation of the region by combining the first set of elements and the second set of elements, and the visual data at a plurality of the locations. An onboard user device is connected to the integrated processing unit for receiving a customized graphical representation of the region based on a location of the onboard user device.
[00014] In an embodiment, the onboard user device transmits a location of the vehicle to the integrated processing unit, and the customized representation unit customizes the graphical representation based on a live location of the vehicle,
thereby providing graphical representation of a road at the location, ahead of the location and behind the location of the vehicle for providing driving support.
[00015] In an embodiment, the integrated processing unit is connected to other integrated processing units in the region through the long-range communication control. Each integrated processing unit transmits the audio and visual representation at a location to another integrated processing unit at another location in the region.
[00016] In an embodiment, the onboard user device is connected to the integrated processing unit through the long-range communication control for receiving the customized graphical representation, thereby facilitating providing driving support in regions independent of internet access.
[00017] In an embodiment, the features include characteristics of road including road conditions of potholes, speed breakers, zebra crossings, lane markings, traffic signs, traffic signals, land-slides, profile of road, driving compliance of vehicle, vehicle type including car, truck, bus, two-wheeler, traffic density, characteristics of vehicles including speed, direction of movement, and load on the vehicle, obstacles on road, traffic signs, and lane markings.
[00018] In an embodiment, the real-time monitoring unit generates the first set of elements based on identifying features such as characteristics of the road, obstacles on the road, traffic signs, and lane markings, and parameters such as rain and fog.
[00019] In an embodiment, the vehicle monitoring unit generates the second set of elements based on features such as driving compliance of the vehicle, vehicle type, traffic density, characteristics of vehicles, and load on the vehicle.
[00020] In an embodiment, the visual representation includes a map of the region based on the visual data, with the first set of elements and the second set of elements indicated on the map for providing driving support in the region.
[00021] In an embodiment, the analysis system indicates the first set of elements and the second set of elements using one or a combination of colour, symbol, text, image and audio for representing the identified features.
[00022] In an embodiment, the analysis system and the integrated processing unit constitute a microcontroller.
[00023] In an embodiment, the onboard user device includes, but is not limited to, a microcontroller, a GPS unit and a display interface. The display interface is provided for accessing a driver support system, and the driver support system is provided for displaying the customized graphical representation received from the integrated processing unit.
[00024] In accordance with an embodiment, the system can include a cloud platform connected to the vision sensor, the environmental sensor, and the GPS unit, and the onboard user device. The cloud platform can perform functionalities of the analysis system and the integrated processing unit for enhancing scalability and efficiency.
[00025] In accordance with an embodiment, a method of providing traffic visualization and customized driving support comprises the following steps. First, extracting features from visual data of a location in a region, by a vision sensor. Then, detecting parameters of environmental conditions at the location in the region, by an environmental sensor. Then, generating a first set of elements and a second set of elements capable of impacting driving at the location based on the features, by an analysis system. Then, transmitting the generated elements to an integrated processing unit, by the analysis system. Next, creating a graphical representation using the visual data and the elements at the location, by the integrated processing unit. Then, customizing the graphical representation based on a location received from a user device of a vehicle, by the integrated processing unit, and finally displaying the customized graphical representation for providing driving support, by the onboard user device.
[00026] In an embodiment, generating the elements includes identifying features and parameters having an impact on driving at the location for generating the first set of elements, by a real-time monitoring unit, and identifying features of other vehicles having an impact on driving at the location for generating the second set of elements, by a vehicle monitoring unit.
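By way of a non-limiting illustration only, the Python sketch below arranges the above method steps as a simple linear pipeline; the helper names, placeholder data and element categories are assumptions of this sketch and not part of the claimed method.

```python
# Non-limiting sketch: the claimed method steps arranged as a linear pipeline.
# Helper names and placeholder data are assumptions, not the patented design.

def extract_features(visual_data):
    # Step 1: a vision sensor extracts features from visual data at a location.
    return [{"kind": "pothole", "location": (12.9716, 77.5946)},
            {"kind": "vehicle", "location": (12.9717, 77.5948), "speed_kmph": 62}]

def detect_parameters():
    # Step 2: an environmental sensor reports parameters at the same location.
    return {"rain": True, "fog": False}

def generate_elements(features, parameters):
    # Step 3: the analysis system derives the first (road/environment) and
    # second (other-vehicle) sets of elements impacting driving.
    first = [f for f in features if f["kind"] != "vehicle"]
    second = [f for f in features if f["kind"] == "vehicle"]
    if parameters.get("rain"):
        first.append({"kind": "rain"})
    return first, second

def create_representation(visual_data, first, second):
    # Steps 4-5: the integrated processing unit combines elements and visual data.
    return {"map": visual_data, "elements": first + second}

def customize(representation, vehicle_location):
    # Step 6: the representation is customized around the reported vehicle location.
    return {**representation, "centered_on": vehicle_location}

if __name__ == "__main__":
    frame = "frame-at-hairpin-bend"   # placeholder for recorded visual data
    first, second = generate_elements(extract_features(frame), detect_parameters())
    view = customize(create_representation(frame, first, second), (12.9716, 77.5946))
    print(view)                        # Step 7: the onboard device displays this view
```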
[00027] These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
BRIEF DESCRIPTION OF DRAWINGS
[00028] The detailed description is set forth with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the
figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical items.
[00029] Fig. 1a illustrates a system of providing traffic visualization and customized driving support, according to an embodiment herein;
[00030] Fig. 1b illustrates a network of system of providing traffic visualization and customized driving support, according to an embodiment herein;
[00031] Fig. 2a illustrates the system of providing traffic visualization and customized driving support connected to a cloud platform, according to an embodiment herein;
[00032] Fig. 2b illustrates the network of system of providing traffic visualization and customized driving support connected to a cloud platform, according to an embodiment herein;
[00033] Fig. 3 illustrates an onboard user device of the system of providing traffic visualization and customized driving support, according to an embodiment herein; and Fig. 4 illustrates a flow chart of a method of providing traffic visualization and customized driving support, according to an embodiment herein.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[00034] The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments and detailed in the following description. Descriptions of well- known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein may be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
[00035] As mentioned above, there is a need for a system to provide customized driving support to each vehicle for preventing accidents on roads. Moreover, there
is a need for a system and method of providing visualization of traffic along with customized driving support. The embodiments herein achieve this by providing “A system and method of providing traffic visualization and customized driving support”. Referring now to the drawings and more particularly to Fig. 1a and Fig. 3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
[00036] Fig. 1a illustrates a system providing traffic visualization and customized driving support. The visualization and support system 101 can include a visualization system 103, a long-range communication control 115, an integrated processing unit 117 and a user device 127.
[00037] In an embodiment, the visualization system 103 can include a plurality of vision sensors 105, environmental sensors 107, and a plurality of analysis systems 109.
[00038] In an embodiment, vision sensor 105 and environmental sensor 107 are mounted at a location in a region. The location can be based on crucial maneuvers in the region.
[00039] In an embodiment, the vision sensor 105 can be provided for recording visual data at the location and extracting features from the visual data of the location. In an embodiment, the features can include but not limited to characteristics of road including road conditions of potholes, sharp curve, speed breakers, landslides, zebra crossings, and lane markings, traffic signs, traffic signals, profile of road, driving compliance of vehicle, characteristics of vehicles including type, speed, direction of movement, number plate of the vehicle, colour of the vehicle, and loading, obstacles on road, accident on road, number of vehicles, and weather condition. The vision sensor can identify and extract the features present in the visual data at the location. The extracted features of the visual data at the location can be transmitted to the analysis system 109.
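By way of a non-limiting illustration, the following sketch shows one plausible structure for the feature records a vision sensor could emit to the analysis system; the field names and example values are assumptions introduced here for clarity, not the patented data format.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a feature record the vision sensor 105 might emit for
# the analysis system 109; field names are assumptions, not the patented format.
@dataclass
class Feature:
    kind: str                       # e.g. "pothole", "sharp_curve", "vehicle"
    latitude: float                 # coordinates from the co-located GPS unit
    longitude: float
    confidence: float = 1.0         # detector confidence, if available
    attributes: dict = field(default_factory=dict)  # e.g. {"speed_kmph": 54, "type": "truck"}

def example_features() -> list:
    # Hand-written examples standing in for detector output on one frame.
    return [
        Feature("pothole", 12.9716, 77.5946, 0.91),
        Feature("vehicle", 12.9717, 77.5948, 0.88,
                {"type": "truck", "speed_kmph": 54, "direction": "north"}),
    ]

print(example_features())
```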
[00040] In an embodiment, the vision sensors can include but not limited to thermal, and HD imaging sub systems, and three-dimensional LIDAR systems.
[00041] In an embodiment, a three-dimensional capture system can be provided in the visualization system 103. The three-dimensional capture system can include VHF radios and light detection sub systems.
[00042] In an embodiment, the environmental sensor 107 can be provided for detecting parameters of environmental conditions at the location. The environmental conditions can include but not limited to rain, temperature, fog, humidity, wind. The detected parameters of the environmental conditions can be transmitted to the analysis system 109. In an embodiment, the environmental sensor 107 can include but not limited to temperature sensor, rain detecting sensor, fog detecting sensor, humidity sensor and wind sensor.
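As a non-limiting illustration, the sketch below reduces raw sensor readings to the kind of environmental parameters described above; the threshold values and field names are assumptions of this sketch.

```python
# Illustrative sketch: raw environmental readings reduced to the parameters
# (rain, fog, wind, etc.) forwarded to the analysis system; thresholds are assumptions.
def to_parameters(readings: dict) -> dict:
    return {
        "rain": readings.get("rain_mm_per_hr", 0.0) > 0.5,
        "fog": readings.get("visibility_m", 10_000) < 200,
        "high_wind": readings.get("wind_kmph", 0.0) > 60,
        "temperature_c": readings.get("temperature_c"),
        "humidity_pct": readings.get("humidity_pct"),
    }

print(to_parameters({"rain_mm_per_hr": 2.3, "visibility_m": 150, "wind_kmph": 12}))
```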
[00043] In an embodiment, the vision sensor 105 and the environmental sensor 107 can be mounted in an enclosure.
[00044] In an embodiment, the vision sensor 105 and the environmental sensor 107 can include a GPS unit 108. The vision sensor 105 can transmit coordinates of the location obtained from the GPS unit 108 with the visual data of the location to the analysis system 109. The environmental sensor 107 can transmit coordinates of the location obtained from the GPS unit 108 with the parameters of the environmental conditions at the location to the analysis system 109.
[00045] In an embodiment, the analysis system 109 can be configured for receiving the visual data with the extracted features from a plurality of vision sensors 105 at plurality of locations in the region and receiving the parameters of environmental conditions from a plurality of environmental sensors at plurality of locations in the region.
[00046] In an embodiment, the analysis system 109 can include, but is not limited to, a real-time monitoring unit 111 and an artificial intelligence unit 110. The artificial intelligence unit 110 can include a vehicle monitoring unit 113.
[00047] The analysis system 109 can identify the features and parameters of environmental conditions impacting driving at the location and generate the elements based on the identified features and parameters.
[00048] The real-time monitoring unit 111 can be configured for identifying features and parameters capable of causing an impact on driving at the location for generating a first set of elements based on the identified features and parameters. The real-time monitoring unit 111 can generate the first set of elements based on identification of the features including but not limited to road characteristics, obstacles, and parameters of environmental conditions including but not limited to rain and fog.
[00049] The vehicle monitoring unit 113 can be configured for identifying features of other vehicles having an impact on driving at the location for generating a second set of elements based on the identified features and parameters. The vehicle monitoring unit 113 can generate the second set of elements based on features including, but not limited to, driving compliance of the vehicle, vehicle characteristics, and traffic density.
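As a non-limiting illustration of the division of work between the real-time monitoring unit and the vehicle monitoring unit, the sketch below splits identified features into the first and second sets of elements; the category lists are assumptions drawn from the examples in this description.

```python
# Illustrative split of identified features into the two element sets described
# above; the category lists are assumptions drawn from the examples in this text.
ROAD_AND_ENVIRONMENT = {"pothole", "sharp_curve", "speed_breaker", "landslide",
                        "obstacle", "rain", "fog"}
VEHICLE_RELATED = {"vehicle", "overspeeding", "wrong_direction", "traffic_density"}

def generate_element_sets(features, parameters):
    first_set = [f for f in features if f["kind"] in ROAD_AND_ENVIRONMENT]
    second_set = [f for f in features if f["kind"] in VEHICLE_RELATED]
    # Environmental parameters contribute to the first set (real-time monitoring unit).
    for name in ("rain", "fog"):
        if parameters.get(name):
            first_set.append({"kind": name, "source": "environmental_sensor"})
    return first_set, second_set

first, second = generate_element_sets(
    [{"kind": "pothole"}, {"kind": "vehicle", "speed_kmph": 70}],
    {"rain": True, "fog": False},
)
print(first, second, sep="\n")
```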
[00050] In an embodiment, the first set of elements and the second set of elements can be indicated on the visual data by the real-time monitoring unit 111 and the vehicle monitoring unit 113 using one or a combination of colour, symbol, text, image and audio for representing the identified features.
[00051] In an embodiment, the first set of elements and the second set of elements generated by the plurality of the analysis systems 109 in the region can be transmitted to the integrated processing unit 117 through the long range communication control 115. In an embodiment, the long range communication
control 115 can include, but is not limited to, LTE (long term evolution) based wireless broadband radio, LoRa, Bluetooth, Wi-Fi, bidirectional LTE band radios, satellite radios and LoRaWAN. In an embodiment, the long range communication control 115 can be a long range low latency full duplex communication system.
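By way of a non-limiting illustration, the sketch below frames the element sets into a compact payload suitable for a bandwidth-constrained long-range link; the field names and the use of compressed JSON are assumptions of this sketch, not the claimed protocol.

```python
import json
import zlib

# Illustrative payload framing for sending element sets from an analysis system
# to an integrated processing unit over a bandwidth-constrained long-range link.
# Field names and the compression choice are assumptions, not the patented protocol.
def encode_payload(station_id: str, lat: float, lon: float,
                   first_set: list, second_set: list) -> bytes:
    message = {
        "station": station_id,
        "location": [lat, lon],
        "first_set": first_set,
        "second_set": second_set,
    }
    return zlib.compress(json.dumps(message, separators=(",", ":")).encode())

def decode_payload(frame: bytes) -> dict:
    return json.loads(zlib.decompress(frame).decode())

frame = encode_payload("VS-07", 12.9716, 77.5946,
                       [{"kind": "pothole"}], [{"kind": "vehicle", "speed_kmph": 70}])
print(len(frame), decode_payload(frame)["station"])
```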
[00052] In an embodiment, the system can include a bi-directional audio communication system. The audio communication system can include an operator device of an operator connecting to the user device 127 for facilitating communication. Thereby, the user can immediately connect and communicate to the operator for informing updates during emergency situations at the location.
[00053] In an embodiment, the integrated processing unit 117 and the analysis system 109 can include a microcontroller. The plurality of integrated processing units 117 can be connected to each other in the region through the long range communication control 115. In an embodiment, the integrated processing unit 117 can be connected to the onboard user device 127 in a vehicle 125.
[00054] In an embodiment, the integrated processing unit 117 can be configured for creating an audio and visual representation of the region by combining the first set of elements, and second set of elements and the visual data at plurality of the locations in the region. The integrated processing unit 117 can include but not limited to a customized representation unit 119. The customized representation unit 119 can include a visual processing unit 121 and an audio processing unit 123.
[00055] In an embodiment, the visual processing unit 121 can create a graphical representation from the visual data and the elements received from the plurality of analysis systems 109 in the region. The graphical representation can include, but is not limited to, a map. The first set of elements and the second set of
elements can be indicated in the graphical representation for providing driving support in the region.
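As a non-limiting illustration of indicating elements on a map, the sketch below attaches a colour and symbol to each element and places it on a map structure; the styling table and field names are assumptions introduced for clarity.

```python
# Illustrative composition of a map-style graphical representation: each element
# is placed on the map with a colour and symbol chosen by its kind. The styling
# table is an assumption used only to show the idea of indicating elements.
STYLE = {
    "pothole": ("red", "!"),
    "sharp_curve": ("orange", "~"),
    "vehicle": ("blue", ">"),
    "rain": ("grey", "//"),
}

def build_graphical_representation(region_map_id: str, elements: list) -> dict:
    markers = []
    for element in elements:
        colour, symbol = STYLE.get(element["kind"], ("black", "?"))
        markers.append({
            "kind": element["kind"],
            "location": element.get("location"),   # (lat, lon) if known
            "colour": colour,
            "symbol": symbol,
        })
    return {"map": region_map_id, "markers": markers}

print(build_graphical_representation(
    "region-map-tile-12", [{"kind": "pothole", "location": (12.9716, 77.5946)}]))
```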
[00056] In an embodiment, the audio processing unit 123 can create an audio representation based on the first set of elements and the second set of elements received from the plurality of analysis systems 109 in the region. The audio processing unit 123 can convert details of the first set of elements and the second set of elements, and coordinates of the first set of elements and the second set of elements into the audio representation. The audio representation can emit audio of coordinates of the first set of elements and the second set of elements with details of the first set of elements and the second set of elements for providing driving support in the region.
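As a non-limiting illustration, the sketch below converts elements and their coordinates into audio prompt text of the kind the audio processing unit could pass to a text-to-speech engine; the phrasing is an assumption of this sketch.

```python
# Illustrative conversion of elements into audio prompt text; an actual unit would
# feed these strings to a text-to-speech engine. Phrasing is an assumption.
def to_audio_prompts(elements: list) -> list:
    prompts = []
    for element in elements:
        lat, lon = element.get("location", (None, None)) or (None, None)
        where = f"near {lat:.4f}, {lon:.4f}" if lat is not None else "ahead"
        prompts.append(f"Caution: {element['kind'].replace('_', ' ')} {where}.")
    return prompts

for line in to_audio_prompts([{"kind": "sharp_curve", "location": (12.9716, 77.5946)},
                              {"kind": "pothole"}]):
    print(line)
```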
[00057] The onboard user device 127 can include but not limited to an electronic device, a smart phone. A GPS unit 131 can be present in the onboard user device 127. The GPS unit 131 can transmit a location of the onboard user device 127 in the vehicle 125 to the integrated processing unit 117.
[00058] In an embodiment, the integrated processing unit 117 can customize the visual representation and the audio representation based on the location of the onboard user device. The visual representation and the audio representation can be customized relative to the location of the onboard user device for providing driving support at the location of the onboard user device.
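By way of a non-limiting illustration, the sketch below customizes the set of elements relative to a reported vehicle position by keeping only nearby elements ordered by distance; the 2 km radius and field names are assumptions of this sketch.

```python
from math import radians, sin, cos, asin, sqrt

# Illustrative customization step: keep only elements within a radius of the
# reported vehicle position and order them by distance, so the onboard device
# sees the road situation around its own location. The 2 km radius is an assumption.
def haversine_km(a, b):
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(h))

def customize_for_vehicle(elements, vehicle_location, radius_km=2.0):
    nearby = [(haversine_km(vehicle_location, e["location"]), e)
              for e in elements if e.get("location")]
    return [e for d, e in sorted(nearby, key=lambda pair: pair[0]) if d <= radius_km]

elements = [{"kind": "pothole", "location": (12.9730, 77.5950)},
            {"kind": "landslide", "location": (13.2000, 77.9000)}]
print(customize_for_vehicle(elements, (12.9716, 77.5946)))
```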
[00059] In an embodiment, the integrated processing unit 117 can include a local storage for storing the created visual representation and the audio representation.
[00060] In an embodiment, the onboard user device can be connected to one of the plurality of integrated processing units 117 through the long range communication control for receiving the visual and audio representation. In an embodiment, the integrated processing units 117 facilitate transmitting of the visual and audio
representation to the onboard user devices 127 independent of internet availability at the remote locations.
[00061] In an embodiment, a driving assistance system 129 can be accessed using the onboard user device 127. The driving assistance system 129 can be accessed using the onboard user device 127 for displaying the visual representation and emitting the audio representation created by the customized representation unit 119.
[00062] In an embodiment, the driving assistance system 129 can be accessed through a QR code link. The QR code link can be provided at multiple locations in the region for facilitating linking of the onboard user device 127 to the integrated processing unit.
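As a non-limiting illustration, the sketch below generates such a QR code link using the third-party qrcode package; the URL shown is a made-up placeholder for whatever address an integrated processing unit exposes.

```python
import qrcode  # third-party packages: pip install qrcode pillow

# Illustrative generation of the QR code link mentioned above; the URL is a
# made-up placeholder, not an address defined by the present disclosure.
link = "http://ipu-17.local/driver-support"
image = qrcode.make(link)          # returns a PIL image of the QR code
image.save("driver_support_qr.png")
print("QR code for", link, "written to driver_support_qr.png")
```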
[00063] In an embodiment, the user can select one or both of the visual representation and the audio representation using the driving assistance system 129 in the onboard user device 127.
[00064] In an embodiment, the integrated processing unit can include communication protocols including but not limited to video streaming, dedicated data collection, IoT messaging and GIS positioning IT systems. The unit can be capable of handling variable message and data payloads.
[00065] In an embodiment, the integrated processing unit 117 and the visualization system 103 can be powered by a battery. In an embodiment, the battery can be recharged through a renewable source including but not limited to solar panel, windmill. The battery can include 24 hours of battery back-up with power management system.
[00066] In an embodiment, the enclosure of the visualization system 103 and the enclosure of the integrated processing unit 117 can include local security systems, such as anti-vandal and anti-theft protection sub-systems, for securing the equipment and preventing theft and/or vandalism. The enclosure can include a fire, short-circuit and malfunction protection suite to facilitate functioning in different weather conditions. The enclosure can be a rugged, military-grade, all-terrain enclosure.
[00067] In an exemplary embodiment, the vision sensor can identify, from the visual data, potholes, blockages or obstacles on the road, the number of vehicles, vehicle number plates, driving compliance of the vehicles including their speed and direction of movement, and pedestrians at the location. Based on these features, the artificial intelligence unit 110 can generate the first set of elements and the second set of elements. The first set of elements can include indications of dangerous potholes on the road ahead, sharp curves on the road ahead that could cause a probable collision based on the location of the onboard user device, and pedestrians at the location ahead. The second set of elements can include an alert on a probable collision, determined from the driving compliance and speed of vehicles on the road ahead, and the traffic density at the location derived from a thermal map. The identified first set of elements and second set of elements at the location can then be transmitted to the nearby integrated processing unit 117 connected through the long range communication control 115. The integrated processing unit 117 can create the graphical representation consisting of the visual data with the first set of elements and the second set of elements indicated on it. The graphical representation can be created on a map of the region, with the real-time visual representation placed according to the locations of the first set of elements and the second set of elements. In an embodiment, the graphical representation can include the profile of a bend in the road ahead, oncoming vehicular traffic, transit time updates, road impediments, live footage of the road ahead, audio prompts, geo-location based rules, statistics of the location, emergency contacts, and information on performing first aid.
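By way of a non-limiting illustration, the sketch below (with assumed detection labels and thresholds that are not taken from the disclosure) shows how detected features could be split into the first set of elements (road and environment hazards) and the second set of elements (behaviour of other vehicles):

```python
# Minimal sketch (assumed detection labels and thresholds): split features
# detected in the visual data into the first set of elements (road and
# environment hazards) and the second set (behaviour of other vehicles).
from typing import Dict, List, Tuple

ROAD_HAZARDS = {"pothole", "sharp_curve", "pedestrian", "obstacle"}  # assumed labels


def generate_elements(detections: List[Dict],
                      speed_limit_kmph: float = 60.0) -> Tuple[List[Dict], List[Dict]]:
    first_set, second_set = [], []
    for d in detections:
        if d["label"] in ROAD_HAZARDS:
            first_set.append({"detail": d["label"], "lat": d["lat"], "lon": d["lon"]})
        elif d["label"] == "vehicle":
            # Flag non-compliant driving: over the limit or against traffic flow
            if d.get("speed_kmph", 0) > speed_limit_kmph or d.get("wrong_direction"):
                second_set.append({"detail": "probable collision risk",
                                   "lat": d["lat"], "lon": d["lon"]})
    return first_set, second_set
```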
[00068] On receiving the location of the onboard user device 127, the integrated processing unit can transmit a customized graphical representation based on that location.
[00069] Fig. 1b illustrates a network of the system of providing traffic visualization and customized driving support. In an embodiment, the network can include visualization systems 103 placed at a plurality of locations in a region. The locations can be curves and/or other critical road sections of the region. Each visualization system can include a vision sensor and an environmental sensor with a GPS unit connected to the analysis system 109. The visualization system 103 can be mounted in an enclosure for protection from varying weather conditions.
[00070] The plurality of visualization systems 103 is connected to the integrated processing unit 117 through the long-range communication control 115. An integrated processing unit 117 is provided at each of several locations in the region, and each integrated processing unit 117 is connected to the plurality of visualization systems 103 nearby. In an embodiment, the integrated processing units 117 are connected to each other to facilitate transmission of the audio and visual representations.
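As a minimal sketch of this nearest-unit topology, assuming hypothetical site identifiers and coordinates (none are specified in the disclosure), the following code assigns each visualization system to its closest integrated processing unit:

```python
# Minimal sketch: assign each visualization system to the nearest
# integrated processing unit using a small-area flat-grid distance
# approximation (adequate for short roadside distances).
import math
from typing import Dict, List


def approx_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    # Treat latitude/longitude degrees as a flat grid near the sites
    dx = (lon2 - lon1) * 111.32 * math.cos(math.radians((lat1 + lat2) / 2))
    dy = (lat2 - lat1) * 110.57
    return math.hypot(dx, dy)


def assign_to_nearest_unit(visualization_sites: List[Dict],
                           processing_units: List[Dict]) -> Dict[str, List[str]]:
    assignment = {u["id"]: [] for u in processing_units}
    for site in visualization_sites:
        nearest = min(
            processing_units,
            key=lambda u: approx_km(site["lat"], site["lon"], u["lat"], u["lon"]),
        )
        assignment[nearest["id"]].append(site["id"])
    return assignment
```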
[00071] The plurality of onboard user devices 127 is connected to the integrated processing unit 117 through the long-range communication control 115 for accessing the driver support system 129. In an embodiment, the long-range communication control 115 can include, but is not limited to, cellular broadband technology and local area wireless technology systems.
[00072] Fig. 2a illustrates the system of providing traffic visualization and customized driving support connected to a cloud platform. The visualization system 103, including the vision sensor 105, the environmental sensor 107 and the GPS unit 108, can be connected through the long range communication control 115 to a cloud platform 201.
[00073] In an embodiment, the cloud platform is a virtual, on-demand system for processing and analyzing data received from the visualization system 103.
[00074] The cloud platform 201 can perform the functionalities of the analysis system 109 and the integrated processing unit 117. The cloud platform 201 can identify the features and the parameters of environmental conditions impacting driving at the location and generate the elements based on them. The first set of elements and the second set of elements can be generated in the same manner as by the real-time monitoring unit 111 and the vehicle monitoring unit 113. The cloud platform 201 can then create an audio and visual representation of the region by combining the first set of elements, the second set of elements and the visual data, similar to the integrated processing unit 117.
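A minimal, assumption-laden sketch of such a cloud-side handler is given below; the data shapes (a frame reference, pre-tagged detections, environment readings) are hypothetical and the element split is stubbed rather than taken from the disclosure:

```python
# Minimal sketch: a cloud-side handler mirroring the roadside pipeline --
# split uploaded detections into element sets and bundle them with the
# visual data reference into a representation onboard devices can fetch.
from typing import Dict, List


def cloud_process(frame_ref: str, detections: List[Dict],
                  environment: Dict) -> Dict:
    # The real element generation is sketched earlier; here it is stubbed
    # as a pass-through split on a pre-tagged 'set' field.
    first_set = [d for d in detections if d.get("set") == "first"]
    second_set = [d for d in detections if d.get("set") == "second"]
    return {
        "visual_data": frame_ref,        # e.g. object-store key of the frame
        "environment": environment,      # rain, fog, temperature, ...
        "first_set": first_set,
        "second_set": second_set,
    }
```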
[00075] In an embodiment, the onboard user device 127 can be connected to the cloud platform 201 for accessing the audio representation and the visual representation.
[00076] The cloud platform 201 can be connected to the analysis system 109 and the integrated processing unit 117 (not shown in figure) through the long-range communication control 115 for performing real-time monitoring and controlling of the analysis system 109 and the integrated processing unit 117.
[00077] The cloud platform 201 is provided in addition to the analysis system 109 and the integrated processing unit 117 for improving scalability and efficiency. The cloud platform 201 can be updated remotely with long-term developments to keep the functioning of the system up to date. In an embodiment, the cloud platform 201 can store information including GPS locations and constitute a real-time geospatial database of the locations of the visualization systems 103.
[00078] Fig. 2b illustrates the network of the system of providing traffic visualization and customized driving support connected to a cloud platform. In an embodiment, the plurality of visualization systems 103 provided at multiple locations in the region can be connected to the cloud platform 201 through the long-range communication control 115. In an embodiment, the plurality of user devices 127 can be connected to the cloud platform 201 through the long-range communication control 115.
[00079] Fig. 3 illustrates an onboard user device of the system of providing traffic visualization and customized driving support. In an embodiment, the onboard user device 127 can include a controller 301, a communication interface 303, a display interface 305, and the GPS unit 131.
[00080] The communication interface 303 can be provided for communicating with the integrated processing units 117 provided in the region through the long-range communication control 115. The communication interface 303 provides compatibility with the long-range communication control 115, including but not limited to LoRa, Bluetooth and Wi-Fi.
[00081] In an embodiment, the controller 301 can receive the current location from the GPS unit 131 and transmit the location of the onboard user device 127 to the integrated processing unit 117 for processing the customized graphical representation.
[00082] In an embodiment, the display interface 305 can be provided for displaying data received from the integrated processing unit 117 and transmitting feedback from the user. The display interface 305 can be provided for connecting to the driver support system 129 through the communication interface 303. The driver support system 129 can include, but is not limited to, an application or a website. A live scenario 307 option, an upcoming scenario 309 option and a user interactive control 311 can be accessed in the driver support system 129. The live scenario 307 can provide a real-time customized graphical representation at the location of the onboard user device 127. The upcoming scenario 309 can provide a real-time customized graphical representation for a location ahead of the onboard user device 127. The user interactive control 311 can facilitate feedback from the user, wherein the feedback can include, but is not limited to, alerts at the location.
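By way of a non-limiting illustration, the sketch below assumes an HTTP transport, a hypothetical endpoint on the integrated processing unit and hypothetical JSON fields (the disclosure leaves the transport open); it shows an onboard device loop that reports its GPS position and renders the live and upcoming scenarios it receives:

```python
# Minimal sketch: the onboard user device periodically reports its GPS
# position and renders the customized representation it gets back.
import time

import requests

UNIT_URL = "http://192.168.4.1"     # hypothetical integrated processing unit


def read_gps():
    # Placeholder for the GPS unit 131; returns a fixed point here
    return {"lat": 12.9716, "lon": 77.5946}


def run_driver_support(poll_seconds: int = 5) -> None:
    while True:
        location = read_gps()
        resp = requests.post(f"{UNIT_URL}/customized-representation",
                             json=location, timeout=3)
        rep = resp.json()
        # 'live' corresponds to the live scenario 307, 'upcoming' to 309
        print("LIVE:", rep.get("live"))
        print("UPCOMING:", rep.get("upcoming"))
        time.sleep(poll_seconds)
```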
[00083] In an embodiment, the onboard user device can project the graphical representation onto a window panel for display. The onboard user device can include a heads-up display (HUD) with audio navigation prompts.
[00084] Fig. 4 illustrates a flow chart of a method of providing traffic visualization and customized driving support. In an embodiment, the method includes the following steps. Initially, extracting 401 features from visual data of a location in a region, by a vision sensor. The visual data of the location is recorded by the vision sensor, and a plurality of vision sensors is provided at multiple locations in the region.
[00085] Next, detecting 402 parameters of environmental conditions at a location in the region, by an environmental sensor. In an embodiment, the environmental conditions can include, but are not limited to, rain, temperature, fog, humidity, and wind.
[00086] Then, generating 403 elements capable of impacting driving at the location based on the features, by an artificial intelligence unit. In an embodiment, generating the elements includes identifying features having an impact on driving at the location, for generating a first set of elements based on the identified features, by a real-time monitoring unit, and identifying features of other vehicles having an impact on driving at the location, for generating a second set of elements based on the identified features, by a vehicle monitoring unit.
[00087] Next, transmitting 404 the generated elements to an integrated processing unit using a long range communication controller, by the artificial intelligence unit.
[00088] Then, creating 405 a graphical representation using the visual data and the elements at the location, by the integrated processing unit. The graphical representation includes an audio representation and a visual representation.
[00089] Next, customizing 406 the graphical representation based on a location received from a user device of a vehicle, by the integrated processing unit.
[00090] Finally, displaying 407 the customized graphical representation for providing driving support on a driver support system, by the onboard user device.
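As a minimal end-to-end sketch of steps 401 to 407, with every stage stubbed by hypothetical placeholder logic (none of the helper names come from the disclosure), the flow can be strung together as plain function calls:

```python
# Minimal sketch tying steps 401-407 together; sensing, analysis and
# rendering are stubbed so the flow itself stays readable.
from typing import Dict, List


def extract_features(frame: str) -> List[Dict]:                   # step 401
    return [{"label": "pothole", "lat": 12.97, "lon": 77.59}]     # stubbed detection


def detect_parameters() -> Dict:                                  # step 402
    return {"rain": True, "fog": False}


def generate_elements(features: List[Dict], parameters: Dict):    # step 403
    # Environmental parameters are accepted but unused in this stub
    first = [f for f in features if f["label"] != "vehicle"]
    second = [f for f in features if f["label"] == "vehicle"]
    return first, second


def create_representation(frame: str, first: List[Dict],
                          second: List[Dict]) -> Dict:            # steps 404-405
    return {"visual": frame, "first_set": first, "second_set": second}


def customize(representation: Dict, device_location: Dict) -> Dict:  # step 406
    return {**representation, "device_location": device_location}


def display(customized: Dict) -> None:                            # step 407
    print(customized)


if __name__ == "__main__":
    frame = "frame_0001.jpg"                                      # hypothetical visual data
    first, second = generate_elements(extract_features(frame), detect_parameters())
    rep = create_representation(frame, first, second)
    display(customize(rep, {"lat": 12.9716, "lon": 77.5946}))
```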
[00091] In an embodiment, the method can include generating the elements capable of impacting driving at the location based on the features, by a cloud platform. In an embodiment, generating the elements includes identifying features having an impact on driving at the location, for generating the first set of elements based on the identified features, by a real-time monitoring unit, and identifying features of other vehicles having an impact on driving at the location, for generating the second set of elements based on the identified features, by a vehicle monitoring unit. The cloud platform can then create a graphical representation using the visual data and the elements at the location. The graphical representation can include an audio representation and a visual representation. Thereby, the cloud platform can facilitate improving the scalability and efficiency of the system.
[00094] A main advantage of the present disclosure is that the system and method provide traffic visualization and customized driving support.
[00095] Another advantage of the present disclosure is that the system and method provide traffic visualization and customized driving support based on both real-time factors and predictive factors.
[00096] Still another advantage of the present disclosure is that the system and method generate a graphical representation for providing traffic visualization and customized driving support.
[00097] Yet another advantage of the present disclosure is that the system and method provide customization of assistance in driving for each vehicle in a region.
[00098] The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.
Claims
1. A system of providing traffic visualization and customized driving support, comprising:
a vision sensor (105) extracting features from visual data detected at a location in a region, wherein the features are capable of impacting driving at the location;
an environmental sensor (107) detecting parameters of environmental conditions at a location in a region;
an analysis system (109) connected to the vision sensor (105) and the environmental sensor (107) for generating a first set of elements and a second set of elements based on the features and parameters;
wherein a real-time monitoring unit (111) is provided in the analysis system (109) for identifying features and parameters at the location capable of causing impact on driving, for generating the first set of elements;
wherein a vehicle monitoring unit (113) is provided in the analysis system (109) for identifying features of other vehicles capable of causing impact on driving at the location, for generating the second set of elements;
an integrated processing unit (117) connected to a plurality of the analysis systems (109) for creating an audio and visual representation of the region by combining the first set of elements, the second set of elements, and the visual data at a plurality of the locations; and
an onboard user device (127) connected to the integrated processing unit (117) for receiving a customized graphical representation of the region based on a location of the onboard user device (127).
2. The system as claimed in claim 1, wherein the onboard user device (127) transmitting a location of the vehicle to the integrated processing unit (117); and
the customized representation unit (119) customizing the graphical representation based on a live location of the vehicle, thereby providing graphical representation of a road at the location, ahead of the location and behind the location of the vehicle for providing driving support.
3. The system as claimed in claim 1, wherein the integrated processing unit (117) connected to other integrated processing units (117) in the region through the long range communication control (115); and wherein each integrated processing unit (117) transmitting the audio and visual representation at a location to another integrated processing unit (117) at another location in the region.
4. The system as claimed in claim 3, wherein the onboard user device (127) connected to the integrated processing unit (117) through the long-range communication control (115) for receiving the customized graphical representation, thereby facilitating the provision of driving support in regions independent of internet access.
5. The system as claimed in claim 1, wherein the features including characteristics of road including road conditions of potholes, speed breakers, zebra crossings, lane markings, traffic signs, traffic signals, land-slides, driving compliance of vehicle, profile of road, vehicle type including car, truck, bus, two-wheeler, traffic density, characteristics of vehicles including speed, direction of movement, and load on the vehicle, and obstacles on road.
6. The system as claimed in claim 5, wherein the real-time monitoring unit (111) generating the first set of elements based on identifying the features of characteristics of road, obstacles on road, traffic signs, and lane markings, and the parameters of rain and fog.
7. The system as claimed in claim 5, wherein the vehicle monitoring unit (113) generating second set of elements based on the features of driving compliance of vehicle, vehicle type, traffic density, characteristics of vehicles, and load on the vehicle.
8. The system as claimed in claim 1, wherein the visual representation including a map of the region based on the visual data; and the first set of elements and the second set of elements indicated on the map for providing driving support at the region.
9. The system as claimed in claim 8, wherein the analysis system (109) indicating the first set of elements and the second set of elements using one or a combination of colour, symbol, text, image and audio for representing the identified features.
10. The system as claimed in claim 1, wherein the analysis system (109) and the integrated processing unit (117) including a microcontroller.
11. The system as claimed in claim 1, wherein the onboard user device (127) including a microcontroller (301), a GPS unit (131) and a display interface (305); wherein the display interface (305) provided for accessing a driver support system (129); and the driver support system (129) displaying the customized graphical representation received from the integrated processing unit (117).
12. The system as claimed in claim 1, wherein a QR code facilitating accessing of the driver support system (129) on the onboard user device (127) through scanning.
13. The system as claimed in claim 1, wherein a cloud platform (201) connected to the vision sensor (105), the environmental sensor (107), the GPS unit (108), and the onboard user device (127); and
wherein the cloud platform (201) performing functionalities of the analysis system (109) and the integrated processing unit (117) for enhancing scalability and efficiency.
14. A method of providing traffic visualization and customized driving support, comprising the steps of:
extracting (401) features from visual data of a location in a region, by a vision sensor;
detecting (402) parameters of environmental conditions at a location in the region, by an environmental sensor;
generating (403) a first set of elements and a second set of elements capable of impacting driving at the location based on the features, by an analysis system;
transmitting (404) the generated elements to an integrated processing unit, by the analysis system;
creating (405) a graphical representation using the visual data and the elements at the location, by the integrated processing unit;
customizing (406) the graphical representation based on a location received from a user device of a vehicle, by the integrated processing unit; and
displaying (407) the customized graphical representation for providing driving support, by the onboard user device.
15. The method as claimed in claim 14, wherein generating the elements including: identifying features and parameters having impact on driving at the location for generating first set of elements, by a real-time monitoring unit; and identifying features of other vehicles having impact on driving at the location for generating second set of elements, by a vehicle monitoring unit.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN202341075848 | 2023-11-07 | | |
| IN202341075848 | 2023-11-07 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025099513A1 (en) | 2025-05-15 |
Family
ID=95695210
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IB2024/060037 (WO2025099513A1, pending) | A system and method of providing traffic visualization and customized driving support | 2023-11-07 | 2024-10-14 |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025099513A1 (en) |
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9191574B2 (en) * | 2001-07-31 | 2015-11-17 | Magna Electronics Inc. | Vehicular vision system |
| US9766625B2 (en) * | 2014-07-25 | 2017-09-19 | Here Global B.V. | Personalized driving of autonomously driven vehicles |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24888156; Country of ref document: EP; Kind code of ref document: A1 |