
WO2025012920A1 - System for monitoring a user in a vehicle and associated method - Google Patents

System for monitoring a user in a vehicle and associated method

Info

Publication number
WO2025012920A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
vehicle
processing unit
predefined
image sensors
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/IN2024/050377
Other languages
English (en)
Inventor
Sumeet Shekhar
Rajan Sippy
Naresh Adepu
Atharva KADETHANKAR
Manish Sharma
Siddapura NAGARAJU PRASHANTH
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
TVS Motor Co Ltd
Original Assignee
TVS Motor Co Ltd
Application filed by TVS Motor Co Ltd filed Critical TVS Motor Co Ltd
Publication of WO2025012920A1
Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/12 Details of acquisition arrangements; Constructional details thereof
    • G06V10/14 Optical characteristics of the device performing the acquisition or on the illumination arrangements
    • G06V10/141 Control of illumination

Definitions

  • the present invention relates to monitoring of a user in a vehicle. More particularly, the present invention relates to a system and a method for monitoring the user in the vehicle.
  • a saddle type vehicle faces technical challenges for implementing the user monitoring system.
  • the saddle type vehicle is more susceptible to vibrations and movements due to road conditions, which impacts the accuracy and reliability of the sensors and camera.
  • the saddle type vehicle is more susceptible to noise, which may impact the overall efficiency of the user monitoring system.
  • saddle type vehicles also have space constraints and hence cannot accommodate a bulky user monitoring system.
  • the automation also needs to allow the vehicle to make informed decisions on the safety of the user, in addition to overall improvement in user experience.
  • the user monitoring system is not present due to various constraints such as cost, technical challenges, space, and the like. Further, the existing systems do not allow the vehicle to intervene in the middle of the trip, which severely restricts user experience and has significant safety concerns.
  • the present invention relates to a system for monitoring a user in a vehicle.
  • the system has one or more image sensors for capturing real-time images of the user riding the vehicle.
  • the system further has a processing unit configured to receive the real-time images of the user from the one or more image sensors during riding condition of the vehicle.
  • the processing unit is further configured to determine one or more user activities based on the real-time images received from the one or more image sensors.
  • the processing unit is then configured to determine whether the one or more user activities correspond to a set of predefined user conditions.
  • the system further has a feedback module configured to receive an input from the processing unit if the one or more user activities correspond to the set of predefined user conditions.
  • the feedback module is then configured to generate an output command.
  • the output command generated by the feedback module includes at least one of an indication to the user or activating one or more rider assistance and comfort functions.
  • the processing unit has one or more modules, the one or more modules being configured to receive real time images of the user for determining the one or more user activities during riding condition of the vehicle.
  • the processing unit has an analysis module, the analysis module configured to determine one or more of a set of predefined user conditions based on the frequency of the one or more user activities during a predefined time, wherein the set of predefined user conditions includes one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
  • the system has an illumination sensor unit, the illumination sensor unit is in communication with the processing unit and is configured to detect a level of ambient light around the vehicle, and the processing unit is configured to switch on a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of ambient light.
  • the system has an auxiliary sensor unit, the auxiliary sensor unit is in communication with the processing unit and is configured to detect one or more vehicle parameters. The auxiliary sensor unit is configured to determine whether the one or more vehicle parameters are below a first predetermined threshold. The processing unit is further configured to switch off the one or more image sensors and the illumination sensor unit if the one or more vehicle parameters are below the first predetermined threshold.
  • the present invention relates to a method for monitoring a user in a vehicle.
  • the method has the steps of capturing, by one or more image sensors, real-time images of the user riding the vehicle; receiving, by a processing unit, the real-time images of the user riding the vehicle captured by the one or more image sensors; determining, by the processing unit, one or more user activities based on the real-time images received from the one or more image sensors; determining, by the processing unit, whether the one or more user activities correspond to a set of predefined user conditions; and generating, by a feedback module, an output command if the one or more user activities correspond to the set of predefined user conditions.
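  • As an illustration only, the method steps above can be pictured as a single monitoring loop. The Python sketch below is a minimal assumption about how the steps could be orchestrated; the `capture()`, `detect_activities()`, `match_conditions()`, and `generate_output_command()` interfaces are hypothetical placeholders, not part of the disclosure.

```python
# Hypothetical orchestration of the claimed method steps; all interfaces are
# illustrative placeholders, not the disclosed implementation.
def monitoring_loop(image_sensors, processing_unit, feedback_module):
    """Capture -> detect activities -> match predefined conditions -> feedback."""
    while True:
        # Capture real-time images of the user riding the vehicle.
        frames = [sensor.capture() for sensor in image_sensors]
        # The processing unit determines user activities from the frames.
        activities = processing_unit.detect_activities(frames)
        # Check the activities against the set of predefined user conditions.
        conditions = processing_unit.match_conditions(activities)
        # Generate an output command only if a predefined condition is met.
        if conditions:
            feedback_module.generate_output_command(conditions)
```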
  • the output command generated by the feedback module has at least one of an indication to the user or activating one or more rider assistance and comfort functions.
  • the one or more modules of the processing unit being configured to receive real time images of the user for determining the one or more user activities during riding condition of the vehicle.
  • the processing unit has an analysis module, the analysis module configured to determine one or more of a set of predefined user conditions based on the frequency of the one or more user activities during a predefined time, wherein the set of predefined user conditions includes one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
  • the method further has the steps of collecting a data set in relation to one or more activities of the user; processing the data set for further analysis by filtering and transforming the data set; annotating the data set; feeding the annotated data set into the one or more modules; and training the one or more modules to determine the one or more user activities based on the real-time images of the user.
  • the method further has the steps of detecting, by an illumination sensor unit, an ambient light around the vehicle; and switching on, by the processing unit, a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of ambient light.
  • the method further has the steps of detecting, by an auxiliary sensor unit, one or more vehicle parameters; determining, by the auxiliary sensor, whether the one or more vehicle parameters are below a first predetermined threshold; and switching off, by the processing unit, the one or more image sensors and the illumination sensor unit if the one or more vehicle parameters are below the first predetermined threshold.
  • Figure 1 illustrates a system for monitoring a user in a vehicle, in accordance with an embodiment of the present invention.
  • FIG. 2 illustrates the user activities and the predefined user conditions, in accordance with an embodiment of the invention.
  • Figure 3 illustrates the steps involved in the method for monitoring a user in a vehicle, in accordance with an embodiment of the invention.
  • Figure 4 illustrates the steps involved in the method for training the one or more modules for monitoring a user in a vehicle, in accordance with an embodiment of the invention.
  • FIG. 5 illustrates a process flow of system and method for monitoring the user in the vehicle, in accordance with an embodiment of the present invention.
  • Figure 6 illustrates a software architecture for the system and method for monitoring the user in the vehicle, in accordance with an embodiment of the present invention.
  • the present invention relates to monitoring of a user in a vehicle. More particularly, the present invention relates to a system and a method for monitoring a user in a vehicle.
  • the system and method of the present invention are typically used in a vehicle such as a two wheeled vehicle, or a three wheeled vehicle including trikes, or a four wheeled vehicle, or other multi-wheeled vehicles as required.
  • Figure 1 illustrates a system 100 for monitoring a user in a vehicle, in accordance with an embodiment of the present invention.
  • the monitoring of the user is done based on certain predefined conditions of the user in real-time during riding condition of the vehicle. These conditions are the static and dynamic characteristics of the user that define the behaviour or performance of the user when the vehicle is on the road and the user is riding the vehicle.
  • the predefined user conditions include but are not limited to the following:
  • Drowsiness: The user is monitored for fatigue and drowsiness. For example, the user may fall asleep while riding the vehicle or experience decreased alertness, leading to an increased risk of accident. Therefore, it becomes important to monitor the user in such situations.
  • Distraction: The user may be distracted while riding the vehicle, which may lead to road accidents. For example, users often engage in activities such as texting, browsing the internet, calling, and the like. Therefore, it becomes important to monitor the user in such distracted situations.
  • Impaired Riding: Users may sometimes ride the vehicle under the influence of drugs or alcohol. This reduces the overall riding skills of the user and may pose a significant risk to the safety of the user. Therefore, it becomes important to monitor the user in such erratic situations.
  • Medical Emergency: Users may experience sudden medical emergencies while riding the vehicle, such as a heart attack, seizure, and the like.
  • the user monitoring systems can help detect abnormal behaviour, vital signs, or other indicators of medical distress. Without such monitoring, the timely identification of a medical emergency becomes difficult, increasing the risk of accidents.
  • Performance Evaluation: User monitoring systems are essential for evaluating the performance and behaviour of the user. They can provide valuable data on aspects such as user pose, behaviour, and actions. This information can be utilized for training the user, identifying areas of improvement, and promoting safe riding habits.
  • the predefined user conditions are detected dynamically by the system 100 during the course of riding the vehicle by the user in real-time.
  • the system 100 has one or more image sensors 110 being configured for capturing real-time images of the user riding the vehicle.
  • the one or more image sensors 110 capture the user in real time and generate a live feed of the user in the form of images and videos for further processing.
  • the one or more image sensors 110 comprise one or more of, but are not limited to, a camera, a Red-Green-Blue camera, a Red-Green-Blue + Infrared camera, an Infrared camera, a monochrome camera, a thermal camera, a RADAR, and the like.
  • the camera is configured to capture the images and the videos of the user in real time.
  • the system 100 has a processing unit 120 that is configured to receive the real-time images of the user from the one or more image sensors 110.
  • the processing unit 120 is further configured to determine one or more user activities based on the real-time images received from the one or more image sensors 110.
  • the processing unit 120 includes one or more modules, the one or more modules being configured to determine the one or more user activities during riding condition of the vehicle based on the real time images.
  • the one or more modules comprise a plurality of Artificial Intelligence based models having machine learning and deep learning capabilities.
  • the one or more user activities correspond to one or more of: head movement of the user, lip movement of the user, yawn by the user, eye movement of the user, blinking by the user, hand position of the user, usage of a mobile device by the user, and sitting or standing by the user.
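  • For illustration, one widely used computer-vision cue for the eye movement and blinking activities listed above is the eye aspect ratio (EAR) computed from facial landmarks. The patent does not prescribe this particular measure, so the sketch below is an assumption about one plausible detector.

```python
import math

def eye_aspect_ratio(eye):
    """EAR from six (x, y) eye landmarks (p1..p6); it drops toward zero as the
    eye closes, so thresholding it over consecutive frames detects blinks.
    Illustrative cue only; not mandated by the disclosure."""
    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])  # |p2-p6| + |p3-p5|
    horizontal = dist(eye[0], eye[3])                       # |p1-p4|
    return vertical / (2.0 * horizontal)

def is_blink(ear_values, threshold=0.21, min_frames=2):
    """Flag a blink when EAR stays below the threshold for a few frames."""
    run = 0
    for ear in ear_values:
        run = run + 1 if ear < threshold else 0
        if run >= min_frames:
            return True
    return False
```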
  • the processing unit 120 is configured to determine whether the one or more user activities correspond to one or more of a set of predefined user conditions.
  • the processing unit 120 includes an analysis module 122.
  • the analysis module 122 is configured to determine one or more of the set of predefined user conditions, based on the frequency of the one or more user activities during the predefined time.
  • the predefined time ranges between 5 seconds and 5 minutes.
  • if the analysis module 122 determines that a user activity, say yawning, occurs more times than a threshold value within a span of 5 minutes, the analysis module 122 determines that this corresponds to one or more of the set of predefined user conditions.
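  • A minimal sketch of this sliding-window frequency logic follows; the window and threshold values are assumptions that would be calibrated per activity, within the 5 seconds to 5 minutes range above.

```python
from collections import deque
import time

class FrequencyAnalyzer:
    """Counts occurrences of one user activity within a sliding time window.
    Illustrative sketch of the analysis module's frequency check."""

    def __init__(self, window_seconds=300, threshold=5):
        self.window = window_seconds      # e.g. a 5-minute span
        self.threshold = threshold        # occurrences allowed in the window
        self.events = deque()             # timestamps of detected occurrences

    def record(self, timestamp=None):
        self.events.append(time.time() if timestamp is None else timestamp)

    def exceeds_threshold(self, now=None):
        now = time.time() if now is None else now
        # Discard events that have fallen out of the window.
        while self.events and now - self.events[0] > self.window:
            self.events.popleft()
        return len(self.events) > self.threshold
```

  • A `FrequencyAnalyzer(window_seconds=300, threshold=n)` fed with yawn detections would thus flag the activity once yawning occurs more than n times in five minutes.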
  • the set of predefined user conditions comprises one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
  • the analysis module 122 of the processing unit 120 determines whether the one or more user activities correspond to a set of predefined user conditions based on the frequency of the one or more user activities during the predefined time, the interdependency of activities, activity thresholds, data correlation, data dependency, and the like, to determine the drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
  • the processing unit 120 is configured to detect one or more of a head movement, yawn, lip movement, gaze, eye blink, hand pose, presence of mobile phone and seating & standing pose.
  • the inputs of the aforementioned user activities are received by the analysis module 122.
  • the analysis module 122 determines one or more of a set of predefined user conditions, based on the frequency of the one or more user activities during the predefined time. As illustrated, the analysis module 122 determines that the one or more user activities correspond to a fatigue condition based on the frequency of one or more of the following detected user activities within the predetermined time - yawn, gaze, eye blink, hand pose and seating & standing pose.
  • the analysis module 122 determines that the one or more user activities correspond to a distraction condition based on the frequency of one or more of the following detected user activities within the predetermined time - lip movement, gaze, presence of mobile phone, and head movement. Similarly, the analysis module 122 determines that the one or more user activities correspond to a drowsiness condition based on the frequency of one or more of the following detected user activities within the predetermined time - yawn, gaze, eye blink and head movement. Similarly, the analysis module 122 determines that the one or more user activities correspond to an abnormal driving condition based on the frequency of one or more of the following detected user activities within the predetermined time - presence of mobile phone, hand pose and seating & standing position. Further, the analysis module 122 determines that the one or more user activities correspond to a bad riding pose condition based on the frequency of one or more of the following detected user activities within the predetermined time - hand pose and seating & standing pose.
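  • The activity-to-condition mapping just described reduces to a lookup table; the sketch below mirrors the pairings summarised above and is illustrative only.

```python
# Predefined user conditions and the detected activities contributing to them,
# per the description of Figure 2 above.
CONDITION_ACTIVITIES = {
    "fatigue":         {"yawn", "gaze", "eye_blink", "hand_pose", "seating_standing_pose"},
    "distraction":     {"lip_movement", "gaze", "mobile_phone", "head_movement"},
    "drowsiness":      {"yawn", "gaze", "eye_blink", "head_movement"},
    "abnormal_riding": {"mobile_phone", "hand_pose", "seating_standing_pose"},
    "bad_riding_pose": {"hand_pose", "seating_standing_pose"},
}

def matched_conditions(flagged_activities):
    """Conditions with at least one contributing activity whose frequency
    exceeded its threshold within the predefined time window."""
    flagged = set(flagged_activities)
    return [cond for cond, acts in CONDITION_ACTIVITIES.items() if acts & flagged]
```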
  • a user ‘X’ is riding the vehicle.
  • the user ‘X’ yawns n number of times and blinks his eyes m number of times in five minutes.
  • the camera captures the images and videos of the user yawning and blinking the eyes in real-time.
  • the processing unit analyses the images and videos and determines that the user ‘X’ is in a fatigue condition. Based on the analysis, the feedback module is activated; it generates an indication to the user and also activates the rider assistance unit. Through the rider assistance unit, the speed of the vehicle is reduced and the seat of the vehicle is cooled down to provide comfort to the user ‘X’.
  • a user ‘Y’ is continuously riding the vehicle for five hours in cruise control mode.
  • the processing unit analyses the data based on geographical factors, road conditions, and climatic factors of the area. Based on the analysis, the feedback module is activated; it generates an indication to the user and also activates the rider assistance unit. Through the rider assistance unit, the speed of the vehicle is reduced to improve the safety of the user ‘Y’.
  • the system 100 has a feedback module 130.
  • the feedback module 130 is in communication with the processing unit 120.
  • the feedback module 130 is configured to receive an input from the processing unit 120 if the one or more user activities correspond to one or more of the set of predefined user conditions.
  • the feedback module 130 is configured to generate an output command if the one or more user activities correspond to one or more of the set of predefined user conditions.
  • the output command generated by the feedback module 130 includes at least one of an indication to the user or activating one or more rider assistance and comfort functions.
  • the one or more rider assistance and comfort functions comprises one or more of cooling of a seat of the vehicle, limiting a speed of the vehicle to a predetermined value of vehicle speed, adaptive cruise control functionality, and automatic braking.
  • the feedback module 130 generates an output command for cooling of the seat of the vehicle when a fatigue condition is detected by the processing unit 120.
  • the speed of the vehicle may be limited to 30% of the maximum speed limit in case an output command is generated by the feedback module 130.
  • the indication to the user comprises one or more of voice indication, video indication, haptic indication, display indication, and user activity report. In an embodiment, if the user is riding the vehicle for a predefined duration, for example more than five hours, the indication is also sent to the user based on the road condition, climate details, geographical location, and the like.
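  • How the feedback module could combine such indications with rider assistance and comfort functions is sketched below; the 30% speed cap is the example value given above, while the `vehicle` interface and its methods are hypothetical.

```python
def generate_output_command(condition, vehicle, max_speed_kmph):
    """Illustrative feedback logic; `vehicle` is a hypothetical interface."""
    # An indication is always given (voice, video, haptic, or display alert).
    vehicle.display_alert(f"Condition detected: {condition}")
    if condition == "fatigue":
        vehicle.cool_seat()                         # comfort function
        vehicle.limit_speed(0.30 * max_speed_kmph)  # e.g. 30% of maximum speed
    elif condition == "drowsiness":
        vehicle.haptic_alert()
        vehicle.limit_speed(0.30 * max_speed_kmph)
```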
  • the system 100 further includes an illumination sensor unit 140.
  • the illumination sensor unit 140 is in communication with the processing unit 120.
  • the illumination sensor unit 140 is configured to detect a level of ambient light around the vehicle.
  • the processing unit 120 is configured to switch on a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of ambient light. For example, the brightness of the instrument cluster is increased if the detected ambient light is below a predetermined threshold.
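  • A minimal sketch of this ambient-light gating, assuming a lux reading from the illumination sensor unit 140 and hypothetical lighting interfaces:

```python
def manage_illumination(ambient_lux, threshold_lux, vehicle):
    """Switch on the vehicle lighting system or illuminator unit, and raise the
    instrument cluster brightness, when ambient light is below the threshold.
    Interfaces and threshold value are illustrative assumptions."""
    if ambient_lux < threshold_lux:
        vehicle.lighting_system.switch_on()
        vehicle.instrument_cluster.set_brightness("high")
```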
  • the system 100 further includes an auxiliary sensor unit 150.
  • the auxiliary sensor unit 150 is in communication with the processing unit 120 and is configured to detect one or more vehicle parameters.
  • the auxiliary sensor unit 150 is configured to determine whether the one or more vehicle parameters are below a first predetermined threshold.
  • the one or more vehicle parameters comprise a state of charge of the battery of the vehicle.
  • the first predetermined threshold is a state of charge of the battery ranging between 15% and 20%.
  • the processing unit 120 is further configured to switch off the one or more image sensors 110 and the illumination sensor unit 140 if the one or more vehicle parameters are below the first predetermined threshold.
  • the vehicle parameters include, but are not limited to, the state of charge of a battery of the vehicle. Thus, the provision of the auxiliary sensor unit 150 prevents deep discharge of the battery.
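  • This deep-discharge protection reduces to a comparison against the first predetermined threshold; a sketch, assuming the 15-20% band noted above and hypothetical sensor interfaces:

```python
def manage_monitoring_power(battery_soc_percent, image_sensors,
                            illumination_sensor, soc_threshold_percent=20):
    """Switch off the image sensors 110 and the illumination sensor unit 140
    when the battery state of charge falls below the first predetermined
    threshold (15-20% per the description), preventing deep discharge."""
    if battery_soc_percent < soc_threshold_percent:
        for sensor in image_sensors:
            sensor.switch_off()
        illumination_sensor.switch_off()
```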
  • the present invention relates to a method 200 for monitoring a user in a vehicle.
  • the steps involved in the method 200 for monitoring the user in the vehicle are illustrated in Figure 3.
  • one or more image sensors 110 are activated.
  • the one or more image sensors 110 are activated when the vehicle is switched ON by the user. This activates all the sensors of the vehicle for monitoring the user during the riding condition in real-time.
  • the one or more image sensors 110 capture real-time images of the user riding the vehicle. Therefore, once the one or more image sensors 110 are activated, they capture the images and videos of the user in real time and send them to the processing unit 120 for further processing.
  • the processing unit 120 receives the real-time images of the user riding the vehicle captured by the one or more image sensors 110.
  • the processing unit 120 determines one or more user activities based on the real-time images received from the one or more image sensors 110.
  • the processing unit 120 includes one or more modules, the one or more modules being configured to determine the one or more user activities.
  • one or more modules of the processing unit 120 are configured to receive real-time images of the user for determining the one or more user activities during the riding condition of the vehicle.
  • the one or more user activities correspond to one or more of: head movement of the user, lip movement of the user, yawn by the user, eye movement of the user, blinking by the user, hand position of the user, usage of a mobile device by the user, and sitting or standing by the user.
  • the processing unit 120 determines whether the one or more user activities correspond to one or more of a set of predefined user conditions.
  • the processing unit 120 includes an analysis module 122.
  • the analysis module 122 is configured to determine one or more of the set of predefined user conditions, based on the frequency of the one or more user activities during the predefined time.
  • the predefined time ranges between 5 seconds and 5 minutes. For example, if the analysis module 122 determines that a user activity, say yawning, occurs more times than a threshold value within a span of 5 minutes, the analysis module 122 determines that this corresponds to one or more of the set of predefined user conditions.
  • the set of predefined user conditions comprises one or more of: drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
  • the analysis module 122 of the processing unit 120 determines whether the one or more user activities correspond to a set of predefined user conditions based on the frequency of the one or more user activities during the predefined time, the interdependency of activities, activity thresholds, data correlation, data dependency, and the like, to determine the drowsiness of the user, fatigue of the user, distraction of the user, abnormal riding of the vehicle, and bad posture of the user.
  • the feedback module 130 generates an output command if the one or more user activities correspond to one or more of the set of predefined user conditions.
  • the output command generated by the feedback module 130 includes at least one of an indication to the user or activating one or more rider assistance and comfort functions.
  • the one or more rider assistance and comfort functions comprises one or more of cooling of a seat of the vehicle, limiting a speed of the vehicle to a predetermined value of vehicle speed, adaptive cruise control functionality, and automatic braking.
  • the speed of the vehicle may be limited to a predefined percentage of the maximum speed limit in case an output command is generated by the feedback module 130.
  • the indication to the user comprises one or more of voice alert, video alert, haptic alert, display alert, and user activity report.
  • the indication is also sent to the user based on the road condition, climate details, geographical location, and the like for the safety of the user. For example, if the user is drowsy, then the feedback module generates an output command and sends an indication to the user.
  • the rider assistance unit is also activated which cools down the seat of the user and reduces the speed of the vehicle below a predetermined speed.
  • the method includes detecting, by an illumination sensor unit 140, an ambient light around the vehicle.
  • the method then includes switching on, by the processing unit 120, a vehicle lighting system or an illuminator unit if the ambient light is below a predetermined threshold value of the ambient light. This ensures the safe riding condition of the user in the real-time.
  • the method includes detecting, by an auxiliary sensor unit 150, one or more vehicle parameters; determining, by the auxiliary sensor unit 150, whether the one or more vehicle parameters are below a first predetermined threshold; and switching off, by the processing unit 120, the one or more image sensors 110 and the illumination sensor unit 140 if the one or more vehicle parameters are below the first predetermined threshold.
  • the vehicle parameters include, but are not limited to, an aggressiveness factor, which is indicative of variation in throttle input and other rider control inputs such as braking, clutch actuation, and the like; lean angle data indicative of the lean of the vehicle by the user; and the illumination around the vehicle.
  • the one or more vehicle parameters comprise a state of charge of the battery of the vehicle, and the first predetermined threshold is a state of charge of the battery ranging between 15% and 20%.
  • a data set is collected in relation to one or more activities of the user.
  • the inputs are received in relation to one or more activities of the user from at least one of an Artificial Intelligence model or one or more image sensors 110.
  • the one or more activities of the user correspond to head movement of the user, lip movement of the user, yawn by the user, eye movement of the user, blinking by the user, hand position of the user, usage of a mobile device by the user, and sitting or standing by the user.
  • a set of data related to the physical activity of the user is collected for training the one or more modules for making predictions and decisions.
  • the collected data set is processed for further analysis.
  • the processing is done by filtering and transforming of the data set.
  • the set of data is processed to prepare it for further analysis and modelling. This involves transforming and cleaning the data to improve its quality, reliability, and compatibility.
  • the data set is annotated.
  • the processed data is annotated, for example, by attaching specific attributes to the data points to provide context and meaning for easier processing.
  • the annotated data set is fed into the one or more modules.
  • the one or more modules are trained to determine the one or more user activities based on the real-time images of the user.
  • the one or more modules are evaluated.
  • the one or more modules are configured to determine the activities of the user and are then deployed in the processing unit 120.
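  • The collect-process-annotate-train-evaluate-deploy workflow above maps onto a standard supervised-learning pipeline. The scikit-learn sketch below is an assumption about one possible realisation; the patent does not specify the model family or tooling.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def train_activity_model(features, labels):
    """features: per-frame feature vectors derived from user images;
    labels: annotated user activities (e.g. 'yawn', 'eye_blink')."""
    # Split the annotated data set for training and evaluation.
    x_train, x_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, random_state=42)
    # Feed the annotated data into the model and train it.
    model = RandomForestClassifier(n_estimators=200)
    model.fit(x_train, y_train)
    # Evaluate the model before deploying it to the processing unit.
    print("validation accuracy:", accuracy_score(y_test, model.predict(x_test)))
    return model
```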
  • FIG. 5 illustrates a process flow of the present invention in accordance with an embodiment of the invention.
  • the one or more image sensors 110 capture real time images of the user.
  • the one or more image sensors 110 capture the real time images of the user as a video stream, which is then converted to video input, and this video input is then encoded for further processing.
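  • A minimal OpenCV sketch of pulling the camera stream frame by frame for downstream processing; the device index is an assumption, and the disclosure only states that the stream is encoded for further processing.

```python
import cv2  # OpenCV

def stream_frames(device_index=0):
    """Yield frames from the rider-facing camera as a real-time video input.
    Device index and capture settings are illustrative assumptions."""
    capture = cv2.VideoCapture(device_index)
    try:
        while True:
            ok, frame = capture.read()
            if not ok:
                break  # stream ended or the image sensor was switched off
            yield frame
    finally:
        capture.release()
```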
  • the processing unit 120 detects the one or more user activities based on the Artificial Intelligence model.
  • the processing unit 120 receives input from the image sensors 110 in relation to one or more of the following: a head movement detection, yawn detection, lip movement detection, seating & standing pose detection, hand pose detection, mobile detection, gaze detection and eye blink detection.
  • the analysis module 122 of the processing unit 120 determines whether the one or more user activities correspond to one or more of the set of predefined user conditions, which include fatigue detection, drowsiness detection, distraction detection, bad pose detection, and abnormal driving detection. Based on the inputs from the analysis module 122 of the processing unit 120, the feedback module 130 generates indications for the user using one or more of a voice alert functionality, a haptic alert functionality, a display alert functionality, or a user activity report. Further, the feedback module 130 generates an output command for seat cooling, limiting of vehicle speed (referred to as limp home mode), and limiting of vehicle autonomous functionalities.
  • FIG. 6 illustrates the software architecture in relation to the present invention.
  • the software architecture has a user activity analysis module 122.
  • the processing unit 120 comprises the plurality of modules including the analysis module 122.
  • the analysis module 122 of the processing unit 120 is operatively coupled to a plurality of microservices 176.
  • Microservices are libraries that form a part of the processing unit 120.
  • the processing unit 120 receives the real-time images from the one or more image sensors 110 through a hardware abstraction layer 172 and an operating system 170. Based on these inputs, the analysis module 122 receives inputs of one or more user activities from a user activity detection module 124.
  • the user activity is analysed by the analysis module 122 and the user activity interpretation module 123 determines whether the one or more user activities correspond to one or more of a set of predefined user conditions based on user activity frequency and time.
  • the processing unit 120 also has a debug module 178 for limiting chances of error in the functioning of the processing unit 120.
  • the present invention provides a system and a method for monitoring a user in a vehicle, wherein the system monitors a physical state of the user using one or more modules in real time, which enhances the overall user experience and the safety of the user. Further, the present invention allows for providing an accurate, efficient, and reliable system for monitoring the user according to the riding style of the user.
  • the present invention generates an indication to be sent to the user and therefore, enhances the safety of the user in real-time.
  • the indication is generated based on the user physical characteristics such as drowsiness detection, distraction detection, impaired riding detection, medical emergency detection, bad pose detection, and the like.
  • the present invention allows for providing the comfort to the user based on the user physical characteristics.
  • the present system is customised to generate the indication in real-time without the intervention of the user, and therefore increases the performance, handling, and market attractiveness of the vehicle. Further, the present invention allows the vehicle to intervene in the middle of a trip, which allows the vehicle to make better informed and correct decisions, thus enhancing the safety and monitoring of the user.
  • implementation of the system and method of the present invention is done in the real-time based on the activities of the user and the vehicle parameters, thus ensuring better safety and monitoring of the user in the vehicle.
  • the present system is cost- effective and reliable and hence, the system can be integrated with the vehicle for the safety of the user.
  • a computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
  • a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein.
  • the term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The present invention relates to a system (100) and a method (200) for monitoring a user in a vehicle. The system (100) comprises one or more image sensors (110) for capturing real-time images of the user riding the vehicle. The system (100) comprises a processing unit (120) configured to receive the real-time images of the user from the one or more image sensors (110) and to determine one or more user activities based on the real-time images received from the one or more image sensors (110). The processing unit (120) is configured to determine whether the one or more user activities correspond to a set of predefined user conditions. The system (100) comprises a feedback module (130) configured to receive an input from the processing unit if the one or more user activities correspond to the set of predefined user conditions. The feedback module (130) is configured to generate an output command.
PCT/IN2024/050377 2023-07-10 2024-04-11 System for monitoring a user in a vehicle and associated method Pending WO2025012920A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN202341046204 2023-07-10
IN202341046204 2023-07-10

Publications (1)

Publication Number Publication Date
WO2025012920A1 (fr) 2025-01-16

Family

ID=94214947

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IN2024/050377 Pending WO2025012920A1 (fr) 2023-07-10 2024-04-11 System for monitoring a user in a vehicle and associated method

Country Status (1)

Country Link
WO (1) WO2025012920A1 (fr)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180137375A1 (en) * 2015-07-17 2018-05-17 Hitachi Automotive Systems, Ltd. Onboard environment recognition device

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180137375A1 (en) * 2015-07-17 2018-05-17 Hitachi Automotive Systems, Ltd. Onboard environment recognition device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANSARI SHAHZEB, NAGHDY FAZEL, DU HAIPING, PAHNWAR YASMEEN NAZ: "Driver Mental Fatigue Detection Based on Head Posture Using New Modified reLU-BiLSTM Deep Neural Network", IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, IEEE, PISCATAWAY, NJ, USA, vol. 23, no. 8, 1 August 2022 (2022-08-01), Piscataway, NJ, USA , pages 10957 - 10969, XP093267390, ISSN: 1524-9050, DOI: 10.1109/TITS.2021.3098309 *

Similar Documents

Publication Publication Date Title
  • KR20200063193A Driving management method and system, vehicle-mounted intelligent system, electronic device, and medium
  • CN110588512A Dangerous driving identification and early-warning device, method and system
  • CN112220480A Driver state detection system based on millimetre-wave radar and camera fusion, and vehicle
Guria et al. Iot-enabled driver drowsiness detection using machine learning
Biju et al. Drowsy driver detection using two stage convolutional neural networks
  • WO2025012920A1 System for monitoring a user in a vehicle and associated method
Aboagye et al. Design and Development of Computer Vision-Based Driver Fatigue Detection and Alert System
Xie et al. Revolutionizing road safety: YOLOv8-powered driver fatigue detection
  • CN113312958B Order dispatching priority adjustment method and device based on driver state
Adarsh et al. Drowsiness detection system in real time based on behavioral characteristics of driver using machine learning approach
  • WO2025037348A1 System for monitoring a user in a vehicle and associated method
AU2021105935A4 (en) System for determining physiological condition of driver in autonomous driving and alarming the driver using machine learning model
Arya et al. Effective Driver Fatigue Management and Prevention Using Cloud-Integrated RNN Models
  • KR102476829B1 Drowsiness detection method using deep learning and drowsy-driving prevention system using the same
  • CN118163804A Vehicle control method, apparatus, device and medium based on multimodal perception
Adhikari Using visual and vehicular sensors for driver behavior analysis: A survey
Sudha et al. Real time driver fatigue surveillance system using machine learning
Rani et al. Driver snoozing system with infrared technology
Dutta et al. Highway Hypnosis Detection and Prevention System
Boucetta et al. Deep learning based driver’s fatigue detection framework
Chauhan et al. Driver Drowsiness Detection and Alarm System Using Deep Learning
  • CN120048073B Real-time driver fatigue state monitoring method based on multimodal biometric fusion
Adarsh et al. Drowsiness Detection System in Real Time
Poojan Pal et al. Focus Drive and Posture Monitoring System
Cauvery et al. Distracted Driving Detection With Object Detections as the Basis Using Deep Learning Algorithm

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24839036

Country of ref document: EP

Kind code of ref document: A1