WO2025008832A1 - System for monitoring a wearable safety gear in a vehicle and associated method - Google Patents
- Publication number
- WO2025008832A1 (PCT/IN2024/050408)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- safety gear
- wearable safety
- processing unit
- vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
Definitions
- the present invention relates to monitoring of a wearable safety gear. More particularly, the present invention relates to a system and method for monitoring a wearable safety gear in a vehicle.
- wearable safety gears, such as helmets, are essential for users of two-wheeled vehicles such as motorcycles and scooters, or three-wheelers such as trikes. Helmets play a crucial role in protecting the head of the user and reducing the severity of injuries in the event of an accident. They are specifically designed to protect the head from impact and help minimize the risk of head injuries, including traumatic brain injuries, skull fractures, and concussions. Wearing a helmet significantly reduces the risk of fatalities in two-wheeler accidents; studies have shown that helmets can reduce the risk of death by up to 42% for motorcycle riders and 29% for bicyclists.
- apart from injury prevention in the case of accidents, helmets often come with visors or face shields that provide clear visibility and protect the rider's eyes from dust, debris, insects, wind, and harsh weather conditions. This improves the overall riding experience and reduces the risk of accidents caused by impaired vision.
- the existing systems only detect whether the user is wearing the helmet.
- the conventional systems fail to ensure that the rider is wearing the helmet properly, i.e. with the strap properly engaged in a locked condition and the helmet fully covering the head of the rider.
- the conventional systems also use a traditional image processing unit, which takes longer to process images and is not well suited to the required computational processes. This leads to a large time lag in detecting whether the helmet is being worn.
- the present invention relates to a system for monitoring a wearable safety gear in a vehicle.
- the system has one or more image sensors configured to capture real time images of a user riding the vehicle.
- the system has a processing unit configured to receive the real time images of the user from the one or more image sensors.
- the processing unit has one or more processing modules, wherein the one or more processing modules are configured to determine one or more conditions of a user in relation to the wearable safety gear based on the real time images of the user.
- the system has a feedback module configured to receive an input from the processing unit if any of the one or more conditions of the user in relation to the wearable safety gear is true, and the feedback module is configured to generate an output command.
- the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, and the user not wearing the wearable safety gear in a predetermined manner.
- the processing unit has a first processing module and a second processing module.
- the first processing module is configured to determine whether the user is wearing the wearable safety gear based on the real time images of the user.
- the second processing module is configured to determine whether the user is wearing the wearable safety gear in a predetermined manner based on the real time images of the user.
- the output command generated by the feedback module includes at least one of an indication to the user or limiting a vehicle operating parameter to a predetermined value of the vehicle operating parameter.
- the feedback module is configured to generate the output command for limiting the vehicle operating parameter to the predetermined value after the alert has been communicated to the user for a period of predetermined time.
- the wearable safety gear has a quality helmet, and the predetermined manner of wearing the wearable safety gear includes the helmet being effectively positioned.
- the first processing module has a first computational model and the second processing module has a second computational model.
- the first computational model is trained based on collected data in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations.
- the second computational model is trained based on collected data in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear.
- the first processing module and the second processing module receive input from one or more vehicle sensors in relation to a geographical location, a geographical condition, and one or more climatic conditions and determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner.
- the system has an illumination sensor unit.
- the illumination sensor unit is in communication with the processing unit and being configured to detect a level of ambient light around the vehicle, and the processing unit is configured to switch on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light.
- the system has an auxiliary sensor unit.
- the auxiliary sensor unit is in communication with the processing unit and is configured to detect one or more vehicle parameters.
- the processing unit is configured to: determine whether the one or more vehicle parameters are below a first predetermined threshold; and switch off the one or more image sensors and the illumination sensor unit or switch off the system, if the one or more vehicle parameters are below the first predetermined threshold.
- the processing unit has a vision processing unit in communication with the first processing module and the second processing module.
- the vision processing unit is configured to receive inputs from hardware through an operating system and a hardware abstraction layer.
- the present invention relates to a method for monitoring a wearable safety gear in a vehicle.
- the method has the steps of: capturing real time images of a user riding the vehicle; receiving real time images of the user riding the vehicle captured by the one or more image sensors; determining one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user; receiving an input from the processing unit if any one of the one or more conditions of the user in relation to the wearable safety gear is true; and generating an output command.
- the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, and the user not wearing the wearable safety gear in a predetermined manner.
- the method has the steps of: determining whether the user is wearing the wearable safety gear based on the real time images of the user; and determining whether the user is wearing the wearable safety gear in a predetermined manner based on the real time images of the user.
- the output command generated by the feedback module includes at least one of an indication to the user or limiting a vehicle operating parameter to a predetermined value of the vehicle operating parameter.
- an output command for limiting the vehicle operating parameter to the predetermined value is generated after the indication has been communicated to the user for a period of predetermined time.
- the wearable safety gear has a helmet of quality, shape and size prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear includes the helmet being effectively positioned.
- the method has the steps of: detecting an ambient light around the vehicle; and switching on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light.
- the method has the steps of: detecting one or more vehicle parameters; determining whether the one or more vehicle parameters are below a first predetermined threshold; and switching off the one or more image sensors and the illumination sensor unit or the system, if the one or more vehicle parameters are below the first predetermined threshold.
- the present invention relates to a method for training one or more computational models for monitoring a wearable safety gear.
- the method has the steps of: collecting, by a processing unit, a data set in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations, and in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear; processing the data set for further analysis by filtering and transforming of the data set; annotating the data set; feeding the annotated data set into a first computational model and a second computational model; training the first computational model to determine whether the user is wearing the wearable safety gear; and training the second computational model to determine whether the user is wearing the wearable safety gear in a predetermined manner.
- Figure 1 illustrates a system for monitoring a wearable safety gear in a vehicle, in accordance with an embodiment of the present invention.
- Figure 2 illustrates a method for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
- Figure 3A illustrates a method for training one or more computational models for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
- Figure 3B illustrates a method for training one or more computational models for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
- Figure 4 illustrates a process flow of system and method for monitoring a wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
- Figure 5 illustrates a software architecture for the system and method for monitoring the wearable safety gear in the vehicle, in accordance with an embodiment of the present invention.
- the present invention relates to monitoring of a wearable safety gear. More particularly, the present invention relates to a system and method for monitoring a wearable safety gear in a vehicle.
- the system and method of the present invention are typically used in a vehicle such as a two wheeled vehicle, or a three wheeled vehicle including trikes, or a four wheeled vehicle, or other multi-wheeled vehicles as required.
- FIG. 1 illustrates a system 100 for monitoring a wearable safety gear in a vehicle.
- the system 100 comprises one or more image sensors 110.
- the one or more image sensors 110 are configured to capture real time images of a user riding the vehicle.
- the one or more image sensors 110 capture a series of real time images, or video feed or live feed of the user riding the vehicle.
- the real time images of the user are captured as soon as the vehicle is in a riding condition.
- the real time images, or video feed or live feed are a series of individual image frames, which can be analysed for monitoring the wearable safety gear.
- the one or more image sensors 110 comprise one or more of a camera, a Red-Green-Blue wavelength camera, a Red-Green-Blue-Infrared wavelength camera, an Infrared camera, a Monochrome camera, a Thermal camera, a Radio Detection and Ranging camera, a Light Detection and Ranging camera, or a Time-of-Flight camera.
- the system 100 further comprises a processing unit 120.
- the processing unit 120 is configured to receive the real time images of the user from the one or more image sensors 110. Further, the processing unit 120 has one or more processing modules. The one or more processing modules of the processing unit 120 are configured to determine one or more conditions of a user in relation to the wearable safety gear based on the real time images of the user. In an embodiment, the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, the wearable safety gear not being in compliance with the bylaws of the geographical location, and the user not wearing the wearable safety gear in a predetermined manner. Thus, the processing unit 120 is configured not only to determine whether the user is wearing the wearable safety gear, but also whether the user is wearing the wearable safety gear in a predetermined manner and whether the wearable safety gear is of optimum quality.
- the processing unit 120 comprises a first processing module 122 and a second processing module 124.
- the first processing module 122 and the second processing module 124 are Artificial Intelligence Modules equipped with machine learning and/or deep machine learning models.
- the first processing module 122 is configured to determine whether the user is wearing the wearable safety gear as per the bylaws of the geographical location, based on the real time images of the user. If the first processing module 122 determines that the user is wearing the wearable safety gear, the second processing module 124 is configured to determine whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user.
- the first processing module 122 of the processing unit 120 analyses one or more frames from the real time images or live feed to determine whether the user is wearing the safety gear. If, in some frame, the first processing module 122 determines that the user is wearing the wearable safety gear, the said frame is analysed by the second processing module 124 to determine whether the user is wearing the wearable safety gear in the predetermined manner.
- as illustrated in Figure 1, the system further has a feedback module 130. If the processing unit 120 detects or determines that any one of the one or more user conditions is true, the feedback module 130 receives an input from the processing unit 120. Whenever the feedback module 130 receives an input from the processing unit 120, the feedback module 130 generates an output command. Based on the output command, an appropriate action is taken.
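The two-stage, per-frame check described above can be sketched as follows. This is a minimal illustration, not the patented implementation: `detect_gear` and `detect_proper_wear` are hypothetical callables standing in for the first processing module 122 and the second processing module 124.

```python
from dataclasses import dataclass

@dataclass
class FrameResult:
    gear_worn: bool        # verdict of the first processing module (stand-in)
    worn_properly: bool    # verdict of the second processing module (stand-in)

def check_frame(frame, detect_gear, detect_proper_wear):
    """Two-stage check on a single frame: the second module runs only
    on frames where the first module has found the safety gear."""
    if not detect_gear(frame):
        # No gear detected: the second module is never invoked for this frame.
        return FrameResult(gear_worn=False, worn_properly=False)
    return FrameResult(gear_worn=True, worn_properly=detect_proper_wear(frame))

def needs_feedback(result):
    """The feedback module receives an input if any condition is true."""
    return not (result.gear_worn and result.worn_properly)
```

In this sketch, the gating mirrors the text: a frame reaches the second stage only after the first stage reports the gear as worn.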
- the output command generated by the feedback module 130 comprises at least one of an indication to the user or limiting a vehicle operating parameter to a predetermined value of the vehicle operating parameter.
- if the processing unit 120 determines that the user is not wearing the wearable safety gear, or that the wearable safety gear is not in compliance with the bylaws of the geographical location, or that the user is not wearing the wearable safety gear in the predetermined manner, the feedback module 130 generates the output command based on which the user is alerted by an indication.
- the bylaws of the geographical location may prescribe a specific shape, size, configuration and quality of the wearable safety gear.
- the feedback module 130 is also capable of generating the output command based on which the vehicle operating parameter is limited, for example speed of the vehicle being limited to the predetermined vehicle speed.
- the predetermined vehicle speed is 30% of the maximum vehicle speed. Restricting the speed of the vehicle not only reduces the chances of vehicle accidents, but also prevents the vehicle from coming to an abrupt stop if the user removes the wearable safety gear during riding or tampers with the manner in which they are wearing it. Avoiding an abrupt stop also reduces the chances of unforeseen accidents. Further, limiting the speed reduces the severity of accidents, if any.
- the feedback module 130 is configured to generate the output command for limiting the vehicle operating parameter to the predetermined value after the indication has been communicated to the user for a period of predetermined time.
- the predetermined time ranges between 10 seconds and 5 minutes. If, even after the predetermined time of the indication being generated, the processing unit 120 determines that the user is not wearing the wearable safety gear or is not wearing it in the predetermined manner, the speed of the vehicle is limited based on the output command from the feedback module 130.
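The escalation above (indicate first, then cap speed) can be sketched as below. The 30% speed cap comes from the description; `alert_window_s=30.0` is an assumed value chosen from the stated 10 second to 5 minute range, and the function name is illustrative only.

```python
def feedback_output(violation_active, violation_duration_s,
                    max_speed_kmph, alert_window_s=30.0):
    """Escalation sketch: indicate to the user first; once the indication
    has persisted past the predetermined window, also cap the vehicle
    speed at 30% of the maximum."""
    if not violation_active:
        return {"indicate": False, "speed_cap_kmph": None}
    if violation_duration_s < alert_window_s:
        # Within the predetermined time: alert only, no speed limitation.
        return {"indicate": True, "speed_cap_kmph": None}
    # Indication has persisted past the window: limit the operating parameter.
    return {"indicate": True, "speed_cap_kmph": 0.30 * max_speed_kmph}
```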
- the wearable safety gear comprises a helmet of quality, shape and size prescribed by the bylaws of the geographical location.
- the helmet means any helmet which complies with the bylaws and safety regulations of a region in which the vehicle is being used.
- the predetermined manner of wearing the wearable safety gear comprises the helmet being effectively positioned.
- the helmet being effectively positioned means a strap of the helmet being in an engaged condition.
- the system 100 is configured to determine whether the user is wearing the helmet, as well as if the user is wearing the helmet with the strap of the helmet being engaged.
- if the user is not wearing the quality helmet, or is wearing the helmet but has not engaged or locked the strap, the processing unit 120 generates an input for the feedback module 130, and the feedback module 130 then generates an output command for the alert to the user and limitation of vehicle speed.
- the first processing module 122 comprises a first computational model and the second processing module 124 comprises a second computational model.
- the first computational model is an artificial intelligence based model having machine learning and deep machine learning capabilities.
- the first computational model is trained based on collected data in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations.
- the second computational model is an artificial intelligence based model having machine learning and deep machine learning capabilities and the second computational model is trained based on collected data in respect of different shapes of wearable safety gears and effective positioning of the wearable safety gear.
- the first processing module 122 and the second processing module 124 receive input from one or more vehicle sensors in relation to a geographical location, a geographical condition, and one or more climatic conditions and determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner.
- the collected data allows training of the first computational model and the second computational model based on the local laws, regulations and practices of that region, and different ways in which the wearable safety gear is worn. For example, if in a local region according to the local regulations, wearing of a wearable safety gear is not mandatory, the processing unit 120 will not generate an indication in accordance with the local bylaws and regulations.
- the system 100 is also configured to create a log record of number of times and the duration for which the user has ridden the vehicle without the wearable safety gear being worn in the predetermined manner.
- the log can be sent to the user through an application or any other web service.
- the system further comprises an illumination sensor unit 140.
- the illumination sensor unit 140 is in communication with the processing unit 120.
- the illumination sensor unit 140 is configured to detect a level of ambient light around the vehicle.
- the processing unit 120 is configured to switch on a vehicle lighting system, or any dedicated light, if the ambient light is below a predetermined threshold value of ambient light. For example, during riding conditions such as overcast or night-time riding when the ambient light is low, the processing unit 120 switches on the vehicle lighting system, such as a bulb, or increases the brightness of the display cluster, to increase the ambient light around the user and ensure that the real time images are captured appropriately.
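The light-dependent behaviour above can be sketched as a simple decision function. The text only specifies "a predetermined threshold value of ambient light"; the 50 lux threshold and the brightness values here are assumptions for illustration.

```python
def lighting_command(ambient_lux, threshold_lux=50.0):
    """Decide lighting actions from the ambient light level reported by
    the illumination sensor unit (threshold value assumed)."""
    low_light = ambient_lux < threshold_lux
    return {
        "vehicle_lighting_on": low_light,                  # e.g. switch on a bulb
        "cluster_brightness": 1.0 if low_light else 0.5,   # raise display brightness
    }
```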
- the system 100 further has an auxiliary sensor unit 150.
- the auxiliary sensor unit 150 is in communication with the processing unit 120.
- the auxiliary sensor unit 150 is configured to detect one or more vehicle parameters.
- the processing unit 120 receives the one or more vehicle parameters from the auxiliary sensor unit 150 and is configured to determine whether the one or more vehicle parameters are below a first predetermined threshold. If the processing unit 120 determines that the one or more vehicle parameters are below the predetermined threshold, the processing unit 120 is configured to switch off the one or more image sensors 110 or the illumination sensor unit 140 or switch off the system 100.
- the one or more vehicle parameters comprise a State of Charge (SOC) of a battery of the vehicle, and the predetermined threshold of the SOC of the battery is 15-20%. If, based on the input from the auxiliary sensor unit 150, the processing unit 120 determines that the SOC of the battery is lower than 15-20%, the processing unit 120 switches off the system 100, or switches off the one or more image sensors 110 or the illumination sensor unit 140. Such switching off prevents deep discharging of the battery.
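The SOC-based shutdown can be sketched as follows. The description gives a 15-20% band for the threshold; 15% is picked here as an assumed concrete value, and the returned flags are illustrative names.

```python
def power_management(soc_percent, soc_threshold=15.0):
    """Below the SOC threshold, switch off the image sensors and the
    illumination sensor unit to prevent deep discharge of the battery.
    The 15% threshold is an assumption within the stated 15-20% band."""
    low_battery = soc_percent < soc_threshold
    return {
        "image_sensors_on": not low_battery,
        "illumination_sensor_on": not low_battery,
    }
```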
- the present invention provides a method 200 for monitoring a wearable safety gear in a vehicle.
- Figure 2 illustrates the method 200 for monitoring the wearable safety gear in a vehicle.
- the one or more image sensors 110 are activated.
- the one or more image sensors 110 are activated as soon as the vehicle is started and remain activated during vehicle riding conditions.
- real time images of the user of the vehicle are captured by the one or more image sensors 110.
- the real time images of the user of the vehicle captured by the one or more image sensors 110 are received by the processing unit 120.
- a series of real time images, or video feed or live feed of the user riding the vehicle is captured by the one or more image sensors 110 and received by the processing unit 120.
- the real time images, or video feed or live feed are a series of individual image frames, which can be analysed for monitoring the wearable safety gear.
- the method has the step of determining one or more conditions of the user in relation to the wearable safety gear based on the real time images of the user.
- the one or more conditions of the user in relation to the wearable safety gear comprise the user not wearing the wearable safety gear, the wearable safety gear not being of optimum quality, the wearable safety gear not being in compliance with the bylaws of the geographical location, and the user not wearing the wearable safety gear in a predetermined manner.
- whether the user is wearing the wearable safety gear is determined by one or more processing modules of the processing unit 120 based on the real time images of the user. One or more frames of the real time images are used for determining whether the user is wearing the wearable safety gear.
- if, at step 206, it is determined that one or more conditions of the user in relation to the wearable safety gear is true, for example, the user is not wearing the wearable safety gear, the method moves to step 208.
- at step 208, when it is determined that the user is not wearing the wearable safety gear, an input is received by the feedback module 130 from the processing unit 120, based on which, at step 210, the feedback module 130 generates an output command based on which the action is taken.
- if it is determined that the user is wearing the wearable safety gear, the method moves to step 212.
- at step 212, it is determined whether one or more conditions of the user in relation to the wearable safety gear is true, for example, whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user. If, at step 212, it is determined that the user is not wearing the wearable safety gear in the predetermined manner, the method moves to step 214.
- at step 214, when it is determined that the user is not wearing the wearable safety gear in the predetermined manner, an input is received by the feedback module 130 from the processing unit 120, based on which the feedback module 130 generates an output command based on which the action is taken.
- if, at step 212, it is determined that none of the conditions of the user in relation to the wearable safety gear are true, i.e. the user is wearing the wearable safety gear in the predetermined manner, no input is received by the feedback module 130 from the processing unit 120, and the method 200 reverts to step 206.
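The step 206-214 flow can be sketched as a loop over frames. This is an assumed simplification: `detect_gear` and `detect_proper_wear` are hypothetical stand-ins for the two processing modules, and the event tuples are illustrative only.

```python
def monitor(frames, detect_gear, detect_proper_wear):
    """Walk the frames as in steps 206-214: check for the gear first,
    then the manner of wearing; record one feedback event per
    violating frame."""
    events = []
    for i, frame in enumerate(frames):
        if not detect_gear(frame):                   # step 206 condition true
            events.append((i, "not_wearing"))        # steps 208-210
        elif not detect_proper_wear(frame):          # step 212 condition true
            events.append((i, "improper_wearing"))   # step 214
        # otherwise no input reaches the feedback module and the
        # loop continues, mirroring the revert to step 206
    return events
```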
- the method 200 comprises the steps of determining, by a first processing module 122 of the processing unit 120, whether the user is wearing the wearable safety gear as per the bylaws of the geographical location, based on the real time images of the user.
- the method 200 further has the step of determining, by a second processing module of the processing unit 120, whether the user is wearing the wearable safety gear in the predetermined manner based on the real time images of the user.
- One or more frames from the real time images or live feed are analysed by the first processing module 122 to determine whether the user is wearing a safety gear. If in some frame, it is determined that the user is wearing the wearable safety gear, the said frame is analysed by the second processing module 124 to determine whether the user is wearing the wearable safety gear in the predetermined manner.
- the output command generated by the feedback module 130 comprises at least one of an indication to the user or limiting one or more vehicle operating parameters to a predetermined value of the vehicle operating parameter.
- the output command is generated based on which the user is alerted.
- the output command is generated based on which the vehicle operating parameter is limited to the predetermined value of the vehicle operating parameter.
- the vehicle operating parameter comprises vehicle speed and the predetermined vehicle speed is 30% of maximum vehicle speed.
- the output command for limiting the one or more vehicle operating parameters to the predetermined value is generated after the indication has been communicated to the user for a period of predetermined time.
- an indication is generated for the user for a predetermined time. If even after the predetermined time of the alert being generated, it is determined that the user is not wearing the wearable safety gear or not wearing the wearable safety gear in the predetermined manner, the speed of the vehicle is restricted based on the output command from the feedback module 130.
- the wearable safety gear comprises the helmet of quality, shape and size prescribed by the bylaws of the geographical location, and the predetermined manner of wearing the wearable safety gear comprises the helmet being effectively positioned.
- the helmet being effectively positioned comprises the strap of the helmet being in an engaged condition. For example, if the user is not wearing the helmet or is wearing the helmet but has not engaged or locked the strap, the input is generated for the feedback module 130 by the processing unit 120, and the output command is then generated by the feedback module for the alert to the user and restriction of vehicle speed.
- the method 200 has the step of detecting an ambient light around the vehicle by the illumination sensor unit 140, and the step of switching on a vehicle lighting system if the ambient light is below a predetermined threshold value of ambient light. For example, during riding conditions such as overcast or night-time riding when the ambient light is low, the illuminator, such as a bulb, is switched on to increase the ambient light around the user and ensure that the real time images are captured appropriately.
- the method 200 further has the steps of detecting one or more vehicle parameters by the auxiliary sensor unit 150.
- the method further has the steps of determining whether the one or more vehicle parameters are below a first predetermined threshold by the processing unit 120; and switching off the one or more image sensors 110 or the illumination sensor unit 140, if the one or more vehicle parameters are below the first predetermined threshold.
- the one or more vehicle parameters comprises the State of Charge of a battery of the vehicle, and the predetermined threshold of the State of Charge (SOC) of the battery is 15-20%. If it is determined that the SOC of the battery is lower than 15-20%, the one or more image sensors 110 or the illumination sensor unit 140 are switched off.
- the present invention relates to a method 300, 400 for training one or more computational models for monitoring a wearable safety gear.
- the method steps involved in the method 300, 400 for training the one or more computational models for monitoring the wearable safety gear have been illustrated in Figure 3A and Figure 3B.
- a data set is collected by the processing unit 120.
- the data set is in respect of bylaws or safety regulations in relation to wearable safety gear across various geographical locations, different shapes of wearable safety gears and effective positioning of the wearable safety gear.
- the data set is processed by the processing unit 120 for further analysis by filtering and transforming the data set.
- the filtering and transforming is done to improve the quality, reliability and compatibility of the data set.
- the data set is annotated by the processing unit 120.
- the data set is annotated with specific attributes to the data points to provide context and meaning to the information for easy processing by the processing unit 120.
- the annotated data set is fed into the first computational model and at step 310, the first computational model is trained to determine whether the user is wearing the wearable safety gear.
- the first computational model is evaluated, after which at step 314, the final first computational model is obtained for determining whether the user is wearing the wearable safety gear, and the first computational model is deployed in the vehicle.
- the annotated data set is fed into the second computational model and at step 410, the second computational model is trained to determine whether the user is wearing the wearable safety gear in the predetermined manner. Thereafter at step 412, the second computational model is evaluated, after which at step 414, the final second computational model is obtained for determining whether the user is wearing the wearable safety gear in the predetermined manner, and the second computational model is deployed in the vehicle.
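The collect, filter/transform, annotate, train, evaluate and deploy steps above can be sketched end to end. This is an illustrative pipeline under stated assumptions: the function names, the sample fields (`image`, `label`, `helmet_worn`) and the stand-in majority-class "model" are all hypothetical, since the patent does not specify a model architecture.

```python
# Hypothetical sketch of the training pipeline (steps 302-314 / 402-414).
# All names and the trivial stand-in model are assumptions for illustration.

def filter_and_transform(raw):
    # Improve quality and compatibility: keep only well-formed samples,
    # normalise labels to lowercase.
    return [{"image": s["image"], "label": s["label"].lower()}
            for s in raw if "image" in s and "label" in s]

def annotate(samples):
    # Attach the attribute the first computational model learns:
    # whether the wearable safety gear (helmet) is worn.
    for s in samples:
        s["helmet_worn"] = s["label"] == "helmet"
    return samples

def train_and_evaluate(samples):
    # Stand-in "model": predict the majority class of the annotated data set,
    # and report its accuracy on that data set as the evaluation step.
    worn = sum(s["helmet_worn"] for s in samples)
    majority = worn >= len(samples) - worn
    accuracy = max(worn, len(samples) - worn) / len(samples)
    return {"predict": lambda _img: majority, "accuracy": accuracy}

raw = [{"image": "f1.jpg", "label": "Helmet"},
       {"image": "f2.jpg", "label": "no_helmet"},
       {"image": "f3.jpg", "label": "Helmet"}]
model = train_and_evaluate(annotate(filter_and_transform(raw)))
```

In a real system the stand-in model would be replaced by a trained neural network, but the pipeline shape (filter, annotate, train, evaluate, then deploy the final model) is the same.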
- FIG. 4 illustrates a process flow of the present invention in accordance with an embodiment of the invention.
- the one or more image sensors 110 capture real time images of the user.
- the one or more image sensors 110 capture the real time images of the user as a video stream, which is then converted to a video input, and this video input is encoded for further processing.
- the processing unit 120 breaks down the encoded video input into a plurality of frames by a frame grabber functionality. Thereafter each of the frames is processed through the digital image processing.
- the processing unit 120 detects whether the user is wearing the helmet through the helmet detection functionality using the artificial intelligence module, i.e. the first processing module 122.
- the processing unit 120 determines whether the user is wearing the helmet in the predetermined manner through the helmet strap detection functionality using the artificial intelligence module, i.e. the second processing module 124. Based on the inputs from the processing unit 120, the feedback module 130 generates indications for the user using one or more of a voice alert functionality, a haptic alert functionality or a display alert functionality.
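The runtime flow above (frame grabbing, helmet detection by the first processing module 122, strap detection by the second processing module 124, then feedback) can be sketched as follows. The `detect_helmet` and `detect_strap` callables stand in for the trained computational models; the function name `monitor` and the alert strings are assumptions.

```python
# Minimal sketch of the per-frame detection flow. The two detector callables
# are placeholders for the first and second processing modules (122, 124).
from typing import Callable, Iterable, List

def monitor(frames: Iterable[str],
            detect_helmet: Callable[[str], bool],
            detect_strap: Callable[[str], bool]) -> List[str]:
    alerts = []
    for frame in frames:
        if not detect_helmet(frame):
            # Feedback module 130: voice / haptic / display alert.
            alerts.append(f"{frame}: wearable safety gear not worn")
        elif not detect_strap(frame):
            # Strap check only matters once the gear itself is detected.
            alerts.append(f"{frame}: gear not worn in the predetermined manner")
    return alerts

alerts = monitor(["frame1", "frame2"],
                 detect_helmet=lambda f: f != "frame2",
                 detect_strap=lambda f: True)
```

Running the two dummy detectors above flags only `frame2`, where the helmet check fails.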
- FIG. 5 illustrates the software architecture in relation to the present invention.
- the software architecture has a vision processing unit 170.
- the vision processing unit 170 is operatively coupled to a plurality of microservices 172, namely microservice 1 , microservice 2 and microservice 3.
- microservices are predefined libraries for supporting and enabling the functioning of the software architecture.
- Microservice 1 is in relation to the capturing of the real time images using hardware 178 such as the one or more image sensors.
- the microservice 1 receives the real time images from the hardware 178 through a hardware abstraction layer 176 and an operating system 174, and communicates the real time images to the vision processing unit 170.
- the first processing module 122 and the second processing module 124, coupled with the vision processing unit 170, determine whether the user is wearing the wearable safety gear and whether the user is wearing the wearable safety gear in the predetermined manner.
- microservice 2 is in relation to detection of ambient light, wherein the microservice 2 receives input from the relevant hardware 178, namely the illumination sensor unit 140 through operating system 174 and hardware abstraction layer 176 to be sent to the vision processing unit 170 for detection of ambient light. Based on the detection of the ambient light, the processing unit 120 determines whether to switch on the vehicle lighting system.
- microservice 3 is in relation to detection of the state of charge of the battery, wherein microservice 3 receives input from relevant hardware 178 through the hardware abstraction layer 176 and the operating system 174 to be sent to the vision processing unit 170 for detection of state of charge of the battery. Based on the detection of the state of charge of the battery, the processing unit 120 determines whether to switch off the system or the illumination sensor 140. Further, a debug module 180 is provided for debugging the software as per requirement.
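The three microservices above all forward hardware readings through the hardware abstraction layer and operating system to the vision processing unit, which acts on the reading type. A hedged sketch of that dispatch, with illustrative names and thresholds (none of which are specified by the patent), might look like:

```python
# Hypothetical dispatch in the vision processing unit 170. Reading kinds map
# to the three microservices; thresholds and action strings are assumptions.

class VisionProcessingUnit:
    def __init__(self, soc_threshold: float = 15.0, lux_threshold: float = 50.0):
        self.soc_threshold = soc_threshold    # battery SOC cut-off (microservice 3)
        self.lux_threshold = lux_threshold    # ambient light cut-off (microservice 2)

    def handle(self, reading):
        kind, value = reading
        if kind == "frame":                   # microservice 1: real time images
            return "run_detection"
        if kind == "ambient_light":           # microservice 2: illumination sensor unit
            return "lights_on" if value < self.lux_threshold else "lights_off"
        if kind == "soc":                     # microservice 3: battery state of charge
            return "switch_off_sensors" if value < self.soc_threshold else "keep_running"
        raise ValueError(f"unknown reading kind: {kind}")

vpu = VisionProcessingUnit()
```

Routing each reading through a single dispatch point mirrors the layered design: the microservices stay hardware-facing, while decisions (vehicle lighting, sensor switch-off, detection) stay in one place.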
- the present invention provides a system and method for monitoring a wearable safety gear which is capable of determining whether the user is wearing the safety gear, as well as determining whether the user is wearing the safety gear in the predetermined manner.
- the present invention provides a real time feedback to the user based on real time images of the user, thus providing instant alerts/speed reduction of the vehicle if the user is not wearing the wearable safety gear or is not wearing the wearable safety gear in the predetermined manner. The reduction in speed and alerts greatly reduce the chances of accidents.
- the present invention ensures that the vehicle does not come to an abrupt stop if the user removes the wearable safety gear or tampers with the wearable safety gear during vehicle riding, which reduces the chances of unforeseen accidents.
- the present invention provides for the system and method to be synchronized with the local laws, regulations and practices.
- the present invention reduces the dependency on smart devices such as smart helmets, which reduces the chances of disruption and interdependence of multiple hardware devices.
- the limitation of the smart helmets being configured for a specific vehicle is also obviated by the present invention. Situations such as the user forgetting to carry the specific smart device, and the resulting incapacitation of the vehicle, are also eliminated by the present invention.
- the reduced interdependence also reduces the time lag in the detection of whether the user is wearing the wearable safety gear and whether the user is wearing the safety gear in the predetermined manner.
- the present invention also allows for faster processing of the real time images of the user for detection of the one or more conditions of the user in relation to the wearable safety gear, which not only reduces the required processing capabilities and processing time, but also enhances safety.
- a computer-readable storage medium refers to any type of physical memory on which information or data readable by a processor may be stored.
- a computer-readable storage medium may store instructions for execution by one or more processors, including instructions for causing the processor(s) to perform steps or stages consistent with the embodiments described herein.
- the term “computer-readable medium” should be understood to include tangible items and exclude carrier waves and transient signals, i.e., be non-transitory. Examples include random access memory (RAM), read-only memory (ROM), volatile memory, non-volatile memory, hard drives, CD ROMs, DVDs, flash drives, disks, and any other known physical storage media.
- 180 Debug module
- 200 Method for Monitoring a Wearable Safety Gear in a vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Evolutionary Computation (AREA)
- Multimedia (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Medical Informatics (AREA)
- Software Systems (AREA)
- Databases & Information Systems (AREA)
- Computing Systems (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Emergency Alarm Devices (AREA)
- Traffic Control Systems (AREA)
Abstract
The present invention relates to a system (100) and a method (200) for monitoring a wearable safety gear in a vehicle. The system (100) comprises one or more image sensors (110) configured to capture real time images of a user riding the vehicle; and a processing unit (120) configured to receive the real time images of the user. The processing unit has one or more processing modules configured to determine one or more conditions of a user in relation to the wearable safety gear. The system (100) has a feedback module (130) configured to receive an input from the processing unit (120) if any of the conditions of the user in relation to the wearable safety gear is true, the feedback module (130) being configured to generate an output instruction.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CONC2025/0016255A CO2025016255A2 (es) | 2023-07-06 | 2025-11-25 | Un sistema para monitorizar un equipo de seguridad portátil en un vehículo y método del mismo |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN202341045515 | 2023-07-06 | ||
| IN202341045515 | 2023-07-06 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025008832A1 true WO2025008832A1 (fr) | 2025-01-09 |
Family
ID=94171923
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/IN2024/050408 Pending WO2025008832A1 (fr) | 2023-07-06 | 2024-04-19 | Système de surveillance d'un équipement de sécurité vestimentaire dans un véhicule et procédé associé |
Country Status (2)
| Country | Link |
|---|---|
| CO (1) | CO2025016255A2 (fr) |
| WO (1) | WO2025008832A1 (fr) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180137375A1 (en) * | 2015-07-17 | 2018-05-17 | Hitachi Automotive Systems, Ltd. | Onboard environment recognition device |
| US10959479B1 (en) * | 2019-12-27 | 2021-03-30 | Robert Bosch Gmbh | Apparatus and warning system for intelligent helmet |
| WO2022034610A1 (fr) * | 2020-08-09 | 2022-02-17 | Tvs Motor Company Limited | Safety system and method for real time detection and warning of safety gear |
- 2024-04-19: WO PCT/IN2024/050408 patent/WO2025008832A1/fr active Pending
- 2025-11-25: CO CONC2025/0016255A patent/CO2025016255A2/es unknown
Also Published As
| Publication number | Publication date |
|---|---|
| CO2025016255A2 (es) | 2025-12-09 |
Similar Documents

| Publication | Title |
|---|---|
| EP3461672B1 (fr) | Display apparatus, display control apparatus and vehicle |
| KR102871906B1 (ko) | Personal mobility and control method thereof |
| US9787892B2 | Safety device for motorcycle riders and method for operating a safety device for motorcycle riders |
| US20170158118A1 | Systems and Methods for Motorbike Collision Avoidance |
| US9580009B1 | Systems and methods for motorbike collision avoidance |
| US9956933B2 | Safety system for a motor bike and method for triggering a safety system |
| CN112590987B (zh) | Motorcycle and monitoring and early-warning method thereof |
| US20210053559A1 | Vehicle hazard avoiding apparatus and method, and storage medium |
| IT201800010591A1 (it) | Integrated active safety system for individuals at risk of road accidents |
| CN113920310A (zh) | Hands-off steering wheel detection method and device |
| CN106504554A (zh) | Method and device for recognizing traffic light status information |
| CN212624073U (zh) | Safe driving early-warning system |
| CN113479165A (zh) | Driving assistance method and device, smart helmet and processor |
| WO2025008832A1 (fr) | System for monitoring a wearable safety gear in a vehicle and method thereof |
| JP7380380B2 (ja) | Driving support device |
| CN112258813A (zh) | Vehicle active safety control method and device |
| JP6414549B2 (ja) | Method, system, device and program for determining the number of riders on a two-wheeler |
| CN104376685A (zh) | Vehicle control method for preventing fatigued driving |
| US20200198715A1 | Lighting and a communication system for providing communication between a vehicle and a helmet |
| JP7259793B2 (ja) | Driving support device |
| CN205632285U (zh) | Blind-spot detection warning system with real-time recording |
| AU2021105086A4 | System and method for automatic helmet detection to enhance the rider safety using deep learning |
| WO2022034610A1 (fr) | Safety system and method for real time detection and warning of safety gear |
| CN119099628A (zh) | Emergency state response method and device based on driver health state |
| Somnath et al. | Intelligent human life saver using smart helmet with IoT application |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24835555; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 2501007957; Country of ref document: TH |
| | REG | Reference to national code | Ref country code: BR; Ref legal event code: B01A; Ref document number: 112025026033; Country of ref document: BR |