
WO2025038169A1 - Rollover prediction and alert for all-terrain vehicle - Google Patents


Info

Publication number
WO2025038169A1
Authority
WO
WIPO (PCT)
Prior art keywords
data, IMU, terrain vehicle, neural network, network model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/US2024/034045
Other languages
French (fr)
Inventor
Farzaneh KHORSANDI KOUHANESTANI
Guilherme DE MOURA ARAUJO
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of California Berkeley
University of California San Diego UCSD
Original Assignee
University of California Berkeley
University of California San Diego UCSD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of California Berkeley and University of California San Diego (UCSD)
Publication of WO2025038169A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W40/11 Pitch movement
    • B60W40/112 Roll movement
    • B60W40/114 Yaw movement
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/55 Remote control arrangements
    • B60K2360/56 Remote control arrangements using mobile devices
    • B60K2360/566 Mobile devices displaying vehicle information
    • B60W2300/00 Indexing codes relating to the type of vehicle
    • B60W2300/36 Cycles; Motorcycles; Scooters
    • B60W2300/362 Buggies; Quads
    • B60W2520/00 Input parameters relating to overall vehicle dynamics
    • B60W2520/10 Longitudinal speed
    • B60W2520/105 Longitudinal acceleration
    • B60W2520/14 Yaw
    • B60W2520/16 Pitch
    • B60W2520/18 Roll
    • B60W2530/00 Input parameters relating to vehicle conditions or values, not covered by groups B60W2510/00 or B60W2520/00
    • B60W2530/201 Dimensions of vehicle

Definitions

  • Crash detection and prevention systems exist, primarily for vehicles such as cars and trucks. However, due to the unique shape and usage of ATVs, existing systems cannot accurately predict and prevent accidents such as rollover of ATVs. For instance, crash detection and notification systems are limited because they cannot warn the driver/rider ahead of time; thus, drivers cannot prevent the crash.
  • Collision prediction systems also exist, but are mainly used for automobiles, while rollover prediction systems are more popular among farm machinery such as tractors. Regardless of the crash type, conventional crash prediction systems often consist of expensive sensors such as radar, light detection and ranging (LIDAR), camera, and Global Positioning System (GPS).
  • An alternative budget solution consists of using collision prediction models. Collision prediction models take the state of multiple vehicles as input, i.e., there must be a continuous information exchange, which makes this approach vulnerable to communication errors.
  • Rollover prediction models exist that are primarily based on the vehicle’s center of gravity (CG) and terrain slope angles (roll and pitch).
  • an all-terrain vehicle rollover prediction apparatus comprises an inertial measurement unit (IMU) configured to be fastened to an all-terrain vehicle; at least one local processor physically wired to the IMU; a memory connected with the at least one local processor, the memory having instructions executable by the at least one processor for: looking up, from a data table, dimensions of the all-terrain vehicle; receiving IMU data from the IMU; estimating a roll angle and a pitch angle from the IMU data; determining a yaw rate from the IMU data; calculating a turning radius from the yaw rate; inputting, to a neural network model executing on the at least one local processor, input parameters based on a speed of the all-terrain vehicle, the dimensions, the roll angle, the pitch angle, and the turning radius; determining a value output by the neural network model; and generating an alert based on the value.
  • the at least one local processor and the neural network model are optimized to determine the value at a frequency faster than 10 Hz.
  • the neural network model has exactly two hidden layers.
  • each hidden layer of the neural network model has 32 neurons.
  • the memory has further instructions executable by the at least one processor for: calculating a difference between a respective average and each of the speed, the dimensions, the roll angle, the pitch angle, and the turning radius and dividing each difference by a respective scale factor before inputting to the neural network model.
  • the average is a mean or a median
  • the scale factor is a standard deviation or a variance, of the respective speed, dimensions, roll angle, pitch angle, and turning radius as determined from training data with which the neural network model was trained.
  • the neural network model was trained using a categorical cross entropy loss function.
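A forward pass through the architecture described above (two hidden layers of 32 ReLU neurons and a softmax output compatible with a categorical cross-entropy loss) can be sketched as follows. This is an illustrative reconstruction, not the trained model: the weights are random placeholders and the nine input values are arbitrary standardized numbers.

```python
import math
import random

random.seed(0)

def relu(z):
    return max(0.0, z)

def softmax(zs):
    # Numerically stable softmax: probabilities sum to 1.
    m = max(zs)
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def dense(x, weights, biases, activation):
    """One fully connected layer: activation(W·x + b)."""
    return [activation(sum(w * xi for w, xi in zip(row, x)) + b)
            for row, b in zip(weights, biases)]

def make_layer(n_out, n_in):
    # Placeholder random weights; a deployed model would load trained values.
    return ([[random.uniform(-0.5, 0.5) for _ in range(n_in)]
             for _ in range(n_out)],
            [0.0] * n_out)

# Nine standardized inputs: weight, width, length, wheelbase,
# seat height, speed, pitch angle, roll angle, turning radius.
x = [0.1, -0.3, 0.2, 0.0, 0.5, 1.2, -0.8, 0.4, -0.1]

w1, b1 = make_layer(32, 9)    # hidden layer 1: 32 neurons
w2, b2 = make_layer(32, 32)   # hidden layer 2: 32 neurons
w3, b3 = make_layer(2, 32)    # output: [no rollover, rollover]

h1 = dense(x, w1, b1, relu)
h2 = dense(h1, w2, b2, relu)
logits = [sum(w * h for w, h in zip(row, h2)) + b
          for row, b in zip(w3, b3)]
probs = softmax(logits)  # probs[1] plays the role of the rollover probability
```

The softmax output pairs naturally with the categorical cross-entropy loss mentioned above; only the two-class probability vector is produced here, since training is out of scope for the sketch.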
  • the memory has further instructions executable by the at least one processor for receiving a user selection of a make or model of the all-terrain vehicle, wherein the looking up is based on the user selection.
  • the memory has further instructions executable by the at least one processor for receiving user entries of the dimensions, wherein the looking up is based on the user entries.
  • the looked up dimensions of the all-terrain vehicle include one or more of a weight, a width, a length, a wheelbase, and a seat height.
  • the memory has further instructions executable by the at least one processor for inputting, to the neural network model, a presence of a rider or a non-presence of a rider for autonomous driving.
  • the value is a scalar representing a probability of roll over.
  • the apparatus further includes an aural or visual indicator onboard the all-terrain vehicle configured to warn a rider based on the alert.
  • the apparatus further includes a transmitter configured to send a wireless message based on the alert.
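The alert behavior implied by the bullets above (a scalar rollover probability driving an aural/visual indicator and, optionally, a wireless message) can be sketched as a simple threshold check. The 0.5 threshold and the function name are illustrative assumptions, not values from the disclosure.

```python
def rollover_alert(probability, threshold=0.5):
    """Map the model's rollover probability to an alert decision.

    The 0.5 threshold is illustrative; a deployed system would tune it
    to balance missed rollovers against nuisance warnings.
    """
    if probability >= threshold:
        # In the apparatus, this branch would drive the onboard aural or
        # visual indicator and, if configured, queue a wireless message.
        return "ALERT"
    return "OK"
```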
  • the apparatus further includes a global positioning system (GPS) receiver, wherein the memory has further instructions executable by the at least one processor for: obtaining data from the GPS receiver and data from the IMU and combining the data from the GPS receiver and the data from the IMU to estimate the speed of the all-terrain vehicle.
  • the at least one local processor is rigidly connected with the IMU.
  • the apparatus further includes a housing enclosing the IMU, the at least one local processor, and the memory.
  • FIG. 1 is a simplified block diagram of an example of a system for predicting ATV rollover according to some embodiments.
  • FIG. 2 is a simplified block diagram of an example of components of the system for predicting ATV rollover of FIG. 1, according to some embodiments.
  • FIG. 3A is a simplified flowchart of a method for predicting ATV rollover, according to some embodiments.
  • FIG. 3B is a simplified flowchart of a method for detecting ATV rollover, according to some embodiments.
  • FIG. 4 is an example database schema for managing ATV data, according to some embodiments.
  • FIG. 5A is an example user interface for ATV monitoring, according to some embodiments.
  • FIG. 5B is another example user interface for ATV monitoring, according to some embodiments.
  • FIG. 6 is an example dashboard user interface for ATV monitoring, according to some embodiments.
  • FIG. 7 is another example user interface for ATV monitoring, according to some embodiments.
  • FIG. 8 is another example user interface for ATV monitoring, according to some embodiments.
  • FIG. 9A is another example user interface for ATV monitoring, according to some embodiments.
  • FIG. 9B is another example user interface for ATV monitoring, according to some embodiments.
  • FIG. 10 is another example user interface for ATV monitoring, according to some embodiments.
  • FIG. 11 is a block diagram illustrating an example computer system, according to at least one embodiment.
  • DETAILED DESCRIPTION
  • As noted above, existing crash prediction and reaction systems have shortcomings, including requiring complex hardware and being unsuitable for ATV applications. The systems and methods of the present disclosure overcome these issues and provide accurate, low-cost rollover prediction tailored to ATVs.
  • an AI-based incident prediction and notification system includes a deep neural network trained to calculate the likelihood of a rollover event based on the ride’s parameters (e.g., vehicle’s speed, roll, pitch, and turning radius) and ATV parameters (e.g., weight, height, width, wheelbase, and seat height).
  • the systems and methods of the present disclosure provide simplicity in terms of inputs and ease of implementation for real-time applications (e.g., a computing time smaller than 1 millisecond (ms)).
  • the system takes as input ATV dimensions that are easy to determine or readily available on the internet and ride parameters that are measured by its embedded system in real-time.
  • FIG. 1 is a simplified block diagram of an example of a system 100 for predicting ATV rollover according to some embodiments.
  • the system 100 may include an all-terrain vehicle rollover prediction apparatus including an all-terrain vehicle (ATV) 102, a server 110 coupled to a database 112, and a mobile application 104 executing on a user device such as a smartphone.
  • the ATV 102, the server 110, and the mobile application 104 can further be coupled to a satellite network 106 and a rescue service 108.
  • Various communication protocols 114 can be implemented for communication between the components of the system 100.
  • the ATV 102 may be coupled to, or include, components for collecting and processing data.
  • the collected data can include ride parameters such as velocity, 3-axis acceleration and rotation, and device location.
  • the system 100 may include an inertial measurement unit (IMU) configured to be fastened to the ATV 102, at least one local processor physically wired to the IMU, and a memory connected with the at least one local processor, the memory having instructions executable by the at least one processor for performing the methods described herein. Additional components of the system 100 are described in further detail below with respect to FIG. 2.
  • the server computer 110 includes functionality to receive, transmit, and analyze data via the ATV 102, the mobile application 104, the satellite network 106, and/or the rescue service 108.
  • the server computer 110 may include a processor coupled to a memory, a network interface, and a computer-readable medium, as further described below with respect to FIG. 11.
  • the server computer 110 includes or is communicatively coupled to a database 112.
  • the database 112 is a remote (e.g., cloud) database.
  • the server computer 110 may receive data from the ATV 102, satellite network 106, mobile application 104, and/or rescue service 108, and record some or all of that data, or derivatives thereof, to the database 112.
  • the server computer 110 may further transmit some or all of the data to the ATV 102, satellite network 106, mobile application 104, and/or rescue service 108.
  • the mobile application 104 can include functionality for displaying information collected by the system 100 or derivatives thereof. For example, the mobile application visualizes collected parameters. Alternatively, or additionally, the mobile application stores ride history and parameters. In some implementations, some, or all, of this data is stored to the database 112.
  • In some embodiments, the mobile application 104 includes functionality to allow users (e.g., ATV owners) to add, manage, monitor, and track riding parameters and relevant statistics for their ATVs.
  • the application can read the riding parameters from the database and display the information in analytical form.
  • Bluetooth Low Energy (BLE) is used to transfer information directly from the processing device shown in FIG. 2 to the application so a user can access riding parameters in real time.
  • the satellite network 106 can be used to transmit emergency messages off-board. Common methods for remote data transfer such as WiFi and cellular networks may be unavailable in locations where ATVs are used (e.g., remote or rural locations). Additionally, or alternatively, the satellite network 106 is used for GPS location determining functions.
  • the satellite network 106 is the Iridium Satellite Network and the system includes a Rock7 RockBLOCK Iridium Satellite Modem to communicate with the satellite network 106.
  • the mobile application 104 and/or other components of the system 100 sends emergency alerts to one or more rescue services 108.
  • satellites 106 are used, so that such alerts can be transmitted without the need for cellular or WiFi connections, which is useful in remote areas where ATVs are commonly used.
  • the devices are in communication with one or more third-party rescue services 108, such as Noonlight ® , which is used in conjunction with the server 110 to contact first responders.
  • FIG. 2 is a simplified block diagram of an example of components of the system 200 for predicting ATV rollover of FIG. 1, according to some embodiments. The components shown in FIG. 2 may be installed in the ATV 102.
  • the system components 200 include a processing device 202, an inertial measurement unit (IMU) 204, a modem 206, a relay 208, an antenna 210, a GPS receiver 211, an ATV engine 212, one or more batteries, such as a 12 volt (V) battery 218 and a portable battery 214, and a converter 216.
  • the processing device 202 includes at least one processor.
  • the processing device 202 is a Raspberry Pi® (e.g., a Raspberry Pi ® 4) or other microcontroller.
  • the processing device 202 is a data computation device or devices, operable to carry out instructions.
  • the processing device 202 may include a CPU that includes at least one high-speed data processor adequate to execute program components for executing user and/or system-generated requests.
  • the CPU may be a microprocessor such as Raspberry Pi ® 4; AMD ® ’s Athlon, Duron, and/or Opteron; IBM ® and/or Motorola ® ’s Power PC ® ; IBM ® ’s and Sony ® ’s Cell processor; Intel ® ’s Celeron ® , Itanium ® , Pentium ® , Xeon ® , and/or XScale ® ; and/or the like processor(s).
  • the processing device 202 may be local to, e.g., installed on, the ATV 102 shown in FIG. 1.
  • the processing device 202 may include, or be coupled to, a memory.
  • the memory may be any magnetic, electronic, or optical memory.
  • the memory may include any number of memory modules.
  • An example of memory is dynamic random access memory (DRAM).
  • the memory can include a computer-readable medium, which may include software modules comprising programming for performing the methods described herein.
  • the inertial measurement unit (IMU) 204 includes electronics configured for measuring and reporting the acceleration, angular rate, and/or other parameters of the ATV.
  • the IMU 204 may include one or more accelerometers, gyroscopes, and/or magnetometers.
  • the IMU 204 is fastened to the all-terrain vehicle 102 depicted in FIG. 1 and is coupled (e.g., physically wired and/or rigidly connected) to the processing device 202.
  • the IMU 204 logs accelerometer and gyroscope data locally. This data may be collected at a sample rate (e.g., a user-determined sample rate). The data collected by the IMU 204 is provided to the processing device 202 for analysis (e.g., to determine if a rollover has occurred or is likely to occur, as described herein).
  • the inertial measurement unit (IMU) 204 is an IMU such as the Adafruit ® LSM9DS1.
  • the IMU includes an accelerometer, which measures acceleration on three axes. The accelerometer is capable of capturing data used in determining the vehicle’s pitch, roll, and yaw.
  • the IMU includes a gyroscope, which measures rotation velocity.
  • the IMU includes a magnetometer, which measures magnetic fields.
  • the antenna 210 includes electronics configured for receiving and amplifying radio signals from satellites and converting the radio signals to electronic signals.
  • the antenna 210 may, for example, be a GPS or GNSS antenna, as depicted in FIG. 2.
  • the global positioning system (GPS) receiver 211 includes electronics configured for receiving electronic signals from the antenna 210 and analyzing the electronic signals to determine information such as position, velocity, and precise time based on data received from GPS satellites.
  • the GPS receiver 211 and/or processing device 202 may be configured to log location, altitude, and velocity data locally at a predetermined (e.g., user-configured) sample rate.
  • the GPS receiver 211 and/or processing device 202 may further be configured to determine whether the vehicle is within a geofence (if one is set).
  • the GPS receiver 211 and/or processing device 202 may further be configured to filter results based on a configured signal strength parameter.
  • the GPS receiver 211 is an Adafruit ® Ultimate GPS Breakout v3 or similar.
  • the GPS receiver 211 can be used to detect rollover crashes, log position, velocity, and altitude.
  • the outputs of the GPS receiver 211 include latitude, longitude, altitude, velocity, and number of connected satellites.
  • the system 200 includes a transmitter configured to send wireless messages.
  • the system includes a modem 206, such as a Rock7 modem or other Iridium satellite modems.
  • the transmitter may, for example, send messages requesting help based on detecting rollover is likely or has occurred.
  • the system 200 includes one or more batteries, such as the 12 V battery 218 (e.g., the ATV battery) and/or the portable battery 214.
  • the portable battery 214 is an Anker PowerCore II 20000 connected to the processing device 202 via a USB-C connection.
  • the system 200 includes an ATV engine 212 coupled to a relay 208.
  • the relay 208 is an electrically operated switch.
  • the relay 208 may include a set of input terminals for one or more control signals, and a set of operating contact terminals.
  • the relay is connected to the ATV engine 212 and the 12 V battery 218 and/or portable battery 214.
  • the relay 208 can disengage the ATV engine 212 and batteries 218 and/or 214 to perform automatic or manual device shutoff as described herein.
  • the system 200 includes a converter 216.
  • FIG. 3A is a simplified flowchart of a method 300 for predicting ATV rollover, according to some embodiments.
  • the method presented in FIG. 3A and described below is intended to be illustrative and non-limiting. It is appreciated that the processing steps may be performed in an order different from that depicted in FIG. 3A and that not all the steps depicted in FIG. 3A need be performed.
  • the method 300 may be implemented by a computer system, such as the system 100 shown in FIG. 1 and FIG. 2.
  • an inertial measurement unit fastened to an all-terrain vehicle and at least one processor physically wired to the IMU.
  • the system can further include a global positioning system (GPS) receiver.
  • the at least one local processor is rigidly connected with the IMU.
  • the system further includes a transmitter configured to send a wireless message (e.g., based on an alert as described herein).
  • the system further includes a housing enclosing the IMU, the at least one local processor, and the memory. As described above with respect to FIGS. 1 and 2, the system can include such components and others.
  • a registration process is performed using the mobile application.
  • the registration may, for example, create a unique identifier (ID) based on serial and/or model numbers of the ATV.
  • the system begins collecting data when it is powered on. In some aspects, the data is sent to the database. Alternatively, or additionally, the data is loaded into the mobile application. In some implementations, the system begins recording data when the system (e.g., the processing device shown in FIG. 2 and/or the ATV itself) is powered on, and recording ends when the system is turned off. The recording may include storing data locally on the processing device and/or transmitting the data for storage remotely (e.g., on a cloud database by transmitting the data to the server via satellite or WiFi, when available). In some aspects, a parallel thread performs database interactions.
  • the system attempts data transmission at a suitable interval, such as once every 10 seconds.
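The recording-and-transmission behavior described above (a parallel thread that attempts transmission at a suitable interval, such as every 10 seconds) might be sketched as follows. The buffer, the `send` callable, and the retry-on-failure policy are illustrative assumptions, not details from the disclosure.

```python
import queue
import threading
import time

def transmit_loop(buffer, send, interval_s=10.0, stop=None):
    """Thread body: periodically drain buffered samples and attempt
    transmission (e.g., to the cloud database via satellite or WiFi).

    On a connectivity error (common off-road), samples are re-queued
    and retried on the next interval.
    """
    while stop is None or not stop.is_set():
        time.sleep(interval_s)
        batch = []
        while not buffer.empty():
            batch.append(buffer.get())
        if batch:
            try:
                send(batch)
            except OSError:
                for sample in batch:
                    buffer.put(sample)

def start_transmit_thread(buffer, send, interval_s=10.0):
    """Launch the loop as a daemon thread; returns (thread, stop_event)."""
    stop = threading.Event()
    t = threading.Thread(target=transmit_loop,
                         args=(buffer, send, interval_s, stop), daemon=True)
    t.start()
    return t, stop
```

Running the transmission on its own thread matches the "parallel thread performs database interactions" note above, keeping the prediction loop free of network latency.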
  • the system looks up, from a data table, dimensions of the all-terrain vehicle.
  • the data table can include various information about the ATV.
  • An example data structure is illustrated in FIG. 4.
  • the dimensions can include one or more of a weight, a width, a length, a wheelbase, a seat height, and so forth.
  • the system receives a user selection of vehicle parameters, environmental parameters, or the like, at some initial time. The looking up is based on the user selection. For example, the system receives a make or model of the all-terrain vehicle (e.g., via the mobile application), the dimensions, and so forth.
  • the mobile application receives, via user input, data such as the make and model of the ATV.
  • the database includes parameters of various different ATVs such as weight, chassis, wheelbase, etc.
  • the system may query the database, based on the type of ATV, to retrieve such ATV parameters. If the ATV is not specifically in the dataset, the system can prompt the user, via the mobile app, to input that data (e.g., “please measure the wheelbase”). As another example, a user can interact with the mobile application to indicate whether a rider is present or the ATV is autonomously driven.
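The lookup-with-fallback flow described above can be sketched as follows. The table contents, key scheme, and field names are placeholders standing in for the database schema of FIG. 4; a missing entry signals the mobile app to prompt the user for measurements.

```python
# Hypothetical dimension table keyed by (make, model); a deployed system
# would query the database of FIG. 4. All values below are placeholders.
ATV_TABLE = {
    ("ExampleMake", "ExampleModel"): {
        "weight_kg": 300.0,
        "width_mm": 1200.0,
        "length_mm": 2100.0,
        "wheelbase_mm": 1300.0,
        "seat_height_mm": 870.0,
    },
}

def lookup_dimensions(make, model):
    """Return stored dimensions for the selected ATV, or None so the
    app can prompt the user to enter them (e.g., "please measure the
    wheelbase")."""
    return ATV_TABLE.get((make, model))
```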
  • the system receives IMU data from the IMU.
  • the IMU data can include, for example, acceleration, angular rate, and/or the like.
  • the IMU data can be collected by the processing device, as shown in FIG. 2.
  • the system estimates a roll angle and a pitch angle from the IMU data.
  • the system may use the orientation information identified by the IMU to compute the roll and pitch angles.
  • the roll angle and pitch angle can be calculated using accelerometer data. Suitable computations are described in M. Wrona, “Roll and Pitch Angles From Accelerometer Sensors,” https://mwrona.com/posts/accel-roll-pitch/ (2020), and in Y. Tawil.
  • the system measures the ATV's attitude (including roll and pitch) with the IMU.
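A minimal sketch of the accelerometer-only roll and pitch estimate, using the standard formulas of the kind cited above; it assumes gravity dominates the measurement (vehicle at rest or in steady motion) and an x-forward, y-right, z-up axis convention, which is an assumption rather than a detail from the disclosure.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch in degrees from accelerometer readings (in g).

    Valid when gravity dominates the measurement; axis convention here
    is x forward, y to the right, z up.
    """
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    return roll, pitch
```

A level vehicle (only +1 g on the z axis) yields zero roll and pitch, and a pure sideways tilt shows up entirely in the roll angle.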
  • a Madgwick filter is implemented. This filter fuses gyroscope, accelerometer, and magnetometer measurements to calculate the vehicle's attitude.
  • the vehicle's attitude is initially estimated by the integration of gyroscope measures, which in the long term inherently yields drift.
  • the long-term drift from the gyroscope integration is compensated by the accelerometer estimates of attitude.
  • the magnetometer measures are used to compensate for magnetic distortions from potential sources of interference around the sensor, such as electrical appliances (for instance, a GPS sensor), and metal structures (e.g., the ATV's frame).
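The gyro-integration-with-accelerometer-correction idea described above can be illustrated with a single-axis complementary filter. This is a deliberately simplified stand-in for the Madgwick filter named in the text (it ignores the magnetometer and quaternion bookkeeping), and the blend factor is an illustrative assumption.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One fusion step for a single axis (all angles in degrees).

    The gyroscope integral (angle + gyro_rate * dt) supplies the
    short-term estimate; a small weight on the accelerometer-derived
    angle bleeds off the long-term gyroscope drift. alpha=0.98 is an
    illustrative blend factor, not a value from the disclosure.
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle
```

Iterated once per IMU sample, the estimate tracks fast gyroscope motion while converging to the drift-free accelerometer angle over time.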
  • the system determines a yaw rate from the IMU data.
  • the system may use the ATV’s magnetometer readings, mass, speed, and/or orientation to compute the yaw rate.
  • the yaw rate can be calculated based on gyroscope data as described in Tawil, supra.
  • the system calculates a turning radius from the yaw rate.
  • the system may use the ATV’s velocity, identified by the IMU, and the yaw rate, computed at step 310, to compute the turning radius.
  • the turning radius is given by: turning radius = velocity / yaw rate, where turning radius is the ATV’s turning radius in m, velocity is the ATV’s speed in m/s, and yaw rate is the ATV’s yaw rate in radians/s.
  • the system converts the vehicle’s speed from km/h to m/s. With the yaw rate in radians/s, this yields a turning radius in meters. For cases with no yaw rate (i.e., the vehicle driving in essentially a straight line), the turning radius can be set to a suitable value, such as the maximum value used to train the neural network (e.g., 50 m).
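Putting the unit conversion and the straight-line special case together, the turning-radius step might be sketched as follows; the near-zero yaw-rate cutoff value is an illustrative assumption.

```python
def turning_radius(speed_kmh, yaw_rate_rad_s, straight_line_radius=50.0):
    """Turning radius in meters from speed (km/h) and yaw rate (rad/s).

    When the yaw rate is (near) zero the vehicle is effectively driving
    straight, so the radius is set to the maximum value used to train
    the network (50 m in this example).
    """
    speed_ms = speed_kmh / 3.6  # km/h -> m/s
    if abs(yaw_rate_rad_s) < 1e-6:
        return straight_line_radius
    return speed_ms / abs(yaw_rate_rad_s)
```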
  • the system may identify and/or calculate various additional parameters. For example, in some embodiments, the system estimates a speed based on data obtained from the GPS receiver and/or data obtained from the IMU. In some implementations, the system uses data from both the GPS receiver and the IMU to estimate the speed of the all-terrain vehicle. For example, the system averages speed data from the two sources.
  • the system fuses GPS and IMU data through an Unscented Kalman Filter (UKF).
  • the UKF is a non-linear version of a Kalman Filter, which is an algorithm that combines multiple sensor information to estimate the state of a system, such as the vehicle's position.
  • the system measures and/or computes acceleration in three dimensions. Due to the environment of some ATV off-road crashes (e.g., dense woods), GPS signals might be unavailable or yield inaccurate measures. To address this issue, IMU data (which allows continuous position tracking) is fused with GPS data through an Unscented Kalman Filter.
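The patent fuses GPS and IMU data with an Unscented Kalman Filter, which handles non-linear models; a full UKF is beyond a short sketch, so the one-dimensional linear Kalman predict/update pair below only illustrates the underlying idea — blending an IMU-propagated prediction with a GPS measurement according to their uncertainties. All names and values are illustrative:

```python
def kalman_predict(x, p, u, q):
    """Propagate the state with a motion increment u (e.g., IMU dead reckoning);
    process noise q grows the uncertainty p."""
    return x + u, p + q

def kalman_update(x, p, z, r):
    """Blend the prediction (variance p) with a GPS measurement z (variance r)."""
    k = p / (p + r)                  # Kalman gain: trust measurement more when p >> r
    return x + k * (z - x), (1.0 - k) * p
```

When GPS drops out (e.g., in dense woods), only the predict step runs, so position tracking continues from IMU data alone, at the cost of growing uncertainty.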
  • the system further processes the input parameters. For example, the system performs a Z-transformation on the input parameters to avoid overfitting.
  • the system identifies average values for the speed, the dimensions, the roll angle, the pitch angle, and/or the turning radius. The average may, for example, be a mean or a median of previously computed values from the ATV and/or other ATVs historically.
  • the system calculates a difference between a respective average and each of the speed, the dimensions, the roll angle, the pitch angle, and the turning radius.
  • the system divides each difference by a respective scale factor before inputting the parameters to the neural network.
  • the scale factor is a standard deviation or a variance, of the respective speed, dimensions, roll angle, pitch angle, and turning radius as determined from training data with which the neural network model was trained (as further described below with respect to step 314). This helps improve prediction accuracy.
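The Z-transformation described in the bullets above — subtracting the average and dividing by the scale factor derived from the training data — can be sketched as:

```python
def z_transform(values, train_means, train_stds):
    """Standardize each input as (x - mean) / std, using statistics
    computed from the neural network model's training data."""
    return [(v - m) / s for v, m, s in zip(values, train_means, train_stds)]
```

For instance, a speed of 10 with training mean 8 and standard deviation 2 becomes a standardized value of 1.0.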
  • the system inputs, to a neural network model executing on the at least one local processor, input parameters based on a speed of the all-terrain vehicle, the dimensions, the roll angle, the pitch angle, and the turning radius.
  • Table 1 – Input variables:
    - ATV weight (kg): ATV net weight (no rider and fuel tank empty)
    - Width (mm): measured from left to right across the widest part of the ATV, including tires but not including any side mirrors
    - Length (mm): measured from front to rear across the longest part of the ATV
    - Wheelbase (mm): measured between the centers of the front and rear wheel hubs on the same side
    - Seat height (mm): measured from the ground up to the ATV seat center
    - Speed (km/h): ATV’s last recorded speed before the incident
    - ATV pitch angle (degrees): ATV’s last recorded pitch before the incident
    - ATV roll angle (degrees): ATV’s last recorded roll before the incident
    - Turning radius (m): calculated as the tangential speed (in m/s) divided by the yaw rate (in radians/s)
    - (binary score -
  • the neural network model has exactly two hidden layers, which has been found to provide accurate predictions.
  • each hidden layer of the neural network has 32 neurons. It should be understood that other numbers of layers and/or numbers of neurons can be implemented.
  • a classification model based on machine learning algorithms is implemented in the embedded system. This algorithm calculates the likelihood of a rollover event based on the parameters such as those shown above in Table 1.
  • the neural network model may comprise one or more of a K-nearest neighbors (KNN) model, a random forest, a Support Vector Machine (SVM), and/or a Deep Neural Network (DNN). Example tuning parameters for each type of model are shown in Table 2 below.
  • the neural network model is initially trained for rollover prediction with labeled training data indicating various parameters (e.g., speed, roll angle, turning radius, etc.), labeled to indicate whether or not a rollover occurred.
  • the deep neural network is trained for rollover prediction with a dataset containing 51,700 samples.
  • the neural network is trained using a categorical cross entropy loss function.
  • a Rectified Linear Unit (ReLU) activation function and/or Adaptive moment estimation (Adam) optimizer are used to train the neural network.
  • the training dataset includes observations from mathematical simulations, finite element analysis (FEA), and realistic simulations, including static and dynamic tests. For the samples consisting of mathematical simulations and FEA, the occurrence of a rollover can be calculated instantly based on the data. For the samples consisting of realistic simulations (static and dynamic tests), the occurrence of a rollover can be determined by a human observer.
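The architecture described above — two hidden layers of 32 ReLU neurons feeding a classification output, trained with categorical cross-entropy and Adam — can be sketched as a forward pass. This is an illustrative numpy sketch, not the trained model: the nine inputs correspond to Table 1, the two softmax outputs are assumed to represent the no-rollover/rollover classes, and the parameters would come from training:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def softmax(x):
    e = np.exp(x - x.max())           # subtract max for numerical stability
    return e / e.sum()

def rollover_forward(features, params):
    """Forward pass: 9 inputs -> 32 ReLU -> 32 ReLU -> 2-class softmax."""
    w1, b1, w2, b2, w3, b3 = params
    h1 = relu(w1 @ features + b1)
    h2 = relu(w2 @ h1 + b2)
    return softmax(w3 @ h2 + b3)      # e.g., [P(no rollover), P(rollover)]
```

Such a small network is cheap enough to evaluate on an embedded processor at well above the 10 Hz rate mentioned below.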
  • the system reads, from the neural network model, a value predicting whether the all-terrain vehicle will roll over.
  • the neural network model may output a value indicating how likely roll-over is to occur.
  • the value is a scalar representing a probability of roll over (e.g., a value between 0 and 1, where 0 is most unlikely and 1 is most likely / has occurred).
  • the at least one local processor and the neural network model are optimized to determine the value at a frequency faster than 10 Hz.
  • the system may compare the value to a threshold.
  • the system may store a threshold value, which may be configurable (e.g., via user input to the app). For example, the threshold may be a 50% rollover likelihood.
  • if the value does not exceed the threshold, the system may continue monitoring, repeating steps 302 – 316. If the value does exceed the threshold, then the system may perform some action to warn of or otherwise prevent the rollover (e.g., by proceeding to step 318).
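The threshold comparison in the monitoring loop can be sketched as a single decision step; the function name and return values are illustrative:

```python
def monitor_step(rollover_prob, threshold=0.5):
    """One iteration of the monitoring loop: warn when the model's
    rollover probability exceeds the configurable threshold."""
    return "alert" if rollover_prob > threshold else "continue"
```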
  • the system outputs an alert based on the value.
  • the alert can be an aural or visual indicator onboard the all-terrain vehicle configured to warn a rider based on the alert.
  • the system issues an audiovisual alert consisting of a high-intensity red LED and a piezo buzzer.
  • the system may send an alert to the mobile phone application and/or remotely to a 911 or other rescue service.
  • the on-board system transmits a message to the server, and the server, based on the message, calls 911 or another emergency service and provides the coordinates of the ATV.
  • FIG. 3B illustrates a method 350 for determining whether a rollover has occurred.
  • rollover occurrence is detected based on actual angle values that are provided by the processing of the IMU data, which is a robust and accurate method.
  • the system determines whether a rollover has occurred based on ATV stability angle, speed, and riding condition.
  • rollover occurrences are determined by comparing the ATV's static stability angles to the ATV's roll and pitch angles in real time.
  • gyroscope, accelerometer, and magnetometer data are acquired, and filtered at 354.
  • the ATV's attitude (including roll and pitch) can, for example, be measured with an IMU as shown in FIG. 2.
  • a Madgwick filter can be implemented. This filter fuses gyroscope, accelerometer, and magnetometer measurements to calculate the vehicle's attitude.
  • the vehicle's attitude is initially estimated by the integration of gyroscope measures, which in the long term inherently yields drift. The long-term drift from the gyroscope integration is compensated by the accelerometer estimates of attitude.
  • the magnetometer measures are used to compensate for magnetic distortions from potential sources of interference around the sensor, such as electrical appliances (for instance, a GPS sensor), and metal structures (e.g., the ATV's frame).
  • the roll and pitch angles are compared to a longitudinal stability angle of the ATV.
  • the ATV static stability angles are the critical angles at which an ATV begins to roll (either sideways or forward/rearward).
  • Lateral (side) and longitudinal (rear) stability angles are important measures of the relative stability of an ATV and can be used to describe the rollover propensity of specific vehicles. These angles are generally determined through tilt table tests or calculated based on the center of gravity (C.G.) location, which can be determined by the lifting-axle method.
  • an internal counter is updated (e.g., either incremented at 358 or decremented at 360).
  • the counter is compared to respective thresholds at 362 and 364.
  • the threshold at 364 may be, e.g., 25 consecutive readings (about 5 s).
  • the system triggers the emergency system alerting first responders at 366.
  • a counter is implemented because detection based on a single reading may produce false positives.
  • the counter circumvents erroneous detections by evaluating a sequence of events instead of a single occurrence.
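The counter-based debouncing described above can be sketched as follows; the increment/decrement policy and the 25-reading threshold follow the description, while the function names and the decrement-to-zero floor are illustrative assumptions:

```python
def update_counter(counter, roll, pitch, roll_limit, pitch_limit):
    """Increment when the measured attitude exceeds a static stability angle,
    otherwise decrement (never below zero), so a single noisy reading
    cannot trigger the emergency system."""
    exceeded = abs(roll) > roll_limit or abs(pitch) > pitch_limit
    return counter + 1 if exceeded else max(counter - 1, 0)

def rollover_confirmed(counter, threshold=25):
    """Alert first responders only after a sustained sequence of exceedances
    (e.g., 25 consecutive readings, about 5 s at a 5 Hz update rate)."""
    return counter >= threshold
```

Evaluating a sequence of events rather than a single occurrence is what lets the counter suppress transient false positives from bumps or sensor noise.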
  • an emergency alert is sent to the rescue service (e.g., a third party service such as Noonlight ® ).
  • the emergency alert may be transmitted via the satellite network.
  • the alert contains location data and the vehicle owner’s phone number for verification attempts.
  • the location of the ATV is sent to first responders, family members, and/or supervisors (e.g., even when the operator is not able to do so).
  • the recipients of such alerts and/or information can be configured (e.g., by the operator).
  • users can opt into receiving a notification or contacting first responders in case of an accident, rollover, or potential rollover.
  • the system uses geofencing for controlling anti-robbery and/or autonomous shut-off systems.
  • a geofence is a virtual perimeter for a real-world geographic area. The limits of the geofence can be configured, e.g., by a user via the mobile application.
  • an alert is sent to the vehicle owner.
  • the vehicle is automatically shut off upon determination that the vehicle has left its geofence.
  • the autonomous shut-off system can be activated to prevent a robbery or a possible incident (high speed, turning sharp curve, etc.).
  • the anti-robbery system works based on the autonomous shut-off and geofencing systems.
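A circular geofence check of the kind described above can be sketched with a great-circle distance test; the haversine formula and the circular perimeter are illustrative assumptions (the patent does not specify the geofence geometry beyond a radius and center):

```python
import math

def in_geofence(lat, lon, center_lat, center_lon, radius_m):
    """True when the vehicle is inside the circular geofence, using the
    haversine great-circle distance to the configured center."""
    earth_radius_m = 6371000.0
    phi1, phi2 = math.radians(lat), math.radians(center_lat)
    dphi = math.radians(center_lat - lat)
    dlam = math.radians(center_lon - lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    distance_m = 2.0 * earth_radius_m * math.asin(math.sqrt(a))
    return distance_m <= radius_m
```

When this check returns False, the system could notify the owner and/or trigger the autonomous shut-off.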
  • the smartphone application can be used by the riders to trigger the emergency alert in case any ATV incident (hitting an object, rollover, operator ejection, etc.) occurs and the operator is conscious and able to reach and interact with their phone.
  • the application causes display of an interface element, such as a button, that the user can select to trigger the emergency alert.
  • a keyfob can be used by the riders to trigger the emergency alert.
  • the above characteristics are functional without cellular signals.
  • the techniques described above can use satellite networks to operate in off-road locations without cellular or WiFi signals, as commonly occurs when driving an ATV.
  • Additional advantages include a rollover detection system with an accuracy higher than 99%.
  • the rollover detection system described herein has a fast emergency notification time (less than 41 s) and crash localization with an accuracy of 2 m.
  • the system can identify roll and pitch angles with average errors of a fraction of a degree, and the system can detect the ATV’s speed with an average error of a fraction of a m/s.
  • the techniques described herein provide accurate and fast results.
  • Additional features may include allowing users to create rider credentials, customized geofencing, licensing and servicing, and danger alerts. Further, the system can identify when the ATV is being used appropriately and when the ATV is not being used appropriately, which can be used to improve operator behavior (e.g., identify high risk operators and understand why they are high risk). The system can automatically raise a panic alert in an emergency, even when the rider is unable to do so and when cellphone service and internet connection are unavailable. The system can decrease fatalities and the severity of injuries in ATV rollover crashes by reducing the response time.
  • FIG. 4 shows an example database schema 400 for managing ATV data, according to some embodiments.
  • a first data structure (e.g., data table, linked list, etc.) for users 402 stores information such as a user identifier (ID), first name, last name, email address, phone number, and current device. The user information may be stored as a set of strings.
  • a second data structure (e.g., data table, linked list, etc.) for devices 404 (e.g., the ATV) stores information such as a device ID, user ID, index, name, last location, whether geofenced, a geofence radius, a geofence center, an ATV model, and a model number.
  • the data stored to the second data structure for devices 404 can include string, integer, Boolean, double, and geopoint data formats, as shown in FIG. 4.
  • the second data structure for devices 404 is mapped to the first data structure for users 402 as well as a third data structure for ride history 406.
  • a third data structure (e.g., data table, linked list, etc.) for ride history 406 stores information such as an ID, a device ID, an index, whether rollover occurred, altitudes, velocities, coordinates, satellites, GPS timestamps, terrain points, and terrain timestamps.
  • the data stored to the third data structure for ride history 406 can include string, integer, Boolean, double, geopoint, timestamp, and terrain point data formats, as shown in FIG. 4.
  • a fourth data structure (e.g., data table, linked list, etc.) for terrain points 408 stores information such as x, y, and z coordinates and whether rollover occurred.
  • the x, y, and z coordinates are stored in double format, and whether rollover occurred is stored in Boolean format.
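The schema of FIG. 4 can be sketched as typed records; the class and field names below are illustrative simplifications of the four data structures (a subset of the listed fields), not the actual database definition:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

GeoPoint = Tuple[float, float]   # (latitude, longitude)

@dataclass
class User:
    user_id: str
    first_name: str
    last_name: str
    email: str
    phone: str
    current_device: str

@dataclass
class Device:
    device_id: str
    user_id: str                 # maps the device to its owner in the users table
    name: str
    last_location: GeoPoint
    geofenced: bool = False
    geofence_radius_m: float = 0.0
    geofence_center: Optional[GeoPoint] = None
    atv_model: str = ""

@dataclass
class TerrainPoint:
    x: float
    y: float
    z: float
    rollover: bool

@dataclass
class RideHistory:
    ride_id: str
    device_id: str               # maps the ride to the device that recorded it
    rollover: bool
    velocities: List[float] = field(default_factory=list)
    coordinates: List[GeoPoint] = field(default_factory=list)
    terrain_points: List[TerrainPoint] = field(default_factory=list)
```

The `user_id` and `device_id` fields capture the mappings between the data structures (device to user, ride history to device) described above.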
  • FIGS. 5A – 10 illustrate example user interfaces for ATV monitoring, according to some embodiments.
  • the system can include a mobile application that users can interact with to configure parameters used for rollover prediction, as well as to view current and/or past ATV parameters.
  • the system can gather and process ATV data using the techniques described above with respect to FIGS.
  • FIG. 5A is an example user interface 500 for ATV management, according to some embodiments.
  • the user interface 500 includes an emergency button 501 that can be used to dispatch an alarm to call for help.
  • the user interface 500 further includes interface elements to allow a user to navigate to various screens (e.g., device info, engine control, and location control screens, as depicted in FIG. 5A, as well as view ride data if available).
  • FIG. 5B is an example user interface 502 for geofencing, according to some embodiments.
  • the user interface 502 can be used to configure or view geofencing settings.
  • FIG. 6 is another example dashboard user interface 600 for ATV monitoring, according to some embodiments.
  • the dashboard user interface 600 shows information collected in relation to the ATV, such as the ATV make and model 602.
  • the dashboard user interface 600 further shows information about the last ride 604. As shown in FIG. 6, the information about the last ride 604 can include how many days ago the ride was, the location of the ride, the distance of the ride, and the length of time of the ride.
  • the dashboard user interface 600 further shows a graph 606 showing ride data over time.
  • FIG. 7 is another example user interface 700 for ATV monitoring, according to some embodiments.
  • the user interface 700 shows ride information for a particular session.
  • the information shown in the user interface 700 includes the session date 702, the ATV make and model 704, the time range of the ride 706, and the location of the ride 708.
  • the user interface 700 further displays the total time of the ride 710, the average speed during the ride 712, the number of miles covered in the ride 714, the number of rollovers during the ride 716, the top speed during the ride 718, and the number of stops during the ride 720.
  • the user interface 700 further displays a graph 722 showing velocity over time for the ride.
  • FIG. 8 is another example user interface 800 for ATV monitoring, according to some embodiments.
  • the user interface 800 shows a map 802.
  • FIGS. 9A and 9B are additional example user interfaces 900, 902 for ATV monitoring, according to some embodiments.
  • the user interface 900 depicted in FIG. 9A and the user interface 902 depicted in FIG. 9B can be used to allow a user to view or configure ATV parameters. As described above, such parameters can be used to perform tasks such as rollover prediction.
  • a user can view or modify information about the ATV, such as a name for the ATV 906, a device name 904, phone number 908, and so forth, as illustrated in FIG. 9A.
  • the user interface 900 can also be used to enable or disable settings such as a research mode 910.
  • the user interface 900 further includes elements for configuring a Madgwick sample rate 912, GPS sample rate 914, IMU connection 916, and IMU degrees of freedom 918, as shown in FIG. 9A.
  • a user can view or modify information about the ATV, such as a name for the ATV 920 and a vehicle model 922.
  • the user interface 902 can also be used to enable or disable geofencing via a slider 924.
  • the user interface 902 can also be used to establish a radius 926 and center 928 to be used for geofencing.
  • FIG. 10 is another example user interface 1000 for ATV monitoring, according to some embodiments.
  • the user interface 1000 can display ride history 1002 for one or more ATVs, including ride information on multiple dates, with the name of the ATV 1004, the date 1006, and the distance traveled 1008, as shown in this example.
  • Any of the interfaces can also include buttons at the bottom, as shown in FIG. 10, to navigate to different interfaces, such as a rides button 1010, a settings button 1012, and a home button 1014.
  • FIG. 11 illustrates an example computer system 1100, in which various embodiments may be implemented.
  • the system 1100 may be used to implement any of the computer systems and/or devices described above.
  • computer system 1100 includes a processing unit 1104 that communicates with a number of peripheral subsystems via a bus subsystem 1102. These peripheral subsystems may include a processing acceleration unit 1106, an I/O subsystem 1108, a storage subsystem 1118 and a communications subsystem 1124.
  • Storage subsystem 1118 includes tangible computer- readable storage media 1122 and a system memory 1110.
  • Bus subsystem 1102 provides a mechanism for letting the various components and subsystems of computer system 1100 communicate with each other as intended. Although bus subsystem 1102 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple buses. Bus subsystem 1102 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. For example, such architectures may include an Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, which can be implemented as a Mezzanine bus manufactured to the IEEE P1386.1 standard.
  • Processing unit 1104 which can be implemented as one or more integrated circuits (e.g., a conventional microprocessor or microcontroller), controls the operation of computer system 1100.
  • processors may be included in processing unit 1104. These processors may include single core or multicore processors.
  • processing unit 1104 may be implemented as one or more independent processing units 1132 and/or 1134 with single or multicore processors included in each processing unit. In other embodiments, processing unit 1104 may also be implemented as a quad-core processing unit formed by integrating two dual-core processors into a single chip.
  • processing unit 1104 can execute a variety of programs in response to program code and can maintain multiple concurrently executing programs or processes.
  • Computer system 1100 may additionally include a processing acceleration unit 1106, which can include a digital signal processor (DSP), a special-purpose processor, and/or the like.
  • I/O subsystem 1108 may include user interface input devices and user interface output devices.
  • User interface input devices may include a keyboard, pointing devices such as a mouse or trackball, a touchpad or touch screen incorporated into a display, a scroll wheel, a click wheel, a dial, a button, a switch, a keypad, audio input devices with voice command recognition systems, microphones, and other types of input devices.
  • User interface input devices may include, for example, motion sensing and/or gesture recognition devices such as the Microsoft Kinect® motion sensor that enables users to control and interact with an input device, such as the Microsoft Xbox® 360 game controller, through a natural user interface using gestures and spoken commands.
  • User interface input devices may also include eye gesture recognition devices such as the Google Glass® blink detector that detects eye activity (e.g., ‘blinking’ while taking pictures and/or making a menu selection) from users and transforms the eye gestures as input into an input device (e.g., Google Glass®). Additionally, user interface input devices may include voice recognition sensing devices that enable users to interact with voice recognition systems (e.g., Siri® navigator), through voice commands.
  • User interface input devices may also include, without limitation, three dimensional (3D) mice, joysticks or pointing sticks, gamepads and graphic tablets, and audio/visual devices such as speakers, digital cameras, digital camcorders, portable media players, webcams, image scanners, fingerprint scanners, barcode reader 3D scanners, 3D printers, laser rangefinders, and eye gaze tracking devices.
  • user interface input devices may include, for example, medical imaging input devices such as computed tomography, magnetic resonance imaging, positron emission tomography, and medical ultrasonography devices.
  • User interface input devices may also include, for example, audio input devices such as MIDI keyboards, digital musical instruments and the like.
  • User interface output devices may include a display subsystem, indicator lights, or non-visual displays such as audio output devices, etc.
  • the display subsystem may be a cathode ray tube (CRT), a flat-panel device, such as that using a liquid crystal display (LCD) or plasma display, a projection device, a touch screen, and the like.
  • output device is intended to include all possible types of devices and mechanisms for outputting information from computer system 1100 to a user or other computer.
  • user interface output devices may include, without limitation, a variety of display devices that visually convey text, graphics and audio/video information such as monitors, printers, speakers, headphones, automotive navigation systems, plotters, voice output devices, and modems.
  • Computer system 1100 may comprise a storage subsystem 1118 that comprises software elements, shown as being currently located within a system memory 1110.
  • System memory 1110 may store program instructions that are loadable and executable on processing unit 1104, as well as data generated during the execution of these programs.
  • system memory 1110 may be volatile (such as random-access memory (RAM)) and/or non-volatile (such as read-only memory (ROM), flash memory, etc.)
  • the RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated and executed by processing unit 1104.
  • system memory 1110 may include multiple different types of memory, such as static random-access memory (SRAM) or dynamic random-access memory (DRAM).
  • a basic input/output system containing the basic routines that help to transfer information between elements within computer system 1100, such as during start-up, may typically be stored in the ROM.
  • system memory 1110 also illustrates application programs 1112, which may include client applications, Web browsers, mid-tier applications, relational database management systems (RDBMS), etc., program data 1114, and an operating system 1116.
  • operating system 1116 may include various versions of Microsoft Windows®, Apple Macintosh®, and/or Linux operating systems, a variety of commercially available UNIX® or UNIX-like operating systems (including without limitation the variety of GNU/Linux operating systems, the Google Chrome® OS, and the like) and/or mobile operating systems such as iOS, Windows® Phone, Android® OS, BlackBerry® 10 OS, and Palm® OS operating systems.
  • Storage subsystem 1118 may also provide a tangible computer-readable storage medium for storing the basic programming and data constructs that provide the functionality of some embodiments.
  • Software programs, code modules, instructions that when executed by a processor provide the functionality described above may be stored in storage subsystem 1118.
  • Storage subsystem 1118 may also provide a repository for storing data used in accordance with the present disclosure.
  • Storage subsystem 1118 may also include a computer-readable storage media reader 1120 that can further be connected to computer-readable storage media 1122. Together and optionally, in combination with system memory 1110, computer-readable storage media 1122 may comprehensively represent remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information.
  • Computer-readable storage media 1122 containing code, or portions of code can also include any appropriate media known or used in the art, including storage media and communication media, such as but not limited to, volatile and non-volatile, removable and non- removable media implemented in any method or technology for storage and/or transmission of information.
  • This can include tangible computer-readable storage media such as RAM, ROM, electronically erasable programmable ROM (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disk (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other tangible computer readable media.
  • computer-readable storage media 1122 may include a hard disk drive that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a CD ROM, DVD, and Blu-Ray® disk, or other optical media.
  • Computer-readable storage media 1122 may include, but is not limited to, Zip® drives, flash memory cards, universal serial bus (USB) flash drives, secure digital (SD) cards, DVD disks, digital video tape, and the like.
  • Computer- readable storage media 1122 may also include, solid-state drives (SSD) based on non-volatile memory such as flash-memory based SSDs, enterprise flash drives, solid state ROM, and the like, SSDs based on volatile memory such as solid state RAM, dynamic RAM, static RAM, DRAM-based SSDs, magneto- resistive RAM (MRAM) SSDs, and hybrid SSDs that use a combination of DRAM and flash memory based SSDs.
  • Communications subsystem 1124 provides an interface to other computer systems and networks. Communications subsystem 1124 serves as an interface for receiving data from and transmitting data to other systems from computer system 1100. For example, communications subsystem 1124 may enable computer system 1100 to connect to one or more devices via the Internet.
  • communications subsystem 1124 can include radio frequency (RF) transceiver components for accessing wireless voice and/or data networks (e.g., using cellular telephone technology; advanced data network technology, such as 3G, 4G, or EDGE (enhanced data rates for global evolution); WiFi (IEEE 802.11 family standards); or other mobile communication technologies, or any combination thereof), global positioning system (GPS) receiver components, and/or other components.
  • communications subsystem 1124 can provide wired network connectivity (e.g., Ethernet) in addition to or instead of a wireless interface.
  • communications subsystem 1124 may also receive input communication in the form of structured and/or unstructured data feeds 1126, event streams 1128, event updates 1130, and the like on behalf of one or more users who may use computer system 1100.
  • communications subsystem 1124 may be configured to receive data feeds 1126 in real-time from users of social networks and/or other communication services such as Twitter® feeds, Facebook® updates, web feeds such as Rich Site Summary (RSS) feeds, and/or real-time updates from one or more third party information sources.
  • communications subsystem 1124 may also be configured to receive data in the form of continuous data streams, which may include event streams 1128 of real- time events and/or event updates 1130, that may be continuous or unbounded in nature with no explicit end.
  • applications that generate continuous data may include, for example, sensor data applications, financial tickers, network performance measuring tools (e.g., network monitoring and traffic management applications), clickstream analysis tools, automobile traffic monitoring, and the like.
  • Communications subsystem 1124 may also be configured to output the structured and/or unstructured data feeds 1126, event streams 1128, event updates 1130, and the like to one or more databases that may be in communication with one or more streaming data source computers coupled to computer system 1100.
  • Computer system 1100 can be one of various types, including a handheld portable device (e.g., an iPhone® cellular phone, an iPad® computing tablet, a PDA), a wearable device (e.g., a Google Glass® head mounted display), a PC, a workstation, a mainframe, a kiosk, a server rack, or any other data processing system.
  • the computing system for rollover prediction may have one or more microprocessors/processing devices that can further be a component of the overall apparatuses.
  • the control systems are generally proximate to their respective devices, in electronic communication (wired or wireless) and can also include a display interface and/or operational controls configured to be handled by a user to monitor the respective systems, to change configurations of the respective systems, and to operate, directly guide, or set programmed instructions for the respective systems, and sub-portions thereof.
  • Such processing devices can be communicatively coupled to a non-volatile memory device via a bus.
  • the non-volatile memory device may include any type of memory device that retains stored information when powered off.
  • Non-limiting examples of the memory device include electrically erasable programmable read-only memory (“EEPROM”), flash memory, or any other type of non-volatile memory.
  • at least some of the memory device can include a non-transitory medium or memory device from which the processing device can read instructions.
  • a non-transitory computer-readable medium can include electronic, optical, magnetic, or other storage devices capable of providing the processing device with computer-readable instructions or other program code.
  • Non-limiting examples of a non-transitory computer-readable medium include (but are not limited to) magnetic disk(s), memory chip(s), ROM, random-access memory (“RAM”), an ASIC, a configured processor, optical storage, and/or any other medium from which a computer processor can read instructions.
  • the instructions may include processor-specific instructions generated by a compiler and/or an interpreter from code written in any suitable computer-programming language, including, for example, C, C++, C#, Java, Python, Perl, JavaScript, etc.
  • Such configuration can be accomplished, e.g., by designing electronic circuits to perform the operation, by programming programmable electronic circuits (such as microprocessors) to perform the operation, or any combination thereof.
  • Processes can communicate using a variety of techniques including but not limited to conventional techniques for inter-process communication, and different pairs of processes may use different techniques, or the same pair of processes may use different techniques at different times.
  • the specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that additions, subtractions, deletions, and other modifications and changes may be made thereunto without departing from the broader spirit and scope as set forth in the claims.
  • Disjunctive language such as the phrase “at least one of X, Y, or Z,” unless specifically stated otherwise, is intended to be understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Navigation (AREA)

Abstract

Provided herein are systems and methods to predict rollover of an all-terrain vehicle. The system uses an inertial measurement unit (IMU) and a local processor to gather data from the all-terrain vehicle. Dimensions of the all-terrain vehicle, IMU data, the roll angle, the pitch angle, and the turning radius of the all-terrain vehicle are obtained. The data is input to a neural network executing locally. The neural network predicts a value indicating whether the all-terrain vehicle is likely to roll over. Based on the value, the system may output an alert.

Description

PATENT Attorney Docket No. 070772-1447919-232120PC Client Ref. No. UC 2022-536-3

ROLLOVER PREDICTION AND ALERT FOR ALL-TERRAIN VEHICLE

CROSS-REFERENCE TO RELATED APPLICATIONS

[0001] This application claims priority to US Provisional Application No. 63/519,928, filed on August 16, 2023, the contents of which are incorporated by reference herein in their entirety.

STATEMENT AS TO RIGHTS TO INVENTIONS MADE UNDER FEDERALLY SPONSORED RESEARCH AND DEVELOPMENT

[0002] This invention was made with government support under NIOSH grant #U54OH007550 awarded by The Western Center for Agricultural Health and Safety / National Institute for Occupational Safety and Health (NIOSH). The government has certain rights in the invention.

BACKGROUND

[0003] All-terrain vehicles (ATV) are commonly used in agricultural operations to apply fertilizer and chemicals, inspect livestock and crops, supervise workers, transport personnel, mow grass, round up livestock, and carry and tow implements, as well as for recreational purposes. However, ATVs are unstable vehicles due to the narrow wheelbase and track width and high center of gravity, which increase the chance of accidents. Many of the crashes occur in rural areas with no cellular service. The off-road crashes are especially problematic because of the difficulty for Emergency Medical Service (EMS) and other first responders to locate the crash site, obtain access to the crash location, which may be a trail located in heavy woods, render aid to the injured, and transport the injured to the nearest hospital.

[0004] Crash detection and prevention systems exist, primarily for vehicles such as cars and trucks. However, due to the unique shape and usage of ATVs, existing systems cannot accurately predict and prevent accidents such as rollover of ATVs. For instance, crash detection and notification systems are limited because they cannot warn the driver/rider ahead of time; thus, drivers cannot prevent the crash.
[0005] Collision prediction systems also exist, but are mainly used for automobiles, while rollover prediction systems are more popular among farm machinery such as tractors. Regardless of the crash type, conventional crash prediction systems often consist of expensive sensors such as radar, light detection and ranging (LIDAR), camera, and Global Positioning System (GPS). An alternative budget solution consists of using collision prediction models. Collision prediction models take the state of multiple vehicles as input, i.e., there must be a continuous information exchange, which makes this approach vulnerable to communication errors.

[0006] Similarly, there exist rollover prediction models, which are primarily based on the vehicle’s center of gravity (CG), and terrain slope angles (roll and pitch). Although collision and rollover prediction algorithms have brought considerable improvements in crash avoidance, they have several limitations. For instance, none of those models were designed for ATVs, which have significantly different dynamics from automobiles and tractors. For example, the simple addition of a passenger completely changes the vehicle’s stability. Furthermore, ATV crashes occur in diverse and unique scenarios, comprising different terrain slopes and surfaces. Moreover, collision models take the state of multiple vehicles as input, i.e., there must be a continuous information exchange, which makes this approach vulnerable to communication errors. Further, traditional rollover models rely on several assumptions that are unlikely to hold true in real crash scenarios, such as rider behavior, riding on curves, and the complexity of suspension systems. The techniques described herein overcome these and other problems.

BRIEF SUMMARY

[0007] Systems and methods monitor vehicles such as ATVs for issues such as likely rollover.
In some embodiments, the system includes onboard circuitry and/or a mobile application that can collect data to analyze factors that lead to accidents, allow riders to visualize ride histories and statistics, and help first responders rescue injured riders who have suffered rollover accidents. [0008] In some embodiments, an all-terrain vehicle rollover prediction apparatus comprises an inertial measurement unit (IMU) configured to be fastened to an all-terrain vehicle; at least one local processor physically wired to the IMU; a memory connected with the at least one local processor, the memory having instructions executable by the at least one processor for: looking up, from a data table, dimensions of the all-terrain vehicle; receiving IMU data from the IMU; estimating a roll angle and a pitch angle from the IMU data; determining a yaw rate from the IMU data; calculating a turning radius from the yaw rate; inputting, to a neural network model executing on the at least one local processor, input parameters based on a speed of the all-terrain vehicle, the dimensions, the roll angle, the pitch angle, and the turning radius; reading, from the neural network model, a value predicting whether the all-terrain vehicle will roll over; and outputting an alert based on the value. [0009] In some aspects, the at least one local processor and the neural network model are optimized to determine the value at a frequency faster than 10 Hz. In some aspects, the neural network model has exactly two hidden layers. In some aspects, each hidden layer of the neural network model has 32 neurons. [0010] In some aspects, the memory has further instructions executable by the at least one processor for: calculating a difference between a respective average and each of the speed, the dimensions, the roll angle, the pitch angle, and the turning radius and dividing each difference by a respective scale factor before inputting to the neural network model. 
[0011] In some aspects, the average is a mean or a median, and the scale factor is a standard deviation or a variance, of the respective speed, dimensions, roll angle, pitch angle, and turning radius as determined from training data with which the neural network model was trained. [0012] In some aspects, the neural network model was trained using a categorical cross entropy loss function. In some aspects, the memory has further instructions executable by the at least one processor for receiving a user selection of a make or model of the all-terrain vehicle, wherein the looking up is based on the user selection. [0013] In some aspects, the memory has further instructions executable by the at least one processor for receiving user entries of the dimensions, wherein the looking up is based on the user entries. In some aspects, the looked up dimensions of the all-terrain vehicle include one or more of a weight, a width, a length, a wheelbase, and a seat height. [0014] In some aspects, the memory has further instructions executable by the at least one processor for inputting, to the neural network model, a presence of a rider or a non-presence of a rider for autonomous driving. In some aspects, the value is a scalar representing a probability of roll over. [0015] In some aspects, the apparatus further includes an aural or visual indicator onboard the all-terrain vehicle configured to warn a rider based on the alert. In some aspects, the apparatus further includes a transmitter configured to send a wireless message based on the alert. [0016] In some aspects, the apparatus further includes a global positioning system (GPS) receiver, wherein the memory has further instructions executable by the at least one processor for: obtaining data from the GPS receiver and data from the IMU and combining the data from the GPS receiver and the data from the IMU to estimate the speed of the all-terrain vehicle. 
[0017] In some aspects, the at least one local processor is rigidly connected with the IMU. In some aspects, the apparatus further includes a housing enclosing the IMU, the at least one local processor, and the memory.

[0018] In some aspects, the at least one local processor is rigidly connected with the IMU. In some aspects, the apparatus further includes a housing enclosing the IMU, the at least one local processor, and the memory.

[0019] In various embodiments, methods are provided for executing all or part of the operations disclosed herein.

[0020] In various embodiments, one or more non-transitory computer-readable media are provided for storing instructions which, when executed by one or more processors, cause a system to perform part or all of the operations and/or methods disclosed herein.

[0021] A further understanding of the nature and the advantages of the inventions disclosed herein may be realized by reference of the remaining portions of the specification and the attached drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

[0022] In order to more fully understand the present invention, reference is made to the accompanying drawings. Understanding that these drawings are not to be considered limitations in the scope of the invention, the presently described embodiments and the presently understood best mode of the invention are described with additional detail through use of the accompanying drawings.

[0023] FIG. 1 is a simplified block diagram of an example of a system for predicting ATV rollover according to some embodiments.

[0024] FIG. 2 is a simplified block diagram of an example of components of the system for predicting ATV rollover of FIG. 1, according to some embodiments.

[0025] FIG. 3A is a simplified flowchart of a method for predicting ATV rollover, according to some embodiments.

[0026] FIG. 3B is a simplified flowchart of a method for detecting ATV rollover, according to some embodiments.

[0027] FIG.
4 is an example database schema for managing ATV data, according to some embodiments.

[0028] FIG. 5A is an example user interface for ATV monitoring, according to some embodiments.

[0029] FIG. 5B is another example user interface for ATV monitoring, according to some embodiments.

[0030] FIG. 6 is an example dashboard user interface for ATV monitoring, according to some embodiments.

[0031] FIG. 7 is another example user interface for ATV monitoring, according to some embodiments.

[0032] FIG. 8 is another example user interface for ATV monitoring, according to some embodiments.

[0033] FIG. 9A is another example user interface for ATV monitoring, according to some embodiments.

[0034] FIG. 9B is another example user interface for ATV monitoring, according to some embodiments.

[0035] FIG. 10 is another example user interface for ATV monitoring, according to some embodiments.

[0036] FIG. 11 is a block diagram illustrating an example computer system, according to at least one embodiment.

DETAILED DESCRIPTION

[0037] As noted above, existing crash prediction and reaction systems have shortcomings including requiring complex hardware and being unsuitable for ATV applications. The systems and methods of the present disclosure overcome these issues and provide accurate, low-cost rollover prediction tailored to ATVs.

[0038] In some embodiments, an AI-based incident prediction and notification system includes a deep neural network trained to calculate the likelihood of a rollover event based on the ride’s parameters (e.g., vehicle’s speed, roll, pitch, and turning radius) and ATV parameters (e.g., weight, height, width, wheelbase, and seat height). Advantageously, as compared to prior systems, the systems and methods of the present disclosure provide simplicity in terms of inputs and ease of implementation for real-time applications (e.g., a computing time smaller than 1 millisecond (ms)).
In some aspects, the system takes as input ATV dimensions that are easy to determine or readily available on the internet and ride parameters that are measured by its embedded system in real-time. On the other hand, alternative systems require information either about the vehicle’s center of gravity or pre-determined stability angles, which are variables that require laborious field tests to be obtained. [0039] FIG. 1 is a simplified block diagram of an example of a system 100 for predicting ATV rollover according to some embodiments. The system 100 may include an all-terrain vehicle rollover prediction apparatus including an all-terrain vehicle (ATV) 102, a server 110 coupled to a database 112, and a mobile application 104 executing on a user device such as a smartphone. The ATV 102, the server 110, and the mobile application 104 can further be coupled to a satellite network 106 and a rescue service 108. Various communication protocols 114 can be implemented for communication between the components of the system 100. [0040] The ATV 102 may be coupled to, or include, components for collecting and processing data. The collected data can include ride parameters such as velocity, 3-axis acceleration and rotation, and device location. The system 100 may include an inertial measurement unit (IMU) configured to be fastened to the ATV 102, at least one local processor physically wired to the IMU, and a memory connected with the at least one local processor, the memory having instructions executable by the at least one processor for performing the methods described herein. Additional components of the system 100 are described in further detail below with respect to FIG. 2. [0041] In some embodiments, the server computer 110 includes functionality to receive, transmit, and analyze data via the ATV 102, the mobile application 104, the satellite network 106, and/or the rescue service 108.
The server computer 110 may include a processor coupled to a memory, a network interface, and a computer-readable medium, as further described below with respect to FIG. 11. [0042] In some embodiments, the server computer 110 includes or is communicatively coupled to a database 112. In some implementations, the database 112 is a remote (e.g., cloud) database. The server computer 110 may receive data from the ATV 102, satellite network 106, mobile application 104, and/or rescue service 108, and record some or all of that data, or derivatives thereof, to the database 112. The server computer 110 may further transmit some or all of the data to the ATV 102, satellite network 106, mobile application 104, and/or rescue service 108. For example, data is transmitted to the mobile application 104 and displayed via interfaces such as those illustrated in FIGS. 6 – 10. As another example, data is transmitted to the rescue service 108 indicative of a rollover situation, which can trigger dispatch of a rescue crew. [0043] The mobile application 104 can include functionality for displaying information collected by the system 100 or derivatives thereof. For example, the mobile application visualizes collected parameters. Alternatively, or additionally, the mobile application stores ride history and parameters. In some implementations, some, or all, of this data is stored to the database 112. [0044] In some embodiments, the mobile application 104 includes functionality to allow users (e.g., ATV owners) to add, manage, monitor, and track riding parameters and relevant statistics for their ATVs. The application can read the riding parameters from the database and will display the information in an analytical form. In some implementations, Bluetooth Low Energy (BLE) is used to transfer information directly from the processing device shown in FIG. 2 to the application so a user can access riding parameters in real time. 
[0045] The satellite network 106 can be used to transmit emergency messages off-board. Common methods for remote data transfer such as WiFi and cellular networks may be unavailable in locations where ATVs are used (e.g., remote or rural locations). Additionally, or alternatively, the satellite network 106 is used for GPS location determining functions. In some examples, the satellite network 106 is the Iridium Satellite Network and the system includes a Rock7 RockBLOCK Iridium Satellite Modem to communicate with the satellite network 106. [0046] In some embodiments, the mobile application 104 and/or other components of the system 100 sends emergency alerts to one or more rescue services 108. In some aspects, satellites 106 are used, so that such alerts can be transmitted without the need for cellular or WiFi connections, which is useful in remote areas where ATVs are commonly used. In some examples, the devices are in communication with one or more third-party rescue services 108, such as Noonlight®, which is used in conjunction with the server 110 to contact first responders. [0047] FIG. 2 is a simplified block diagram of an example of components of the system 200 for predicting ATV rollover of FIG. 1, according to some embodiments. The components shown in FIG. 2 may be installed in the ATV 102. As shown in FIG. 2, in some embodiments, the system components 200 include a processing device 202, an inertial measurement unit (IMU) 204, a modem 206, a relay 208, an antenna 210, a GPS receiver 211, an ATV engine 212, one or more batteries, such as a 12 volt (V) battery 218 and a portable battery 214, and a converter 216. In some embodiments, some or all of the components shown in FIG. 2 (e.g., the processing device 202, the IMU 204, and/or other components) are enclosed in a housing. For example, the electronic components of the embedded system are placed inside a custom-manufactured enclosure, resistant to vibration and dust. 
Such an enclosure can be installed on the ATV, e.g., installed on the rear rack of the ATV. [0048] In some embodiments, the processing device 202 includes at least one processor. In the example depicted in FIG. 2, the processing device 202 is a Raspberry Pi® (e.g., a Raspberry Pi® 4) or other microcontroller. The processing device 202 is a data computation device or devices, operable to carry out instructions. The processing device 202 may include a CPU that includes at least one high-speed data processor adequate to execute program components for executing user and/or system-generated requests. The CPU may be a microprocessor such as Raspberry Pi® 4; AMD®’s Athlon, Duron, and/or Operon; IBM® and/or Motorola®’s Power PC®; IBM®’s and Sony®’s Cell processor; Intel®’s Celeron®, Itanium®, Pentium®, Xeon®, and/or XScale®; and/or the like processor(s). The processing device 202 may be local to, e.g., installed on, the ATV 102 shown in FIG. 1. [0049] The processing device 202 may include, or be coupled to, a memory. The memory may be any magnetic, electronic, or optical memory. The memory may include any number of memory modules. An example of memory is dynamic random access memory (DRAM). The memory can include a computer-readable medium, which may include software modules comprising programming for performing the methods described herein. [0050] In some embodiments, the inertial measurement unit (IMU) 204 includes electronics configured for measuring and reporting the acceleration, angular rate, and/or other parameters of the ATV. The IMU 204 may include one or more accelerometers, gyroscopes, and/or magnetometers. In some examples, the IMU 204 is fastened to the all-terrain vehicle 102 depicted in FIG. 1 and is coupled (e.g., physically wired and/or rigidly connected) to the processing device 202. In some aspects, the IMU 204 logs accelerometer and gyroscope data locally. This data may be collected at a sample rate (e.g., a user-determined sample rate). 
The data collected by the IMU 204 is provided to the processing device 202 for analysis (e.g., to determine if a rollover has occurred or is likely to occur, as described herein). [0051] In some embodiments, the inertial measurement unit (IMU) 204 is an IMU such as the Adafruit® LSM9DS1. In some examples, the IMU includes an accelerometer, which measures acceleration on three axes. The accelerometer is capable of capturing data used in determining the vehicle’s pitch, roll, and yaw. Alternatively or additionally, the IMU includes a gyroscope, which measures rotation velocity. Alternatively or additionally, the IMU includes a magnetometer, which measures magnetic fields. [0052] In some embodiments, the antenna 210 includes electronics configured for receiving and amplifying radio signals from satellites and converting the radio signals to electronic signals. The antenna 210 may, for example, be a GPS or GNSS antenna, as depicted in FIG. 2. [0053] In some embodiments, the global positioning system (GPS) receiver 211 includes electronics configured for receiving electronic signals from the antenna 210 and analyzing the electronic signals to determine information such as position, velocity, and precise time based on data received from GPS satellites. The GPS receiver 211 and/or processing device 202 may be configured to log location, altitude, and velocity data locally at a predetermined (e.g., user-configured) sample rate. The GPS receiver 211 and/or processing device 202 may further be configured to determine whether the vehicle is within a geofence (if one is set). The GPS receiver 211 and/or processing device 202 may further be configured to filter results based on a configured signal strength parameter. In some examples, the GPS receiver 211 is an Adafruit® Ultimate GPS Breakout v3 or similar. The GPS receiver 211 can be used to detect rollover crashes, log position, velocity, and altitude.
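The geofence check mentioned above can be sketched with a standard haversine great-circle distance against a circular fence. This is only an illustrative implementation under assumed conventions (a circular fence defined by a center fix and a radius in meters; function names are hypothetical), not the patent's actual code.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes (haversine formula)."""
    r = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def inside_geofence(fix, center, radius_m):
    """True if the (lat, lon) fix lies within radius_m of the geofence center."""
    return haversine_m(fix[0], fix[1], center[0], center[1]) <= radius_m
```

A fix just outside the configured radius would then trigger whatever out-of-bounds handling the system defines.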
In some examples, the outputs of the GPS receiver 211 include latitude, longitude, altitude, velocity, and number of connected satellites. Some or all of the data from the GPS receiver can be logged and/or sent to a remote database. [0054] In some embodiments, the system 200 includes a transmitter configured to send wireless messages. For example, as shown in FIG. 2, the system includes a modem 206, such as a Rock7 modem or other Iridium satellite modems. The transmitter may, for example, send messages requesting help based on detecting rollover is likely or has occurred. [0055] In some embodiments, the system 200 includes one or more batteries, such as the 12 V battery 218 (e.g., the ATV battery) and/or the portable battery 214. In some examples, the portable battery 214 is an Anker PowerCore II 20000 connected to the processing device 202 via a USB-C connection. This portable battery 214 can supply enough power to the processing device 202 for several hours in the event the ATV battery is out of charge. [0056] In some embodiments, the system 200 includes an ATV engine 212 coupled to a relay 208. The relay 208 is an electrically operated switch. The relay 208 may include a set of input terminals for one or more control signals, and a set of operating contact terminals. The relay is connected to the ATV engine 212 and the 12 V battery 218 and/or portable battery 214. The relay 208 can disengage the ATV engine 212 and batteries 218 and/or 214 to perform automatic or manual device shutoff as described herein. [0057] In some embodiments, the system 200 includes a converter 216. The converter 216 may be a DC-DC converter that converts 12 V power from the ATV battery to a suitable voltage level for providing to the processing device 202. [0058] FIG. 3A is a simplified flowchart of a method 300 for predicting ATV rollover, according to some embodiments. The method presented in FIG. 3A and described below is intended to be illustrative and non-limiting.
It is appreciated that the processing steps may be performed in an order different from that depicted in FIG. 3A and that not all the steps depicted in FIG. 3A need be performed. In certain implementations, the method 300 may be implemented by a computer system, such as the system 100 shown in FIG. 1 and FIG. 2. [0059] At step 302, an inertial measurement unit (IMU) fastened to an all-terrain vehicle and at least one processor physically wired to the IMU are provided. The system can further include a global positioning system (GPS) receiver. In some aspects, the at least one local processor is rigidly connected with the IMU. In some aspects, the system further includes a transmitter configured to send a wireless message (e.g., based on an alert as described herein). In some aspects, the system further includes a housing enclosing the IMU, the at least one local processor, and the memory. As described above with respect to FIGS. 1 and 2, the system can include such components and others. [0060] In some embodiments, a registration process is performed using the mobile application. The registration may, for example, create a unique identifier (ID) based on serial and/or model numbers of the ATV. [0061] In some embodiments, the system begins collecting data when it is powered on. In some aspects, the data is sent to the database. Alternatively, or additionally, the data is loaded into the mobile application. In some implementations, the system begins recording data when the system (e.g., the processing device shown in FIG. 2 and/or the ATV itself) is powered on, and recording ends when the system is turned off. The recording may include storing data locally on the processing device and/or transmitting the data for storage remotely (e.g., on a cloud database by transmitting the data to the server via satellite or WiFi, when available). In some aspects, a parallel thread performs database interactions. 
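The local-logging-plus-parallel-upload pattern just described (record locally, with a separate thread handling database interactions at a configurable interval) can be sketched as follows. All names here are hypothetical, and the transmission function stands in for whatever satellite/WiFi upload the system actually uses.

```python
import queue
import threading

def transmit_loop(sample_queue, send_fn, interval_s, stop_event):
    """Parallel upload thread: drain locally logged samples and attempt
    one database transmission per interval, re-queueing on failure."""
    while not stop_event.is_set():
        batch = []
        while True:
            try:
                batch.append(sample_queue.get_nowait())
            except queue.Empty:
                break
        if batch:
            try:
                send_fn(batch)           # e.g., POST the batch to the remote database
            except OSError:              # no connectivity: keep samples for a later attempt
                for sample in batch:
                    sample_queue.put(sample)
        stop_event.wait(interval_s)      # sleep, but wake immediately on shutdown
```

The sensor-reading thread would push samples into `sample_queue`, keeping data collection decoupled from intermittent connectivity.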
In some examples, the system attempts data transmission at a suitable interval, such as once every 10 seconds. [0062] At step 304, the system looks up, from a data table, dimensions of the all-terrain vehicle. The data table can include various information about the ATV. An example data structure is illustrated in FIG. 4. The dimensions can include one or more of a weight, a width, a length, a wheelbase, a seat height, and so forth. [0063] In some aspects, the system receives a user selection of vehicle parameters, environmental parameters, or the like, at some initial time. The looking up is based on the user selection. For example, the system receives a make or model of the all-terrain vehicle (e.g., via the mobile application), the dimensions, and so forth. In some cases, the mobile application receives, via user input, data such as the make and model of the ATV. The database includes parameters of various different ATVs such as weight, chassis, wheelbase, etc. The system may query the database, based on the type of ATV, to retrieve such ATV parameters. If the ATV is not specifically in the dataset, the system can prompt the user, via the mobile app, to input that data (e.g., “please measure the wheelbase”). As another example, a user can interact with the mobile application to indicate whether a rider is present or the ATV is autonomously driven. [0064] At step 306, the system receives IMU data from the IMU. The IMU data can include, for example, acceleration, angular rate, and/or the like. The IMU data can be collected by the processing device, as shown in FIG. 2. [0065] At step 308, the system estimates a roll angle and a pitch angle from the IMU data. The system may use the orientation information identified by the IMU to compute the roll and pitch angles. For example, the roll angle and a pitch angle can be calculated using accelerometer data. Suitable computations are described in M. 
Wrona, “Roll and Pitch Angles From Accelerometer Sensors,” https://mwrona.com/posts/accel-roll-pitch/ (2020) and Y. Tawil, “Towards understanding IMU: Basics of Accelerometer and Gyroscope Sensors and How to Compute Pitch, Roll and Yaw Angles,” https://atadiat.com/en/e-towards-understanding-imu-basics-of-accelerometer-and-gyroscope-sensors/ (2023), which are incorporated by reference. [0066] In some aspects, the system measures the ATV's attitude (including roll and pitch) with the IMU. To improve the accuracy and robustness of the vehicle's attitude estimate, a Madgwick filter is implemented. This filter fuses gyroscope, accelerometer, and magnetometer measurements to calculate the vehicle's attitude. The vehicle's attitude is initially estimated by the integration of gyroscope measures, which in the long term inherently yields drift. The long-term drift from the gyroscope integration is compensated by the accelerometer estimates of attitude. The magnetometer measures are used to compensate for magnetic distortions from potential sources of interference around the sensor, such as electrical appliances (for instance, a GPS sensor), and metal structures (e.g., the ATV's frame). [0067] At step 310, the system determines a yaw rate from the IMU data. The system may use the ATV’s magnetometer readings, mass, speed, and/or orientation to compute the yaw rate. For example, the yaw rate can be calculated based on gyroscope data as described in Tawil, supra. [0068] At step 312, the system calculates a turning radius from the yaw rate. The system may use the ATV’s velocity, identified by the IMU, and the yaw rate, computed at step 310, to compute the turning radius. For example, the turning radius is given by: turning radius = velocity / yaw rate, where the turning radius is the ATV’s turning radius in m, velocity is the ATV’s speed in m s-1, and yaw rate is the ATV’s yaw rate in radians s-1. In some aspects, the system converts the vehicle’s speed from km/h to meters/s.
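The roll/pitch estimation (step 308) and the turning-radius formula (step 312) can be sketched as below. The accelerometer-only gravity-vector formulation follows the cited references; the function names and the configurable straight-line fallback radius are illustrative assumptions, not the patent's exact implementation.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Estimate roll and pitch (radians) from a static 3-axis
    accelerometer reading via the gravity vector."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay ** 2 + az ** 2))
    return roll, pitch

def turning_radius_m(speed_kmh, yaw_rate_rad_s, no_yaw_radius=50.0):
    """turning radius [m] = speed [m/s] / yaw rate [rad/s].
    Speed is converted from km/h; when there is essentially no yaw
    (straight-line driving), fall back to a configured large radius."""
    speed_ms = speed_kmh / 3.6
    if abs(yaw_rate_rad_s) < 1e-6:
        return no_yaw_radius
    return abs(speed_ms / yaw_rate_rad_s)
```

For example, 36 km/h (10 m/s) with a 0.5 rad/s yaw rate gives a 20 m turning radius.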
With the yaw rate in radians, this provides a turning radius in meters. For cases with no yaw rate (i.e., basically driving on a straight line), the turning radius can be set to a suitable value, such as the maximum value used to train the neural network (e.g., 50 m). [0069] The system may identify and/or calculate various additional parameters. For example, in some embodiments, the system estimates a speed based on data obtained from the GPS receiver and/or data obtained from the IMU. In some implementations, the system uses data from both the GPS receiver and the IMU to estimate the speed of the all-terrain vehicle. For example, the system averages speed data from the two sources. In some aspects, the system fuses GPS and IMU data through an Unscented Kalman Filter (UKF). The UKF is a non-linear version of a Kalman Filter, which is an algorithm that combines multiple sensor information to estimate the state of a system, such as the vehicle's position. The fundamental advantage of the UKF over the Kalman Filter is that it works for non-linear systems, which is the case for most robots and sensors. This enables the system to estimate the vehicle's location, speed, attitude, etc. at a much faster rate (e.g., IMU frequency = 800 Hz) than would otherwise be possible (e.g., GPS frequency ~ 1 Hz, which would result in predictions taking ~ 1-2 s). [0070] In some aspects, the system measures and/or computes acceleration in three dimensions. Due to the environment of some ATV off-road crashes (e.g., dense woods), GPS signals might be unavailable or yield inaccurate measures. To address this issue, IMU data (which allows continuous position tracking) is fused with GPS data through an Unscented Kalman Filter. [0071] In some embodiments, the system further processes the input parameters. For example, the system performs a Z-transformation on the input parameters to avoid overfitting. 
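The Z-transformation of the input parameters mentioned above can be sketched as follows, where the per-feature means and scale factors (e.g., standard deviations) come from the training data; the function and variable names are illustrative:

```python
def standardize(params, train_means, train_scales):
    """Z-transform each input parameter: subtract the training-set
    mean and divide by the training-set scale factor (e.g., standard
    deviation) for that feature.
    """
    return [(x - m) / s
            for x, m, s in zip(params, train_means, train_scales)]
```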
In some examples, the system identifies average values for the speed, the dimensions, the roll angle, the pitch angle, and/or the turning radius. The average may, for example, be a mean or a median of previously computed values from the ATV and/or other ATVs historically. The system calculates a difference between a respective average and each of the speed, the dimensions, the roll angle, the pitch angle, and the turning radius. In some examples, the system divides each difference by a respective scale factor before inputting the parameters to the neural network. For example, the scale factor is a standard deviation or a variance of the respective speed, dimensions, roll angle, pitch angle, and turning radius as determined from training data with which the neural network model was trained (as further described below with respect to step 314). This helps improve prediction accuracy. [0072] At step 314, the system inputs, to a neural network model executing on the at least one local processor, input parameters based on a speed of the all-terrain vehicle, the dimensions, the roll angle, the pitch angle, and the turning radius. Various other parameters can be input to the neural network, such as a presence of a rider or a non-presence of a rider (e.g., for autonomous driving). Table 1 shows an example set of input parameters for the model.
Input Variable | Unit | Description
ATV weight | kg | ATV net weight (no rider and fuel tank empty)
Width | mm | Measured from left to right across the widest part of the ATV, including tires but not including any side mirrors
Length | mm | Measured from front to rear across the longest part of the ATV
Wheelbase | mm | Measured between the center of the front and rear wheel hubs on the same side
Seat height | mm | Measured from the ground up to the ATV seat center
Speed | km/h | ATV’s last recorded speed before the incident
ATV pitch angle | degrees | ATV’s last recorded pitch before the incident
ATV roll angle | degrees | ATV’s last recorded roll before the incident
Turning radius | m | Calculated as the tangential speed (in m/s) divided by the yaw rate (in radians/s)
Presence of a rider | binary score (0 if none, or 1 otherwise) | Part of the data used to train the classification algorithm consisted of autonomous ATVs without any riders
Table 1 – Example Model Input Variables [0073] In some embodiments, the neural network model is a deep neural network comprising an input layer, an output layer, and one or more hidden layers. In some aspects, the neural network model has exactly two hidden layers, which has been found to provide accurate predictions. In some aspects, each hidden layer of the neural network has 32 neurons. It should be understood that other numbers of layers and/or numbers of neurons can be implemented. [0074] In some embodiments, a classification model based on machine learning algorithms is implemented in the embedded system. This algorithm calculates the likelihood of a rollover event based on parameters such as those shown above in Table 1. The neural network model may comprise one or more of a K-nearest neighbors (KNN) model, a random forest, a Support Vector Machine (SVM), and/or a Deep Neural Network (DNN). Example tuning parameters for each type of model are shown in Table 2 below.
Algorithm | Parameter | Values
KNN | n | 5, 7, 9, 10
Random Forest | Number of trees | 100, 200, 250, 300
Random Forest | Maximal depth | 10
SVM | Kernel function | linear, polynomial, radial basis function, sigmoid
SVM | C | 2^-3, 2^-1, 2
DNN | Activation | ReLU
DNN | Hidden layers (neurons) | 2 (32/32)
DNN | Optimizer | Adam
DNN | Epochs | 100
Table 2 – Example model tuning parameters [0075] In some embodiments, the neural network model is initially trained for rollover prediction with labeled training data indicating various parameters (e.g., speed, roll angle, turning radius, etc.), labeled to indicate whether or not a rollover occurred. For example, the deep neural network is trained for rollover prediction with a dataset containing 51,700 samples. In some embodiments, the neural network is trained using a categorical cross entropy loss function. In some examples, a Rectified Linear Unit (ReLU) activation function and/or Adaptive moment estimation (Adam) optimizer are used to train the neural network. [0076] In some embodiments, the training dataset includes observations from mathematical simulations, finite element analysis (FEA), and realistic simulations, including static and dynamic tests. For the samples consisting of mathematical simulations and FEA, the occurrence of a rollover can be calculated instantly based on the data. For the samples consisting of realistic simulations (static and dynamic tests), the occurrence of a rollover can be determined by a human observer. [0077] At step 316, the system reads, from the neural network model, a value predicting whether the all-terrain vehicle will roll over. The neural network model may output a value indicating how likely roll-over is to occur. For example, the value is a scalar representing a probability of roll over (e.g., a value between 0 and 1, where 0 is most unlikely and 1 is most likely / has occurred). In some aspects, the at least one local processor and the neural network model are optimized to determine the value at a frequency faster than 10 Hz. [0078] The system may compare the value to a threshold. The system may store a threshold value, which may be configurable (e.g., via user input to the app). For example, the threshold is 50% likely rollover.
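A minimal pure-Python sketch of the inference and threshold comparison of steps 314 – 316 is shown below. The weights are random placeholders standing in for a trained model, the helper names are illustrative, and a sigmoid output is used here to produce the scalar probability (the trained model's output layer may differ):

```python
import math
import random

def relu(v):
    return [max(0.0, x) for x in v]

def dense(x, weights, biases):
    # weights: one row of input weights per output neuron
    return [sum(wi * xi for wi, xi in zip(row, x)) + b
            for row, b in zip(weights, biases)]

def predict_rollover(features, layers):
    """Forward pass through a network with two hidden layers of 32
    ReLU neurons each; the sigmoid output is a rollover probability
    in [0, 1]. `layers` is [(W1, b1), (W2, b2), (W3, b3)].
    """
    h = features
    for w, b in layers[:-1]:
        h = relu(dense(h, w, b))
    w, b = layers[-1]
    logit = dense(h, w, b)[0]
    return 1.0 / (1.0 + math.exp(-logit))

# Illustrative random weights; a real system would load trained ones.
random.seed(0)
def rand_layer(n_in, n_out):
    return ([[random.uniform(-0.1, 0.1) for _ in range(n_in)]
             for _ in range(n_out)],
            [0.0] * n_out)

layers = [rand_layer(10, 32), rand_layer(32, 32), rand_layer(32, 1)]
p = predict_rollover([0.0] * 10, layers)
THRESHOLD = 0.5          # configurable, e.g., via the mobile app
alert = p > THRESHOLD    # proceed to step 318 when True
```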
If the value does not exceed the threshold, then the system may continue monitoring and repeating steps 302 – 316. If the value does exceed the threshold, then the system may perform some action to warn or otherwise prevent the rollover (e.g., by proceeding to step 318). [0079] At step 318, the system outputs an alert based on the value. The alert can be an aural or visual indicator onboard the all-terrain vehicle configured to warn a rider. For example, the system issues an audiovisual alert consisting of a high-intensity red LED and a piezo buzzer. Alternatively, or additionally, the system may send an alert to the mobile phone application and/or remotely to a 911 or other rescue service. In some examples, the on-board system transmits a message to the server, and the server, based on the message, calls 911 or another emergency service and provides the coordinates of the ATV. [0080] FIG. 3B illustrates a method 350 for determining whether a rollover has occurred. In some embodiments, rollover occurrence is detected based on actual angle values that are provided by the processing of the IMU data, which is a robust and accurate method. In some embodiments, the system determines whether a rollover has occurred based on ATV stability angle, speed, and riding condition. [0081] In some embodiments, rollover occurrences are determined by comparing the ATV's static stability angles to the ATV's roll and pitch angles in real time. At 352, gyroscope, accelerometer, and magnetometer data is acquired, which is filtered at 354. The ATV's attitude (including roll and pitch) can, for example, be measured with an IMU as shown in FIG. 2. To improve the accuracy and robustness of the vehicle's attitude estimate, a Madgwick filter can be implemented. This filter fuses gyroscope, accelerometer, and magnetometer measurements to calculate the vehicle's attitude.
The vehicle's attitude is initially estimated by the integration of gyroscope measures, which in the long term inherently yields drift. The long-term drift from the gyroscope integration is compensated by the accelerometer estimates of attitude. The magnetometer measures are used to compensate for magnetic distortions from potential sources of interference around the sensor, such as electrical appliances (for instance, a GPS sensor), and metal structures (e.g., the ATV's frame). [0082] At 356, the roll and pitch angles are compared to the lateral and longitudinal stability angles of the ATV. The ATV static stability angles are the critical angles at which an ATV begins to roll (either sideways or forward/rearward). Lateral (side) and longitudinal (rear) stability angles are important measures of the relative stability of an ATV and can be used to describe the rollover propensity of specific vehicles. These angles are generally determined through tilt table tests or calculated based on the center of gravity (C.G.) location, which can be determined by the lifting-axle method. [0083] When either the roll or pitch angle is higher than the ATV's lateral or longitudinal stability angle, respectively, an internal counter is updated (e.g., either incremented at 358 or decremented at 360). The counter is compared to respective thresholds at 362 and 364. When the counter is above a certain threshold at 364 (e.g., 25 consecutive readings / 5 s), the system triggers the emergency system alerting first responders at 366. The importance of implementing a counter is explained by the fact that false positive rollover detection may occur. The counter circumvents erroneous detections by evaluating a sequence of events instead of a single occurrence. [0084] When a rollover is detected by the device, an emergency alert is sent to the rescue service (e.g., a third party service such as Noonlight®). The emergency alert may be transmitted via the satellite network.
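The counter-based debouncing described above can be sketched as follows; the zero floor on the decrement and the parameter names are illustrative assumptions:

```python
def update_rollover_counter(counter, roll, pitch, lateral_limit,
                            longitudinal_limit, trigger_count=25):
    """One detector iteration: increment the counter when roll or
    pitch exceeds the corresponding static stability angle, otherwise
    decrement it (floored at zero here, as an assumption). Returns the
    updated counter and whether the emergency alert should trigger.
    The trigger count of 25 consecutive readings corresponds to ~5 s
    in the example above, filtering out false positives.
    """
    if abs(roll) > lateral_limit or abs(pitch) > longitudinal_limit:
        counter += 1
    else:
        counter = max(0, counter - 1)
    return counter, counter >= trigger_count
```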
In some examples, the alert contains location data and the vehicle owner’s phone number for verification attempts. [0085] Alternatively, or additionally, the location of the ATV is sent to first responders, family members, and/or supervisors (e.g., even when the operator is not able to do so). In some aspects, the recipients of such alerts and/or information can be configured (e.g., by the operator). In some aspects, users can opt into receiving a notification or contacting first responders in case of an accident, rollover, or potential rollover. [0086] Alternatively, or additionally, the system uses geofencing for controlling anti-robbery and/or autonomous shut-off systems. A geofence is a virtual perimeter for a real-world geographic area. The limits of the geofence can be configured, e.g., by a user via the mobile application. In some aspects, if the vehicle leaves its geofence, an alert is sent to the vehicle owner. Alternatively, or additionally, the vehicle is automatically shut off upon determination that the vehicle has left its geofence. The autonomous shut-off system can be activated to prevent a robbery or a possible incident (high speed, turning a sharp curve, etc.). The anti-robbery system works based on the autonomous shut-off and geofencing systems. If an ATV crosses the geofence, the shut-off system will turn it off. [0087] Alternatively, or additionally, the smartphone application can be used by the riders to trigger the emergency alert in case any ATV incident (hitting an object, rollover, operator ejection, etc.) occurs and the operator is conscious and able to reach and interact with their phone. For example, the application causes display of an interface element, such as a button, that the user can select to trigger the emergency alert. Alternatively, or additionally, a keyfob can be used by the riders to trigger the emergency alert. [0088] Advantageously, the above characteristics are functional without cellular signals.
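A geofence check against a circular perimeter (a center and radius, as configurable via the mobile application) can be sketched with a great-circle distance computation; the function name and the specific use of the haversine formula are illustrative assumptions:

```python
import math

def outside_geofence(lat, lon, center_lat, center_lon, radius_m):
    """Return True when the vehicle is outside the configured
    geofence, using the haversine great-circle distance from the
    geofence center (all coordinates in decimal degrees).
    """
    r_earth = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat), math.radians(center_lat)
    dp = math.radians(center_lat - lat)
    dl = math.radians(center_lon - lon)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    dist = 2 * r_earth * math.asin(math.sqrt(a))
    return dist > radius_m
```

A crossing detected by this check could then drive the owner alert and/or the autonomous shut-off described above.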
Unlike crash notification systems developed for the automotive industry, which are designed to work around roads and function only where cellular signals are available, the techniques described above can use satellite networks to operate in off-road locations without cellular or WiFi signals, as commonly occurs when driving an ATV. [0089] Additional advantages include a rollover detection system with an accuracy higher than 99%. The rollover detection system described herein has a fast emergency notification time (less than 41 s) and crash localization with an accuracy of 2 m. The system can identify roll and pitch angles with average errors of a fraction of a degree, and the system can detect the ATV’s speed with an average error of a fraction of a m/s. Thus, the techniques described herein provide accurate and fast results. [0090] Additional features may include allowing users to create rider credentials, customized geofencing, licensing and servicing, and danger alerts. Further, the system can identify when the ATV is being used appropriately and when the ATV is not being used appropriately, which can be used to improve operator behavior (e.g., identify high risk operators and understand why they are high risk). The system can automatically raise a panic alert in an emergency, even when the rider is unable to do so and when cellphone service and internet connection are unavailable. The system can decrease fatalities and the severity of injuries in ATV rollover crashes by reducing the response time. [0091] FIG. 4 shows an example database schema 400 for managing ATV data, according to some embodiments. A first data structure (e.g., data table, linked list, etc.) for users 402 stores information such as a user identifier (ID), first name, last name, email address, phone number, and current device. The user information may be stored as a set of strings. A second data structure (e.g., data table, linked list, etc.)
for devices 404 (e.g., the ATV) stores information such as a device ID, user ID, index, name, last location, whether geofenced, a geofence radius, a geofence center, an ATV model, and a model number. The data stored to the second data structure for devices 404 can include string, integer, Boolean, double, and geopoint data formats, as shown in FIG. 4. The second data structure for devices 404 is mapped to the first data structure for users 402 as well as a third data structure for ride history 406. [0092] The third data structure (e.g., data table, linked list, etc.) for ride history 406 stores information such as an ID, a device ID, an index, whether rollover occurred, altitudes, velocities, coordinates, satellites, GPS timestamps, terrain points, and terrain timestamps. The data stored to the third data structure for ride history 406 can include string, integer, Boolean, double, geopoint, timestamp, and terrain point data formats, as shown in FIG. 4. [0093] A fourth data structure (e.g., data table, linked list, etc.) for terrain points 408 stores information such as x, y, and z coordinates and whether rollover occurred. In the example depicted in FIG. 4, the x, y, and z coordinates are stored in double format, and whether rollover occurred is stored in Boolean format. [0094] FIGS. 5A – 10 illustrate example user interfaces for ATV monitoring, according to some embodiments. As described above, the system can include a mobile application that users can interact with to configure parameters used for rollover prediction, as well as to view current and/or past ATV parameters. The system can gather and process ATV data using the techniques described above with respect to FIGS. 1 – 4, and use this data to generate visualizations shown via user interface. Examples of such interfaces are shown in FIGS. 5A – 10. [0095] FIG. 5A is an example user interface 500 for ATV management, according to some embodiments. 
The user interface 500 includes an emergency button 501 that can be used to dispatch an alarm to call for help. The user interface 500 further includes interface elements to allow a user to navigate to various screens (e.g., device info, engine control, and location control screens, as depicted in FIG. 5A, as well as view ride data if available). [0096] FIG. 5B is an example user interface 502 for geofencing, according to some embodiments. The user interface 502 can be used to configure or view geofencing settings. As shown in FIG. 5B, the user interface 502 shows a map 504. A user can configure points 506 defining a geofence 508. [0097] FIG. 6 is another example dashboard user interface 600 for ATV monitoring, according to some embodiments. The dashboard user interface 600 shows information collected in relation to the ATV, such as the ATV make and model 602. The dashboard user interface 600 further shows information about the last ride 604. As shown in FIG. 6, the information about the last ride 604 can include how many days ago the ride was, the location of the ride, the distance of the ride, and the length of time of the ride. The dashboard user interface 600 further shows a graph 606 showing ride data over time. [0098] FIG. 7 is another example user interface 700 for ATV monitoring, according to some embodiments. The user interface 700 shows ride information for a particular session. The information shown in the user interface 700 includes the session date 702, the ATV make and model 704, the time range of the ride 706, and the location of the ride 708. In the example depicted in FIG. 7, the user interface 700 further displays the total time of the ride 710, the average speed during the ride 712, the number of miles covered in the ride 714, the number of rollovers during the ride 716, the top speed during the ride 718, and the number of stops during the ride 720. As shown in FIG.
7, the user interface 700 further displays a graph 722 showing velocity over time for the ride. [0099] FIG. 8 is another example user interface 800 for ATV monitoring, according to some embodiments. The user interface 800 shows a map 802. The map 802 illustrates a path 804 traveled by an ATV. [0100] FIGS. 9A and 9B are additional example user interfaces 900, 902 for ATV monitoring, according to some embodiments. The user interface 900 depicted in FIG. 9A and the user interface 902 depicted in FIG. 9B can be used to allow a user to view or configure ATV parameters. As described above, such parameters can be used to perform tasks such as rollover prediction. [0101] Using the user interface 900 of FIG. 9A, a user can view or modify information about the ATV, such as a name for the ATV 906, a device name 904, phone number 908, and so forth, as illustrated in FIG. 9A. The user interface 900 can also be used to enable or disable settings such as a research mode 910. The user interface 900 further includes elements for configuring a Madgwick sample rate 912, GPS sample rate 914, IMU connection 916, and IMU degrees of freedom 918, as shown in FIG. 9A. [0102] Using the user interface 902 of FIG. 9B, a user can view or modify information about the ATV, such as a name for the ATV 920 and a vehicle model 922. The user interface 902 can also be used to enable or disable geofencing via a slider 924. The user interface 902 can also be used to establish a radius 926 and center 928 to be used for geofencing. The user interface 902 also shows a serial number 930, model number 932, and manufacturer 934. A user can interact with a “Save” button 936 to save changes or a “Remove Device” button 938 to remove the ATV. [0103] FIG. 10 is another example user interface 1000 for ATV monitoring, according to some embodiments.
The user interface 1000 can display ride history 1002 for one or more ATVs, including ride information on multiple dates, with the name of the ATV 1004, the date 1006, and the distance traveled 1008, as shown in this example. Any of the interfaces can also include buttons at the bottom, as shown in FIG. 10, to navigate to different interfaces, such as a rides button 1010, a settings button 1012, and a home button 1014. In some aspects, the user interface 1000 sorts user rides by date. The constructed hierarchy groups rides into months and months into years. This allows the table view to accurately display trip history. [0104] FIG. 11 illustrates an example computer system 1100, in which various embodiments may be implemented. The system 1100 may be used to implement any of the computer systems and/or devices described above. As shown in the figure, computer system 1100 includes a processing unit 1104 that communicates with a number of peripheral subsystems via a bus subsystem 1102. These peripheral subsystems may include a processing acceleration unit 1106, an I/O subsystem 1108, a storage subsystem 1118 and a communications subsystem 1124. Storage subsystem 1118 includes tangible computer-readable storage media 1122 and a system memory 1110. [0105] Bus subsystem 1102 provides a mechanism for letting the various components and subsystems of computer system 1100 communicate with each other as intended. Although bus subsystem 1102 is shown schematically as a single bus, alternative embodiments of the bus subsystem may utilize multiple buses. Bus subsystem 1102 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
For example, such architectures may include an Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus, which can be implemented as a Mezzanine bus manufactured to the IEEE P1386.1 standard. [0106] Processing unit 1104, which can be implemented as one or more integrated circuits (e.g., a conventional microprocessor or microcontroller), controls the operation of computer system 1100. One or more processors may be included in processing unit 1104. These processors may include single core or multicore processors. In certain embodiments, processing unit 1104 may be implemented as one or more independent processing units 1132 and/or 1134 with single or multicore processors included in each processing unit. In other embodiments, processing unit 1104 may also be implemented as a quad-core processing unit formed by integrating two dual-core processors into a single chip. [0107] In various embodiments, processing unit 1104 can execute a variety of programs in response to program code and can maintain multiple concurrently executing programs or processes. At any given time, some or all of the program code to be executed can be resident in processor(s) 1104 and/or in storage subsystem 1118. Through suitable programming, processor(s) 1104 can provide various functionalities described above. Computer system 1100 may additionally include a processing acceleration unit 1106, which can include a digital signal processor (DSP), a special-purpose processor, and/or the like. [0108] I/O subsystem 1108 may include user interface input devices and user interface output devices. 
User interface input devices may include a keyboard, pointing devices such as a mouse or trackball, a touchpad or touch screen incorporated into a display, a scroll wheel, a click wheel, a dial, a button, a switch, a keypad, audio input devices with voice command recognition systems, microphones, and other types of input devices. User interface input devices may include, for example, motion sensing and/or gesture recognition devices such as the Microsoft Kinect® motion sensor that enables users to control and interact with an input device, such as the Microsoft Xbox® 360 game controller, through a natural user interface using gestures and spoken commands. User interface input devices may also include eye gesture recognition devices such as the Google Glass® blink detector that detects eye activity (e.g., ‘blinking’ while taking pictures and/or making a menu selection) from users and transforms the eye gestures as input into an input device (e.g., Google Glass®). Additionally, user interface input devices may include voice recognition sensing devices that enable users to interact with voice recognition systems (e.g., Siri® navigator), through voice commands. [0109] User interface input devices may also include, without limitation, three dimensional (3D) mice, joysticks or pointing sticks, gamepads and graphic tablets, and audio/visual devices such as speakers, digital cameras, digital camcorders, portable media players, webcams, image scanners, fingerprint scanners, barcode readers, 3D scanners, 3D printers, laser rangefinders, and eye gaze tracking devices. Additionally, user interface input devices may include, for example, medical imaging input devices such as computed tomography, magnetic resonance imaging, positron emission tomography, and medical ultrasonography devices. User interface input devices may also include, for example, audio input devices such as MIDI keyboards, digital musical instruments and the like.
[0110] User interface output devices may include a display subsystem, indicator lights, or non-visual displays such as audio output devices, etc. The display subsystem may be a cathode ray tube (CRT), a flat-panel device, such as that using a liquid crystal display (LCD) or plasma display, a projection device, a touch screen, and the like. In general, use of the term "output device" is intended to include all possible types of devices and mechanisms for outputting information from computer system 1100 to a user or other computer. For example, user interface output devices may include, without limitation, a variety of display devices that visually convey text, graphics and audio/video information such as monitors, printers, speakers, headphones, automotive navigation systems, plotters, voice output devices, and modems. [0111] Computer system 1100 may comprise a storage subsystem 1118 that comprises software elements, shown as being currently located within a system memory 1110. System memory 1110 may store program instructions that are loadable and executable on processing unit 1104, as well as data generated during the execution of these programs. [0112] Depending on the configuration and type of computer system 1100, system memory 1110 may be volatile (such as random-access memory (RAM)) and/or non-volatile (such as read-only memory (ROM), flash memory, etc.). The RAM typically contains data and/or program modules that are immediately accessible to and/or presently being operated and executed by processing unit 1104. In some implementations, system memory 1110 may include multiple different types of memory, such as static random-access memory (SRAM) or dynamic random-access memory (DRAM). In some implementations, a basic input/output system (BIOS), containing the basic routines that help to transfer information between elements within computer system 1100, such as during start-up, may typically be stored in the ROM.
By way of example, and not limitation, system memory 1110 also illustrates application programs 1112, which may include client applications, Web browsers, mid-tier applications, relational database management systems (RDBMS), etc., program data 1114, and an operating system 1116. By way of example, operating system 1116 may include various versions of Microsoft Windows®, Apple Macintosh®, and/or Linux operating systems, a variety of commercially available UNIX® or UNIX-like operating systems (including without limitation the variety of GNU/Linux operating systems, the Google Chrome® OS, and the like) and/or mobile operating systems such as iOS, Windows® Phone, Android® OS, BlackBerry® 10 OS, and Palm® OS operating systems. [0113] Storage subsystem 1118 may also provide a tangible computer-readable storage medium for storing the basic programming and data constructs that provide the functionality of some embodiments. Software (programs, code modules, instructions) that when executed by a processor provide the functionality described above may be stored in storage subsystem 1118. These software modules or instructions may be executed by processing unit 1104. Storage subsystem 1118 may also provide a repository for storing data used in accordance with the present disclosure. [0114] Storage subsystem 1118 may also include a computer-readable storage media reader 1120 that can further be connected to computer-readable storage media 1122. Together and optionally, in combination with system memory 1110, computer-readable storage media 1122 may comprehensively represent remote, local, fixed, and/or removable storage devices plus storage media for temporarily and/or more permanently containing, storing, transmitting, and retrieving computer-readable information. 
[0115] Computer-readable storage media 1122 containing code, or portions of code, can also include any appropriate media known or used in the art, including storage media and communication media, such as but not limited to, volatile and non-volatile, removable and non- removable media implemented in any method or technology for storage and/or transmission of information. This can include tangible computer-readable storage media such as RAM, ROM, electronically erasable programmable ROM (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disk (DVD), or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other tangible computer readable media. This can also include nontangible computer-readable media, such as data signals, data transmissions, or any other medium which can be used to transmit the desired information, and which can be accessed by computing system 1100. [0116] By way of example, computer-readable storage media 1122 may include a hard disk drive that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive that reads from or writes to a removable, nonvolatile magnetic disk, and an optical disk drive that reads from or writes to a removable, nonvolatile optical disk such as a CD ROM, DVD, and Blu-Ray® disk, or other optical media. Computer-readable storage media 1122 may include, but is not limited to, Zip® drives, flash memory cards, universal serial bus (USB) flash drives, secure digital (SD) cards, DVD disks, digital video tape, and the like. 
Computer-readable storage media 1122 may also include solid-state drives (SSDs) based on non-volatile memory such as flash-memory based SSDs, enterprise flash drives, solid state ROM, and the like, SSDs based on volatile memory such as solid state RAM, dynamic RAM, static RAM, DRAM-based SSDs, magneto-resistive RAM (MRAM) SSDs, and hybrid SSDs that use a combination of DRAM and flash memory based SSDs. The disk drives and their associated computer-readable media may provide non-volatile storage of computer-readable instructions, data structures, program modules, and other data for computer system 1100. [0117] Communications subsystem 1124 provides an interface to other computer systems and networks. Communications subsystem 1124 serves as an interface for receiving data from and transmitting data to other systems from computer system 1100. For example, communications subsystem 1124 may enable computer system 1100 to connect to one or more devices via the Internet. In some embodiments communications subsystem 1124 can include radio frequency (RF) transceiver components for accessing wireless voice and/or data networks (e.g., using cellular telephone technology, advanced data network technology, such as 3G, 4G or EDGE (enhanced data rates for global evolution), WiFi (IEEE 802.11 family standards), or other mobile communication technologies, or any combination thereof), global positioning system (GPS) receiver components, and/or other components. In some embodiments communications subsystem 1124 can provide wired network connectivity (e.g., Ethernet) in addition to or instead of a wireless interface. [0118] In some embodiments, communications subsystem 1124 may also receive input communication in the form of structured and/or unstructured data feeds 1126, event streams 1128, event updates 1130, and the like on behalf of one or more users who may use computer system 1100.
[0119] By way of example, communications subsystem 1124 may be configured to receive data feeds 1126 in real-time from users of social networks and/or other communication services such as Twitter® feeds, Facebook® updates, web feeds such as Rich Site Summary (RSS) feeds, and/or real-time updates from one or more third party information sources.

[0120] Additionally, communications subsystem 1124 may also be configured to receive data in the form of continuous data streams, which may include event streams 1128 of real-time events and/or event updates 1130, that may be continuous or unbounded in nature with no explicit end. Examples of applications that generate continuous data may include, for example, sensor data applications, financial tickers, network performance measuring tools (e.g., network monitoring and traffic management applications), clickstream analysis tools, automobile traffic monitoring, and the like.

[0121] Communications subsystem 1124 may also be configured to output the structured and/or unstructured data feeds 1126, event streams 1128, event updates 1130, and the like to one or more databases that may be in communication with one or more streaming data source computers coupled to computer system 1100.

[0122] Computer system 1100 can be one of various types, including a handheld portable device (e.g., an iPhone® cellular phone, an iPad® computing tablet, a PDA), a wearable device (e.g., a Google Glass® head mounted display), a PC, a workstation, a mainframe, a kiosk, a server rack, or any other data processing system.

[0123] Due to the ever-changing nature of computers and networks, the description of computer system 1100 depicted in the figure is intended only as a specific example. Many other configurations having more or fewer components than the system depicted in the figure are possible.
For example, customized hardware might also be used and/or particular elements might be implemented in hardware, firmware, software (including applets), or a combination. Further, connection to other computing devices, such as network input/output devices, may be employed. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will appreciate other ways and/or methods to implement the various embodiments.

[0124] It should be appreciated that the computing system for rollover prediction may have one or more microprocessors/processing devices that can further be a component of the overall apparatuses. The control systems are generally proximate to their respective devices, in electronic communication (wired or wireless), and can also include a display interface and/or operational controls configured to be handled by a user to monitor the respective systems, to change configurations of the respective systems, and to operate, directly guide, or set programmed instructions for the respective systems, and sub-portions thereof. Such processing devices can be communicatively coupled to a non-volatile memory device via a bus. The non-volatile memory device may include any type of memory device that retains stored information when powered off. Non-limiting examples of the memory device include electrically erasable programmable read-only memory ("EEPROM"), flash memory, or any other type of non-volatile memory. In some aspects, at least some of the memory device can include a non-transitory medium or memory device from which the processing device can read instructions. A non-transitory computer-readable medium can include electronic, optical, magnetic, or other storage devices capable of providing the processing device with computer-readable instructions or other program code.
Non-limiting examples of a non-transitory computer-readable medium include (but are not limited to) magnetic disk(s), memory chip(s), ROM, random-access memory ("RAM"), an ASIC, a configured processor, optical storage, and/or any other medium from which a computer processor can read instructions. The instructions may include processor-specific instructions generated by a compiler and/or an interpreter from code written in any suitable computer-programming language, including, for example, C, C++, C#, Java, Python, Perl, JavaScript, etc.

[0125] While the above description describes various embodiments of the invention and the best mode contemplated, regardless of how detailed the above text, the invention can be practiced in many ways. Details of the system may vary considerably in its specific implementation, while still being encompassed by the present disclosure. As noted above, particular terminology used when describing certain features or aspects of the invention should not be taken to imply that the terminology is being redefined herein to be restricted to any specific characteristics, features, or aspects of the invention with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the invention to the specific examples disclosed in the specification, unless the above Detailed Description section explicitly defines such terms. Accordingly, the actual scope of the invention encompasses not only the disclosed examples, but also all equivalent ways of practicing or implementing the invention under the claims.

[0126] The teachings of the invention provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various examples described above can be combined to provide further implementations of the invention.
Some alternative implementations of the invention may include not only additional elements to those implementations noted above, but also may include fewer elements. Further, any specific numbers noted herein are only examples; alternative implementations may employ differing values or ranges, and can accommodate various increments and gradients of values within and at the boundaries of such ranges.

[0127] References throughout the foregoing description to features, advantages, or similar language do not imply that all of the features and advantages that may be realized with the present technology should be or are in any single embodiment of the invention. Rather, language referring to the features and advantages is understood to mean that a specific feature, advantage, or characteristic described in connection with an embodiment is included in at least one embodiment of the present technology. Thus, discussion of the features and advantages, and similar language, throughout this specification may, but does not necessarily, refer to the same embodiment. Furthermore, the described features, advantages, and characteristics of the present technology may be combined in any suitable manner in one or more embodiments. One skilled in the relevant art will recognize that the present technology can be practiced without one or more of the specific features or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments of the present technology.

[0128] Although specific embodiments have been described, various modifications, alterations, alternative constructions, and equivalents are also encompassed within the scope of the disclosure. Embodiments are not restricted to operation within certain specific data processing environments but are free to operate within a plurality of data processing environments.
Additionally, although embodiments have been described using a particular series of transactions and steps, it should be apparent to those skilled in the art that the scope of the present disclosure is not limited to the described series of transactions and steps. Various features and aspects of the above-described embodiments may be used individually or jointly.

[0129] Further, while embodiments have been described using a particular combination of hardware and software, it should be recognized that other combinations of hardware and software are also within the scope of the present disclosure. Embodiments may be implemented only in hardware, or only in software, or using combinations thereof. The various processes described herein can be implemented on the same processor or different processors in any combination. Accordingly, where components or modules are described as being configured to perform certain operations, such configuration can be accomplished, e.g., by designing electronic circuits to perform the operation, by programming programmable electronic circuits (such as microprocessors) to perform the operation, or any combination thereof. Processes can communicate using a variety of techniques, including but not limited to conventional techniques for inter-process communication, and different pairs of processes may use different techniques, or the same pair of processes may use different techniques at different times.

[0130] The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that additions, subtractions, deletions, and other modifications and changes may be made thereunto without departing from the broader spirit and scope as set forth in the claims. Thus, although specific disclosure embodiments have been described, these are not intended to be limiting. Various modifications and equivalents are within the scope of the following claims.
[0131] The use of the terms "a" and "an" and "the" and similar referents in the context of describing the disclosed embodiments (especially in the context of the following claims) is to be construed to cover both the singular and the plural, unless otherwise indicated herein or clearly contradicted by context. The terms "comprising," "having," "including," and "containing" are to be construed as open-ended terms (i.e., meaning "including, but not limited to,") unless otherwise noted. The term "connected" is to be construed as partly or wholly contained within, attached to, or joined together, even if there is something intervening. Recitation of ranges of values herein is merely intended to serve as a shorthand method of referring individually to each separate value falling within the range, unless otherwise indicated herein, and each separate value is incorporated into the specification as if it were individually recited herein. All methods described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The use of any and all examples, or exemplary language (e.g., "such as") provided herein, is intended merely to better illuminate embodiments and does not pose a limitation on the scope of the disclosure unless otherwise claimed. No language in the specification should be construed as indicating any non-claimed element as essential to the practice of the disclosure.

[0132] Disjunctive language such as the phrase "at least one of X, Y, or Z," unless specifically stated otherwise, is intended to be understood within the context as used in general to present that an item, term, etc., may be either X, Y, or Z, or any combination thereof (e.g., X, Y, and/or Z). Thus, such disjunctive language is not generally intended to, and should not, imply that certain embodiments require at least one of X, at least one of Y, or at least one of Z to each be present.
[0133] Preferred embodiments of this disclosure are described herein, including the best mode known for carrying out the disclosure. Variations of those preferred embodiments may become apparent to those of ordinary skill in the art upon reading the foregoing description. Those of ordinary skill should be able to employ such variations as appropriate, and the disclosure may be practiced otherwise than as specifically described herein. Accordingly, this disclosure includes all modifications and equivalents of the subject matter recited in the claims appended hereto as permitted by applicable law. Moreover, any combination of the above-described elements in all possible variations thereof is encompassed by the disclosure unless otherwise indicated herein.

[0134] All references, including publications, patent applications, and patents, cited herein are hereby incorporated by reference to the same extent as if each reference were individually and specifically indicated to be incorporated by reference and were set forth in its entirety herein.

[0135] In the foregoing specification, aspects of the disclosure are described with reference to specific embodiments thereof, but those skilled in the art will recognize that the disclosure is not limited thereto. Various features and aspects of the above-described disclosure may be used individually or jointly. Further, embodiments can be utilized in any number of environments and applications beyond those described herein without departing from the broader spirit and scope of the specification. The specification and drawings are, accordingly, to be regarded as illustrative rather than restrictive.

Claims

WHAT IS CLAIMED IS:

1. An all-terrain vehicle rollover prediction apparatus comprising:
an inertial measurement unit (IMU) configured to be fastened to an all-terrain vehicle;
at least one local processor physically wired to the IMU;
a memory connected with the at least one local processor, the memory having instructions executable by the at least one processor for:
looking up, from a data table, dimensions of the all-terrain vehicle;
receiving IMU data from the IMU;
estimating a roll angle and a pitch angle from the IMU data;
determining a yaw rate from the IMU data;
calculating a turning radius from the yaw rate;
inputting, to a neural network model executing on the at least one local processor, input parameters based on a speed of the all-terrain vehicle, the dimensions, the roll angle, the pitch angle, and the turning radius;
reading, from the neural network model, a value predicting whether the all-terrain vehicle will roll over; and
outputting an alert based on the value.
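For illustration only (not part of the claims), the estimation steps recited in claim 1 can be sketched with standard rigid-body kinematics. The claims do not prescribe particular formulas; the tilt-sensing relations and the turning-radius identity R = v / omega below are common textbook approximations, not necessarily the trained system's internal method:

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    # Tilt-sensing from the gravity vector measured by the IMU accelerometer
    # (axes in m/s^2). A production estimator would typically also fuse
    # gyroscope data to reject dynamic acceleration.
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch

def turning_radius(speed_mps, yaw_rate_rps, min_yaw_rate=1e-3):
    # Kinematic identity R = v / omega; near-zero yaw rate means
    # effectively straight travel (infinite radius)
    if abs(yaw_rate_rps) < min_yaw_rate:
        return float("inf")
    return speed_mps / abs(yaw_rate_rps)

# Example: slight rightward tilt, turning at 0.5 rad/s while moving 5 m/s
roll, pitch = roll_pitch_from_accel(0.0, 1.7, 9.66)  # roll about 0.17 rad
radius = turning_radius(5.0, 0.5)  # 10.0 m
```

These quantities, together with the looked-up vehicle dimensions and speed, would then form the input vector to the neural network model.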
2. The apparatus of claim 1 wherein the at least one local processor and the neural network model are optimized to determine the value at a frequency faster than 10 Hz.
3. The apparatus of claim 1 or claim 2 wherein the neural network model has exactly two hidden layers.
4. The apparatus of claim 3 wherein each hidden layer of the neural network model has 32 neurons.
5. The apparatus of any preceding claim wherein the memory has further instructions executable by the at least one processor for: calculating a difference between a respective average and each of the speed, the dimensions, the roll angle, the pitch angle, and the turning radius; and dividing each difference by a respective scale factor before inputting to the neural network model.
6. The apparatus of claim 5 wherein the average is a mean or a median, and the scale factor is a standard deviation or a variance, of the respective speed, dimensions, roll angle, pitch angle, and turning radius as determined from training data with which the neural network model was trained.
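Claims 5 and 6 describe standard z-score normalization of the inputs. A minimal illustrative sketch follows, assuming hypothetical training-set means and standard deviations; the actual statistics would be those computed from the data with which the model was trained:

```python
def standardize(features, train_means, train_stds):
    # Subtract each feature's training-set average and divide by its
    # scale factor, per claims 5 and 6
    return [(x - m) / s for x, m, s in zip(features, train_means, train_stds)]

# Hypothetical training statistics for [speed (m/s), roll (rad),
# pitch (rad), turning radius (m)]
means = [4.0, 0.05, 0.02, 12.0]
stds = [2.0, 0.10, 0.08, 6.0]

scaled = standardize([6.0, 0.15, 0.02, 9.0], means, stds)
# approximately [1.0, 1.0, 0.0, -0.5]
```

Normalizing with training-set statistics (rather than per-batch statistics at inference time) keeps the live inputs on the same scale the network saw during training.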
7. The apparatus of any preceding claim wherein the neural network model was trained using a categorical cross entropy loss function.
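Claims 3, 4, and 7 together describe a network with exactly two hidden layers of 32 neurons, trained with a categorical cross entropy loss. A self-contained forward-pass sketch follows; the random placeholder weights and the activation choices (ReLU hidden layers, softmax output) are assumptions for illustration, since the claims do not specify them:

```python
import math
import random

def relu(v):
    # Element-wise rectified linear unit
    return [max(0.0, x) for x in v]

def dense(x, weights, biases):
    # Fully connected layer: weights is a list of per-neuron weight rows
    return [sum(wi * xi for wi, xi in zip(w, x)) + b
            for w, b in zip(weights, biases)]

def softmax(v):
    # Numerically stable softmax over the output logits
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    total = sum(exps)
    return [e / total for e in exps]

def categorical_cross_entropy(y_true, y_pred, eps=1e-12):
    # Loss function named in claim 7
    return -sum(t * math.log(max(p, eps)) for t, p in zip(y_true, y_pred))

def init_layer(n_in, n_out, rng):
    # Placeholder random weights; a trained model would load learned values
    weights = [[rng.uniform(-0.1, 0.1) for _ in range(n_in)]
               for _ in range(n_out)]
    return weights, [0.0] * n_out

def predict(x, params):
    # Input -> 32 -> 32 -> 2-class softmax (rollover vs. no rollover)
    h1 = relu(dense(x, *params[0]))
    h2 = relu(dense(h1, *params[1]))
    return softmax(dense(h2, *params[2]))

rng = random.Random(0)
params = [init_layer(5, 32, rng), init_layer(32, 32, rng), init_layer(32, 2, rng)]
probs = predict([1.0, 0.5, -0.2, 0.3, 1.2], params)  # two probabilities summing to 1
```

A network this small has only a few thousand parameters, which is consistent with claims 2 and 19: inference well above 10 Hz is feasible on a modest embedded processor.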
8. The apparatus of any preceding claim wherein the memory has further instructions executable by the at least one processor for: receiving a user selection of a make or model of the all-terrain vehicle, wherein the looking up is based on the user selection.
9. The apparatus of any preceding claim wherein the memory has further instructions executable by the at least one processor for: receiving user entries of the dimensions, wherein the looking up is based on the user entries.
10. The apparatus of any preceding claim wherein the looked up dimensions of the all-terrain vehicle include one or more of a weight, a width, a length, a wheelbase, and a seat height.
11. The apparatus of any preceding claim wherein the memory has further instructions executable by the at least one processor for: inputting, to the neural network model, a presence of a rider or a non-presence of a rider for autonomous driving.
12. The apparatus of any preceding claim wherein the value is a scalar representing a probability of roll over.
13. The apparatus of any preceding claim further comprising: an aural or visual indicator onboard the all-terrain vehicle configured to warn a rider based on the alert.
14. The apparatus of any preceding claim further comprising: a transmitter configured to send a wireless message based on the alert.
15. The apparatus of any preceding claim further comprising: a global positioning system (GPS) receiver, wherein the memory has further instructions executable by the at least one processor for: obtaining data from the GPS receiver and data from the IMU; and combining the data from the GPS receiver and the data from the IMU to estimate the speed of the all-terrain vehicle.
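Claim 15 recites combining GPS and IMU data to estimate speed but does not prescribe a fusion method. One plausible sketch is a complementary filter that integrates the IMU's longitudinal acceleration for responsiveness and corrects the resulting drift with the absolute, slower-updating GPS speed; the blend factor `alpha` is a hypothetical tuning parameter:

```python
def fuse_speed(v_prev, accel_mps2, dt, v_gps, alpha=0.98):
    # Dead-reckon from the previous estimate using IMU longitudinal
    # acceleration, then blend toward the GPS speed to bound drift
    v_imu = v_prev + accel_mps2 * dt
    return alpha * v_imu + (1.0 - alpha) * v_gps

# Example: one 100 Hz IMU update with a concurrent GPS speed reading
v_est = fuse_speed(5.0, 0.2, 0.01, 5.1)  # slightly above 5.0 m/s
```

A Kalman filter would be the more rigorous alternative; the complementary filter is shown here because it is cheap enough for the fast update rates recited in claims 2 and 19.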
16. The apparatus of claim 1 wherein the at least one local processor is rigidly connected with the IMU.
17. The apparatus of any preceding claim further comprising: a housing enclosing the IMU, the at least one local processor, and the memory.
18. A method of predicting whether an all-terrain vehicle will roll over, the method comprising:
providing an inertial measurement unit (IMU) fastened to an all-terrain vehicle and at least one local processor physically wired to the IMU;
looking up, from a data table, dimensions of the all-terrain vehicle;
receiving, in the processor, IMU data from the IMU;
estimating a roll angle and a pitch angle from the IMU data;
determining a yaw rate from the IMU data;
calculating a turning radius from the yaw rate;
inputting, to a neural network model executing on the at least one local processor, input parameters based on a speed of the all-terrain vehicle, the dimensions, the roll angle, the pitch angle, and the turning radius;
reading, from the neural network model, a value predicting whether the all-terrain vehicle will roll over; and
outputting an alert based on the value.
19. The method of claim 18 wherein the at least one local processor and the neural network model are optimized to determine the value at a frequency faster than 10 Hz.
20. The method of claim 18 or claim 19 wherein the neural network model has exactly two hidden layers.
21. The method of any one of claims 18-20 further comprising: receiving a user selection of a make or model of the all-terrain vehicle, wherein the looking up is based on the user selection.
22. The method of any one of claims 18-21 further comprising: receiving user entries of the dimensions, wherein the looking up is based on the user entries.
23. The method of any one of claims 18-22 further comprising: inputting, to the neural network model, a presence of a rider or a non-presence of a rider for autonomous driving.
24. The method of any one of claims 18-23 wherein the value is a scalar representing a probability of roll over.
25. A non-transitory computer-readable medium storing computer-executable instructions that, when executed by one or more processors, cause performance of a method according to any one of claims 18-24.
PCT/US2024/034045 2023-08-16 2024-06-14 Rollover prediction and alert for all-terrain vehicle Pending WO2025038169A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363519928P 2023-08-16 2023-08-16
US63/519,928 2023-08-16

Publications (1)

Publication Number Publication Date
WO2025038169A1 true WO2025038169A1 (en) 2025-02-20

Family

ID=91898290

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/034045 Pending WO2025038169A1 (en) 2023-08-16 2024-06-14 Rollover prediction and alert for all-terrain vehicle

Country Status (1)

Country Link
WO (1) WO2025038169A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070067085A1 (en) * 2005-09-19 2007-03-22 Ford Global Technologies Llc Integrated vehicle control system using dynamically determined vehicle conditions
US8378803B1 (en) * 2009-11-25 2013-02-19 Craig A. Keiser Safety system for all-terrain vehicles
US20190143970A1 (en) * 2017-11-10 2019-05-16 GM Global Technology Operations LLC Determination of roll angle and bank angle with suspension displacement data
US20190354838A1 (en) * 2018-05-21 2019-11-21 Uber Technologies, Inc. Automobile Accident Detection Using Machine Learned Model
EP3842312A1 (en) * 2018-12-28 2021-06-30 Samsung Electronics Co., Ltd. Electronic apparatus for detecting risk factors around vehicle and method for controlling same



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24740715

Country of ref document: EP

Kind code of ref document: A1