WO2025064675A1 - Smart trap and monitoring device - Google Patents
- Publication number: WO2025064675A1 (PCT/US2024/047486)
- Authority: WIPO (PCT)
- Prior art keywords: animal, trap, component, data, trapping
- Prior art date
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M23/00—Traps for animals
- A01M23/16—Box traps
- A01M23/18—Box traps with pivoted closure flaps
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01M—CATCHING, TRAPPING OR SCARING OF ANIMALS; APPARATUS FOR THE DESTRUCTION OF NOXIOUS ANIMALS OR NOXIOUS PLANTS
- A01M23/00—Traps for animals
- A01M23/38—Electric traps
Definitions
- the present invention relates to a smart device for trapping animals with remote connection capabilities.
- the present invention is an animal trap which may be accessed and monitored remotely with wireless, satellite, or other remote-connection capabilities.
- the present invention relates to a smart trap which identifies types or species of animals and triggers the trapping device to capture said animals. The device then sends photos, videos, sounds, or other data to the user to determine if the correct animal has been captured, which the user may then selectively and remotely release, or choose to retrieve said animal.
- traps are designed to target certain types or species of animals.
- corral traps are designed to capture and contain hogs. The design allows a group of hogs to enter the corral, trapping the group for removal or relocation.
- Snares may be suited for more solitary animals which travel along predictable pathways, such as foxes, squirrels, chipmunks, and other such animals.
- Many other types of traps exist, such as deadfall traps, pit traps, foothold traps, and spring traps, with each having certain benefits and disadvantages.
- In addition to being better suited for different types of animals, traps may also be better suited for various tasks. For example, some traps are designed for lethal use on certain animals, while other traps are designed to capture an animal, unharmed, for study, relocation, or other purposes. Additionally, with growing awareness of the treatment of animals, the need for humane trapping continues to grow. Many types of existing traps, like snares or foothold traps, are considered inhumane because they cause pain, suffering, and potential death for the animals. These traps also may not be selective enough to capture a specific type or species of animal. Typically, box or cage traps are considered humane and are widely used.
- traps are designed and sized appropriately to be baited for a certain animal.
- When the animal enters the box or cage, a trigger may be tripped, causing a door, latch, cage, or other mechanism to close and prohibit the animal from escaping.
- These traps are considered more humane as the trap is far less likely to cause pain or damage to the animal during trapping and containment.
- These types of traps are typically used to trap animals such as foxes, raccoons, skunks, and other such animals for relocation. These traps can typically be reused multiple times and for different types of animals.
- the trapping device should have the capability to identify certain traits of the animal and selectively trap an animal with desired traits. This identification of traits may be done using motion sensors, infrared sensors, cameras, pressure mats, or other forms of selectively identifying an animal or traits of an animal. Additionally, the device should be able to selectively engage or disengage the trapping function based on the information obtained through such sensors, cameras, or other information-gathering tools.
- a smart trapping device which can operate autonomously and use artificial intelligence ("AI") software that can identify certain animal characteristics and selectively trap an animal.
- the AI should be able to identify whether a certain animal is the target animal or a non-target animal.
- the AI software should then be able to engage the trapping device to humanely trap the animal if it is the target animal.
- the AI software should be able to leave the trap ready for future capture of a target animal.
- the AI software should be able to alert a user and provide them with the necessary media (such as a picture, video, sound, or other form of media), allowing the user to decide whether to collect or release the animal.
- the present invention relates to a smart trap device.
- the smart trap device has a containment area, a trapping mechanism, and a technology component.
- the containment area may be of varying size, shape, and materials, depending on the target animal. For example, many existing traps use a metal cage of sufficient size to trap animals such as raccoons, skunks, and other similar-sized animals.
- the trapping mechanism may be a door, latch, covering, or certain type of movement of the containment area.
- the trap may have a gravity-closed door which is released once an animal has entered the containment area and tripped a sensor. The sensor collects information about the animal, which is then analyzed by AI software to determine whether the animal is a potential target animal.
- the containment area may be raised with an opening, such that the opening is aligned with the sensor, sensors, or Al target area.
- the containment area may drop onto the animal.
- the trapping mechanism may be customized depending on the type of target animal. By trapping only the target animal, as opposed to any animal visiting the trap, the smart trap reduces negative impacts to the ecosystem, reduces the number of required human visits to the trap site, reduces or eliminates the chance of trapping a non-target animal, and causes less disruption to the environment.
- the technology component has a connectivity feature, a sensing feature, a data capture feature, an analytical feature, and a power feature.
- the connectivity feature provides a remote control interface which allows the user to control and monitor the operation of the smart trap from a remote location.
- the monitoring and control feature of the present invention may be used to manage a singular trap or a multitude of traps.
- the connectivity feature may allow for the interconnection of access and control of multiple traps to establish a web of data collection from each of the various traps within the system.
- the connectivity feature may be in the form of cellular, satellite, or other wireless method of remote connection.
- the sensing feature may be a motion sensor, infrared sensor, pressure mat, camera, microphone, other device, or any combination thereof that is used to detect an animal.
- the data capture feature may be in the form of a camera, microphone, or other method of collecting data such as a picture, sound, pressure, or other data.
- the analytical feature may be an AI feature which analyzes data captured by the sensing features to identify and act on the incoming data. The analytical feature is used to locate, identify, and make decisions based on the data received. This decision-making process determines whether the data will be saved, communicated to the user, or used to activate the trap.
- the analytical feature of the technology component utilizes AI programming with customized AI models to determine whether the animal is the target animal, thus triggering the trapping feature to capture the animal.
- the user may assess when they need to return to the trap to retrieve the target animal or re-bait the trap, thus reducing the labor and maintenance requirement.
- the data may be analyzed through a custom user interface or data exports to inform several aspects of trapping and utilization of the present invention, such as tracking the progress of the trap(s), estimating animal populations, informing adaptive trapping strategies, evaluating trap placement, and improving AI functionality.
- the information may also be used to inform users when maintenance or service of the trap is required.
- the sensing feature may be a type of sensor which is used to detect and identify an animal.
- the sensing feature may be a camera, pressure sensor, motion sensor, or other type of sensor.
- the input and data from the sensor collects information which may trigger the trap.
- the user may be provided with this information, such as a picture, video, sound, pressure mat reading, or other information collected from the sensor which allows the user to remotely control the trap.
- the user may review the data collected by the sensor(s).
- the data may be used to make a decision on whether or not to release the captured animal, trigger the capture mechanism, or utilize the information for other purposes.
- the information is provided to the user via wireless, satellite, or other remote connectivity methods.
- the user is not required to be within a certain proximity of the trap and so may remotely connect to the device.
- the remote connectivity to the device and the autonomous AI component allow for better utilization of remote trapping sites.
- remote trapping sites are difficult to monitor and optimize, as well as service and maintain.
- the device has a camera equipped to collect photographs, videos, and/or sounds of the trap and surrounding area.
- the camera is used to identify animals interacting with the trap to ensure the target animal is being trapped. Oftentimes, this involves various types of recognition software, including AI software.
- the camera may be used to capture pictures, videos, and sounds of the target animal to provide the user with information related to the trapping of the animal.
- the location of the camera may be critical to the functionality and information capture.
- the camera may be located at the top of the trap with an overhead view of the inside of the trap and the trapped animal. The overhead view from the camera allows for the collection of information about the trapped animal, and the consistent distance and angle allow for better estimation of the size and other characteristics of the animal.
- the present invention utilizes AI software to selectively collect and analyze data and to apply this data for trapping purposes.
- the collection and analysis of the data may include capturing real-time information such as weather conditions, types and frequency of animals visiting the trap, activities during visitation, and any other collectible data.
- the data may then be analyzed by the AI software for a multitude of purposes, such as identifying the types, species, and number of animals visiting, as well as the dates, times, and patterns of those visits; this analysis may also be used to further improve the performance of the AI software.
- the improved data collection of the present invention may include the exact timing of capture, impact of weather or other environmental conditions on captures, detection of other species, analysis of trapping strategies and trap placement, real time analytics and reporting, integration with other databases, and ability to export data. Additionally, this information may be further utilized by users to inform and develop adaptive real-time trapping strategies, track progress of the trap or system of traps, and create reports.
- the AI software may alert the user of certain conditions, characteristics, or other information.
- the AI component of the smart trap may operate at a remote location, with communication to the data-collecting elements completed via the connectivity elements previously described. Alternatively, the AI component may operate from a computer located on, in, or in close proximity to the trap.
- the trap may experience periods of interrupted connectivity, but the AI and computer portion of the smart trap may continue to function on a standalone basis while waiting for connectivity to be reestablished.
- the AI component may be capable of making continuous improvements through data received, analysis of the data, application improvements, real-time response to analytics, and additional development of the technology.
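The standalone operation during interrupted connectivity described above amounts to a store-and-forward buffer. The sketch below illustrates one way this could work; the `EventBuffer` class, its method names, and the queue limit are illustrative assumptions, not part of the disclosure.

```python
from collections import deque

class EventBuffer:
    """Store-and-forward buffer: events captured while the link is down
    are queued locally and transmitted once connectivity returns."""

    def __init__(self, max_events: int = 1000):
        # Bounded queue so a long outage cannot exhaust device memory;
        # the oldest events are silently dropped when the limit is hit.
        self._queue = deque(maxlen=max_events)

    def record(self, event: dict) -> None:
        """Queue an event regardless of current connectivity."""
        self._queue.append(event)

    def flush(self, send) -> int:
        """Try to transmit queued events with `send` (a callable that
        returns True on success); stop at the first failure so unsent
        events remain queued for the next attempt."""
        sent = 0
        while self._queue:
            if not send(self._queue[0]):
                break  # link still down; retry later
            self._queue.popleft()
            sent += 1
        return sent

buf = EventBuffer()
buf.record({"type": "capture", "species": "raccoon"})
buf.record({"type": "heartbeat"})
buf.flush(lambda e: False)         # link down: nothing sent, events retained
print(buf.flush(lambda e: True))   # link restored → 2
```

Bounding the queue is a deliberate trade-off: on a long outage the trap keeps the newest events (most relevant to the user) rather than crashing when storage fills.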
- An additional aspect of the smart trap is the utilization of various types of power.
- the present invention is connected to an electrical grid through an outlet, providing continuous power.
- the invention is connected to a deep-cycle battery, a generator, or other power source.
- the invention is powered by solar power.
- the smart trap may be equipped to receive a solar panel to collect sufficient power to sustain the power output needed for the trap.
- the smart trap may also have a battery connected to the solar power source to store and utilize power reserves when the solar power is insufficient.
- the present invention may also be configured to utilize other types of power such as hydro power, wind power, and other power sources.
- FIG. 1 is a side perspective view of the smart trap device with the trapping mechanism armed.
- FIG. 2 is a front perspective view of the smart trap device with the trapping mechanism armed.
- FIG. 3 is a side perspective view of the smart trap device with the trapping mechanism engaged to capture an animal.
- FIG. 4 is a back perspective view of the smart trap device with the trapping mechanism engaged to capture an animal.
- FIG. 5 is a depiction of a multitude of smart trap devices, forming a system of interconnected or interconnectable smart trap devices.
- FIG. 6 is a right side perspective view of an embodiment of the smart trap device wherein the containment component is made of a fine mesh metal screen material.
- FIG. 7 is a left side perspective view of an embodiment of the smart trap device wherein the containment component is made of a fine mesh metal screen material.
- FIG. 8 is a front perspective view of an embodiment of the smart trap device wherein the containment component is made of a fine mesh metal screen material.
- FIG. 9 is a depiction of the functional flow of the present invention.
- FIG. 10 is a flowchart of the artificial intelligence functionality of the present invention.
- FIG. 11 is a depiction of a multitude of traps used in a system and the communication flow between the individual traps, user, computer, and AI software.
- FIG. 12 is a flowchart of the AI logic of the present invention.
- the present invention 10 is a smart trap that has a containment component 20, a trapping mechanism 40, and a technology component 60.
- the containment component 20 has a top 22, a bottom 24, a first end 26, a second end 28, a first side 30, and a second side 32.
- the trapping mechanism 40 has an arming component 42, a trigger 44, and a sealing component 46.
- the technology component 60 has a connectivity component 64, a media component 66, and an AI component 68.
- the containment component 20 is designed to capture and retain a target animal 100.
- the first end 26 of the containment component 20 is opened to allow the animal 100 to enter the containment area.
- the trapping mechanism 40 may be engaged to close and entrap the animal 100 within the containment area.
- the arming component 42 of the trapping mechanism 40 readies the trap 10 to be engaged as needed.
- the trigger or sensor 44 of the trapping mechanism 40 signals the sealing component 46 of the trapping mechanism 40 to close and secure the animal 100 within the containment component 20.
- the trigger 44 may be a sensor, wire, pedal, or other type of device which controls the engagement or disengagement of the trapping mechanism 40.
- the arming component 42 may be a latch, hook, or other method of retaining the sealing component 46 in a desired position.
- the animal 100 may engage the trigger 44.
- the trigger 44 is activated by the AI software processing the sensor data. The engagement of the trigger 44 may prompt the arming component 42 to engage or disengage with the sealing component 46.
- the arming component 42 may force the sealing component 46 shut to trap the animal 100.
- the arming component 42 may release the sealing component 46, allowing gravity to shut the sealing component 46 to trap the animal 100.
- the technology component 60 processes information received from the media component 66.
- the information received from the media component 66 is processed in a computer of the technology component.
- the computer is a microcontroller or low-power, single-board computer.
- the computer may be housed in the technology component 60 of the trap 10.
- the media component 66 may be a camera, sensor, microphone, or other device used to collect information.
- the present invention 10 may have one or more media components 66 used to collect data. When a camera device is used, it may be positioned to provide an end, side, top, or bottom view. In the present embodiment, the camera is mounted at the top of the device to provide a top-view angle of the animal.
- the connectivity component 64 provides a method of sharing, transferring, or transmitting information remotely.
- the connectivity component 64 may utilize existing technology in the form of cellular technology, Wi-Fi technology, satellite communication, radio waves, or other existing methods of information transmission.
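Whichever transport the connectivity component 64 uses, the trap must serialize events into a message before transmission. A minimal sketch, assuming a JSON wire format and illustrative field names (none of which are specified in the disclosure):

```python
import json
import time

def build_trap_event(trap_id: str, event_type: str, payload: dict) -> str:
    """Serialize one trap event into compact JSON suitable for any of
    the transports above (cellular, Wi-Fi, satellite, radio)."""
    message = {
        "trap_id": trap_id,
        "event": event_type,            # e.g. "detection", "capture", "heartbeat"
        "timestamp": int(time.time()),  # seconds since the Unix epoch
        "payload": payload,
    }
    return json.dumps(message, separators=(",", ":"))

msg = build_trap_event("trap-07", "capture",
                       {"species": "raccoon", "confidence": 0.92})
print(json.loads(msg)["event"])  # capture
```

A compact, self-describing format like this suits low-bandwidth links such as satellite, where every byte transmitted has a cost.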
- the AI component 68 utilizes computer processing of information to make independent decisions based on the information or data collected.
- the AI component 68 of the present invention 10 processes information collected by the media components 66 to identify visiting animals.
- the AI component 68 may then engage the trigger 44 of the smart trap device 10 to close the trapping mechanism 40 if the animal has been identified as a target animal 100, or allow the device 10 to remain open if the animal is not a target animal 100.
- the containment component 20 is designed to hold and retain a target animal 100.
- the containment component 20 is made of metal bars which run vertically and horizontally, forming a mesh, along each of the sides 30, 32, ends 26, 28, top 22 and bottom 24 to retain a desired animal 100.
- the trapping mechanism 40 is raised with the arming component 42 to create an opening. The trapping mechanism 40 holds the sealing component 46 in place until a target animal 100 enters the trap device 10. Once a target animal 100 enters the trap device 10, the trigger 44 may be engaged to allow the sealing component 46 to close and trap the animal 100 within the containment component 20.
- the technology component 60 is housed and retained on the top side 22 of the device 10.
- the arming component 42 is connected to the sealing component 46 and the technology component 60, allowing the technology component 60 to control the movement of the sealing component 46 based on the data received from the media component(s) 66.
- the sealing component 46 may be engaged to seal the containment component 20 when a target animal has entered the containment area.
- FIG. 3 depicts a target animal 100 trapped within the containment component 20.
- the sealing component 46 is configured to actuate, fall, or otherwise move to close off the first end 26 of the trap device 10.
- the sealing component 46 of FIG. 3 is suspended in a substantially horizontal position before being triggered. Once triggered, a distal end of the sealing component 46 rotates downward about a proximal end, sealing the opening of the containment component 20.
- the closure of the trap device 10 by the sealing component 46 seals off all sides of the trap device 10 to capture the animal 100.
- the trap device 10 may be made of a mesh, glass, acrylic, plastic, or other materials used to retain and secure an animal.
- the containment area may be made of varying types of materials to best suit the environment in which the trap device 10 is being used in or the target animal 100.
- the sealing component 46 of the device 10 in FIG. 4 may be made of a plastic material which drops to seal the containment component 20.
- the technology component 60 depicted in FIG. 4 is located on the top 22 of the device 10.
- the AI component 68 of the present invention 10 monitors incoming data collected from sensors and media devices.
- the incoming data is processed and analyzed by the AI component 68 to determine if a visiting animal is a target animal 100. If the visiting animal is a target animal 100, the AI component 68 triggers the capture of the target animal 100. During capture, the AI component 68 continuously collects data from the animal and the surrounding environment.
- the trapping mechanism 40 is engaged once the AI component 68 has identified a target animal 100. If a non-target animal visits the trap, the AI component 68 will not trigger the trap 10, allowing the non-target animal to leave the area while the trap device 10 remains armed.
- the computer component of the technology component 60 utilizes the connectivity component 64 to send information to the user or users.
- the computer may send pictures, videos, sounds, weather information, and other information to the user.
- the user may also remotely control the trapping mechanism 40 through the computer and connectivity component 64.
- the user may remotely engage the trap, disengage the trap, or rearm the trap, maximizing the trap uptime.
- the smart trap 10 may be remotely armed, rearmed, or scheduled to be armed.
- the computer of the present invention 10 is configured to collect data continuously, periodically, or on demand. Data may be collected at all times of use of the trap, and some information may be collected prior to contact with an animal or target animal.
- the computer may collect data only once a sensor or media component 66 has been triggered by an animal. Alternatively, the computer may continuously collect and store all information gathered, as directed by the user.
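The collection behaviors just described (continuous, periodic, triggered) can be sketched as a single decision function; the mode names and parameters below are illustrative, not from the disclosure.

```python
import time

def should_collect(mode: str, *, triggered: bool = False,
                   user_request: bool = False,
                   last_collect: float = 0.0,
                   period_s: float = 60.0) -> bool:
    """Decide whether the computer should take and store a reading now,
    given the configured collection mode."""
    if mode == "continuous":
        return True                        # store everything gathered
    if mode == "periodic":
        return time.time() - last_collect >= period_s
    if mode == "triggered":
        # collect only on a sensor/media trip or an explicit user request
        return triggered or user_request
    raise ValueError(f"unknown collection mode: {mode}")

print(should_collect("continuous"))                 # True
print(should_collect("triggered", triggered=True))  # True
print(should_collect("triggered"))                  # False
```

Triggered collection is the natural default on battery or solar power, since continuous capture keeps the camera and radio drawing current.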
- the technology component may also have a platform 70.
- the platform 70 may provide a method for the user to store data, display the data in a mobile application or web-based format, or other methods of data presentation.
- the platform 70 may also allow the user to control the functions of the trap, such as opening or closing the sealing component 46, sorting and accessing data, managing alerts, or performing other functions or commands.
- FIG. 5 depicts a multitude of trap devices 10 used within a system.
- the computer of the present invention 10 may be used to facilitate data collection from multiple devices simultaneously.
- the sharing of information from the devices allows a user to obtain real-time data capture from a number of devices.
- the real-time data capture from the system of devices allows the AI component 68 and/or user to analyze various data points such as weather conditions, time of day, area of activity, and other data points to better inform trapping techniques, trapping locations, and animal behavior.
- Incoming data may be continuously analyzed and processed by the AI component 68, computer, or user through the connectivity component 64.
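Pooling events from many traps into system-wide statistics could look like the following sketch; the event record shape and field names are illustrative assumptions.

```python
from collections import Counter

def summarize_system(events: list) -> dict:
    """Aggregate event records from a system of traps into simple
    system-wide statistics: captures per trap and species counts."""
    captures_per_trap = Counter()
    species_seen = Counter()
    for event in events:
        if event["event"] == "capture":
            captures_per_trap[event["trap_id"]] += 1
            species_seen[event["species"]] += 1
    return {
        "total_captures": sum(captures_per_trap.values()),
        "captures_per_trap": dict(captures_per_trap),
        "species_seen": dict(species_seen),
    }

events = [
    {"trap_id": "A", "event": "capture", "species": "raccoon"},
    {"trap_id": "B", "event": "detection", "species": "deer"},
    {"trap_id": "A", "event": "capture", "species": "skunk"},
]
print(summarize_system(events)["total_captures"])  # 2
```

Statistics of this sort are what would feed the population estimates, trap-placement evaluation, and adaptive strategies described elsewhere in the disclosure.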
- the present invention 10 is powered by a power source.
- the power source may be a variety of existing sources of power such as electrical power, solar power, wind power, hydro power, or other forms of power.
- the present invention 10 may be powered by the power source continuously or may utilize a battery to store power for use when the power source is disconnected or not producing power.
- the power source is a solar panel and a battery.
- the present invention 10 may be utilized in addition to existing traps. For example, there currently exist mechanical traps which may be triggered once an animal enters the trap 10 and engages bait, a sensor, or another method of detecting that an animal is present.
- the present invention 10 may be utilized in conjunction with such a trap 10 to allow a user to remotely connect to the trap 10 to identify whether the animal is a target animal 100. The user may selectively and remotely open and rearm the trap 10 as needed.
- the present invention 10 may utilize existing trapping technology to control the trapping mechanism 40, collect data, and provide real-time information via the connectivity component 64.
- the analytics of the sensor data provide progress tracking, animal population estimates, informed and adaptive trapping strategies, evaluation of trap placement, and improvement of the AI.
- the smart trap device 10 reduces labor requirements, limits the impact to the ecosystem, increases trap uptime, increases the amount and quality of data collected, provides greater accessibility to remote locations, and allows continuous improvement through AI learning.
- FIGS. 6-8 depict an embodiment of the present invention 200.
- the containment component 20 is made of a fine mesh material, designed to capture specific types of target animals 100.
- the device 200 has an arming component 242 and a sealing component 246.
- the arming component 242 holds and retains the sealing component 246 while the trap 200 is armed and waiting for a target animal 100.
- the sealing component 246 is released or forced into a closed position, thereby trapping the target animal 100.
- FIG. 9 is a flowchart of the functional flow of the present invention.
- the trap 10 is placed in a location desirable to capture the target animal 100.
- the trap 10 is armed, waiting for the sensor or trigger to detect a target animal. Once detected, the trap 10 is triggered to capture the animal. If the capture fails, the user may remotely reset the trap 10 into an armed position. If the capture is a non-target capture, the user may remotely open the sealing component 46 and return the trap 10 to an armed position. If the capture is a target capture, the user may collect and retrieve the target animal 100.
- FIG. 10 is a flowchart of the typical user-experience functionality of the present invention 10.
- the user arms the trap 10.
- the sensors or triggers 44 detect an animal entering the trap.
- the AI component 68 determines whether the animal is a target animal 100. If the animal is determined to be a target animal 100, the AI component 68 engages the trigger 44 to close the trap, and the user is notified of the capture. If the AI component 68 determines the animal is not a target animal 100, the trap is not engaged, and the animal is free to exit the trap 10.
- FIG. 11 is a flowchart and information flow of a multitude of traps used in a system.
- the traps 10 in the system provide data to the platform 70 in the form of image and sensor data, trap closures, or other event notifications (which include success or failure responses to user commands and trap activity, such as captures).
- the platform 70 presents sortable, readable data in a mobile application, allowing users to access information such as photos, capture data, trap status, alerts, and environmental data, and provides notification alerts.
- the mobile application allows the user to provide direct commands to the traps 10 such as remotely opening or closing the traps 10.
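On the trap side, each remote command must be applied and acknowledged, matching the success/failure responses mentioned for FIG. 11. A minimal sketch, assuming three command names and a simple state dictionary, neither of which is specified in the disclosure:

```python
def handle_command(trap_state: dict, command: str) -> dict:
    """Apply a remote user command to the trap's state and return an
    acknowledgement the platform can relay back to the mobile app."""
    if command == "open":
        trap_state.update(door="open", armed=False)     # release a non-target
    elif command == "close":
        trap_state.update(door="closed", armed=False)   # manual capture
    elif command == "rearm":
        trap_state.update(door="open", armed=True)      # ready for next animal
    else:
        return {"ok": False, "error": f"unknown command: {command}"}
    return {"ok": True, "state": dict(trap_state)}

state = {"door": "closed", "armed": False}
print(handle_command(state, "rearm")["ok"])  # True
print(handle_command(state, "fly")["ok"])    # False
```

Returning the full post-command state in the acknowledgement lets the mobile app display trap status without a separate query.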
- FIG. 12 is a flowchart of the artificial intelligence logic of the present invention 10.
- the sensors 44 collect data when an animal enters the trap.
- the AI model assesses a plurality of features and data points of the animal.
- the AI model then produces an AI confidence score.
- the AI confidence score is then used to determine whether the animal is a target animal 100 or a non-target animal. If the AI confidence score meets the established threshold, the trap 10 is triggered to close. If the AI confidence score does not meet the established threshold, the trap 10 remains open.
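The confidence-threshold logic of FIG. 12 reduces to a small comparison. The sketch below assumes a classifier that emits a species label and a score in [0, 1]; the names and the 0.85 threshold are illustrative, not values from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """One inference result for an animal observed by the trap sensors."""
    species: str       # label produced by the (hypothetical) AI model
    confidence: float  # model confidence score in [0.0, 1.0]

def should_trigger(detection: Detection, target_species: str,
                   threshold: float = 0.85) -> bool:
    """Close the trap only when the model identifies the target species
    with a confidence score at or above the established threshold."""
    return (detection.species == target_species
            and detection.confidence >= threshold)

# A confident target detection closes the trap ...
print(should_trigger(Detection("raccoon", 0.92), "raccoon"))    # True
# ... while non-target or low-confidence detections leave it armed.
print(should_trigger(Detection("house_cat", 0.97), "raccoon"))  # False
print(should_trigger(Detection("raccoon", 0.60), "raccoon"))    # False
```

The threshold sets the trade-off the disclosure cares about: raising it reduces non-target captures at the cost of occasionally letting a true target walk away.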
- the computer of the present invention 10 has a processor coupled to the sensors and to the camera.
- the processor receives data from the sensors and images from the camera.
- the processor also sends and receives information to and from controllers that are coupled to the environment control devices through the communication device.
- the environment control devices may include, but are not limited to, devices that may allow for control of the trap 10 features, such as the adjustment of the camera, opening or closing of the trap mechanism, and other features.
- the processor is a representation of a component of the device employed to carry out the method of the present invention 10, which is to begin the process for collecting information and controlling the trapping device.
- the method includes the step of collecting data from the sensors and the camera, which collected data is selectable by the user.
- the method also includes the step of collecting information from one or more operators about other selectable conditions in the environment.
- the method further includes the step of training one or more computer programs or AI models carried out by the processor based on the combination of the sensed information and the identification and trapping of target animals. That trained functionality is employed to actuate one or more of the control devices.
- the one or more computer programs further learn from that activity to determine resultant sensed information that is iteratively employed to resolve whether further steps are required, including the modification of prior actuation steps.
- the processor in the form of one or more computing devices combines physical hardware structures with software that may include firmware and middleware for the purpose of executing instructions that produce the actions described herein. It is to be understood that the computing device or devices suitable for performing the functions of the system to instantiate artificial intelligence functionality as desired include, but are not limited to, desktop computers, laptops, tablets, microcontrollers, single and multi-board computers, cloud computing resources, and mobile devices including smartphones, for example.
- a computing device described herein may be any type of device having a processor capable of carrying out instructions associated with one or more computer applications.
- the devices may contain or be connected to one or more databases of other devices wherein the one or more databases include information related to the invention.
- the database may include a library of information associated with one or more of the sensors and information about actions performed by the one or more devices.
- the one or more databases may be populated and updated with information by authorized users and attached functions.
- program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
- the present invention 10 can be implemented in distributed computing environments where tasks are performed by remote processing devices that are linked through one or more data transmission media including through a communication device.
- program function modules and other data may be located in both local and remote device storage media including memory storage devices.
- the processor, interactive drives, memory storage devices, databases and peripherals, such as signal exchange components, of a particular device may be interconnected through one or more electrical buses or cloud services.
- the one or more buses may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures.
- bus architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus.
- the interactive drives include one or more interfaces to couple to an Al-based apparatus, which may be or include computer processing hardware and programming.
- the interactive drives are configured to exchange information with the Al apparatus, including delivery of instructions designed to ensure actuation functions are performed.
- Each of the devices of the system of the present invention 10 may include one or more different computer readable media.
- Computer readable media can be any available media that can be accessed by the processor and includes both volatile and non-volatile media, removable and non-removable media.
- Computer readable media may be computer storage media and/or communication media.
- Computer storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data.
- Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information, and which can be accessed by the computer system.
- Each of the devices may further include computer storage media in the form of volatile and/or non-volatile memory such as Read Only Memory (ROM) and Random Access Memory (RAM).
- RAM typically contains data and/or program modules that are accessible to and/or operated on by the processor. That is, RAM may include application programs, such as the functions of the present invention 10, and information in the form of data.
- the devices may also include other removable/non-removable, volatile/non-volatile computer storage and access media.
- a device may include a hard disk drive or solid state drive to read from and/or write to non-removable, non-volatile magnetic media, a magnetic disk drive to read from and/or write to a removable, non-volatile magnetic disk, and an optical disk drive to read from and/or write to a removable, non-volatile optical disk, such as a CD-ROM or other optical media.
- Other removable/non-removable, volatile/non-volatile computer storage media that can be used in the devices to perform the functional steps associated with the system and method of the present invention 10 include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
- the drives and their associated computer storage media described above provide storage of computer readable instructions, data structures, program modules and other data for the processor.
- a user may enter commands and information into the processor through input devices such as keyboards and pointing devices, such as a mouse, a trackball, a touch pad or a touch screen.
- Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like.
- These and other input devices are connected to the processor through the system bus or other bus structures, such as a parallel port or a universal serial bus (USB), but are not limited thereto.
- a monitor or other type of display device is also connected to the processor through the system bus or other bus arrangement.
- Such computer program product may include computer-readable signals tangibly embodied on the computer-readable medium, where such signals define instructions, for example, as part of one or more programs that, as a result of being executed by the processor, instruct the processor to perform one or more of the functions or acts described herein, and/or various examples, variations and combinations thereof.
- Such instructions may be written in any of a plurality of programming languages, for example, JavaScript, Java, Python, Visual Basic, C, C++, XML, HTML and the like, or any of a variety of combinations thereof.
- All the data aggregated and stored in the database or databases may be managed under an RDBMS, for example Oracle, MySQL, Access, PostgreSQL and the like, or any of a variety of combinations thereof.
- the RDBMS may interface with any web-based or program-driven applications written in any compatible programming languages including PHP, HTML, XML, Java, AJAX and the like, or any of a variety of combinations thereof.
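As a non-limiting sketch of how such a database of trap events might be structured (using Python's built-in sqlite3 module in place of the RDBMSs named above; the table and column names are hypothetical, not taken from the specification):

```python
import sqlite3

# Hypothetical schema for storing sensor events reported by one or more traps.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE trap_events (
        id          INTEGER PRIMARY KEY,
        trap_id     TEXT NOT NULL,      -- which trap reported the event
        captured_at TEXT NOT NULL,      -- ISO-8601 timestamp
        sensor      TEXT NOT NULL,      -- e.g. 'camera', 'pressure_mat'
        label       TEXT,               -- AI classification: target/empty/other
        confidence  REAL                -- classifier confidence score, 0.0-1.0
    )
""")
conn.execute(
    "INSERT INTO trap_events (trap_id, captured_at, sensor, label, confidence) "
    "VALUES (?, ?, ?, ?, ?)",
    ("trap-01", "2024-09-18T06:30:00Z", "camera", "target", 0.93),
)
conn.commit()

# Query high-confidence events, as a reporting layer might:
row = conn.execute(
    "SELECT trap_id, label, confidence FROM trap_events WHERE confidence > 0.9"
).fetchone()
print(row)
```

A deployed system could expose the same queries through the web-based applications described above.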
- the computer-readable medium on which such instructions are stored may reside on one or more of the components described above and may be distributed across one or more such components.
- the method implemented through the system described herein includes the step of establishing desired Al architectures through computer programming corresponding to the sensing and actuation steps described herein.
- the system is programmable and controllable through a control station, which may be a physical station, a dashboard representation on a computing device, a mobile application, a web browser, or any combination of these.
- the control station includes three primary control operation types: ranges of sensed values and actuation operations, switches, and images. The ranges allow various settings of the present invention 10 to be set remotely, as well as show the current settings of the trapping device.
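The ranges and switches described above can be sketched as a remote-settings model; the field names and limits below are illustrative assumptions, not part of the specification:

```python
from dataclasses import dataclass

# Hypothetical remote-settings model for the control station.
@dataclass
class TrapSettings:
    confidence_threshold: float = 0.85  # range control: 0.0-1.0
    photo_interval_s: int = 60          # range control: seconds between photos
    armed: bool = False                 # switch control: trap armed/disarmed
    alerts_enabled: bool = True         # switch control: user notifications

    def validate(self) -> None:
        # Reject out-of-range values before they are applied to the trap.
        if not 0.0 <= self.confidence_threshold <= 1.0:
            raise ValueError("confidence_threshold must be within 0.0-1.0")
        if self.photo_interval_s <= 0:
            raise ValueError("photo_interval_s must be positive")

settings = TrapSettings(confidence_threshold=0.9, armed=True)
settings.validate()
print(settings.armed)
```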
- the onboard Al continually receives and processes data from the sensors to provide information and reporting to the user, as well as improve trapping functions.
- the Al functionality may be located within the trap device 10 itself, providing continued functionality, learning, and data collection regardless of connectivity.
- FIG. 20 depicts the flow of communication between the users, platform, mobile application, computers, Al software, and traps. The communication may be in various forms such as data files, SMS notifications, user commands, API calls, and other forms of information communication.
- FIG. 21 is a flowchart of the Al logic. The flowchart depicts the continuous data collection and learning of the Al software of the present device.
- the Al component 68 may be established through a Convolutional Neural Network (CNN) or other processing system that "learns" the target animal characteristics, employs that learned information to activate or not activate the trap 10 and, if that was a wrong decision, learns from it to gain a better understanding of the target and required action so as not to make the same mistake again.
- the Al “learning” function will typically proceed through the following steps: (1) collect training data (images, sounds, etc.); (2) label the training data (specific to the present invention: target, empty, other); (3) run the data through a convolutional neural network, generally leveraging a pre-trained CNN and then "transferring that learning" to the model and context given in the training data; (4) generate a model that is specifically optimized for the microcontroller used on the device; (5) run the model on the device against sensor data to "classify" the data in real time with a confidence score, which is used to determine if a target is present.
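Step (5) above — turning model outputs into a confidence-scored classification — can be sketched as follows. The softmax scoring is an illustrative stand-in for the output of the on-device CNN, and the function names and 0.8 threshold are assumptions, not part of the specification:

```python
import math

LABELS = ("target", "empty", "other")  # label set from the training step

def softmax(scores):
    """Convert raw model scores into probabilities summing to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def classify(raw_scores, threshold=0.8):
    """Return (label, confidence, is_target) for one sensor reading.

    `raw_scores` stands in for the output of the generated model; a real
    deployment would obtain these from an embedded inference runtime.
    """
    probs = softmax(raw_scores)
    confidence = max(probs)
    label = LABELS[probs.index(confidence)]
    is_target = label == "target" and confidence >= threshold
    return label, confidence, is_target

# Scores strongly favouring the first ("target") class:
label, conf, fire = classify([4.0, 0.5, 0.2])
print(label, fire)  # target True
```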
- the system generates custom models, optimized for each target based on our data collection approach.
- the data collection approach is to utilize targets in the trap 10 and take large amounts of pictures in varying conditions, with various objects in the trap, with and without objects located in the trap, with various lighting conditions, as well as random objects and potential bycatch. This provides a custom model to identify target animals.
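A non-limiting sketch of how such a varied-condition image collection might be recorded for training; the file names, labels, and condition fields are hypothetical:

```python
# Hypothetical labeling manifest: each captured image is tagged with its
# label and capture conditions so a custom per-target model can be trained.
samples = [
    {"file": "img_0001.jpg", "label": "target", "lighting": "dawn",  "bait": True},
    {"file": "img_0002.jpg", "label": "empty",  "lighting": "night", "bait": True},
    {"file": "img_0003.jpg", "label": "other",  "lighting": "day",   "bait": False},
]

# Simple class-balance check before training:
counts = {}
for s in samples:
    counts[s["label"]] = counts.get(s["label"], 0) + 1
print(counts)
```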
- the computer processing and Al framework is constantly improving the model by sending collected images to our Al training pipeline for continuous improvement and better classification.
Abstract
The present invention is a smart trapping device. The device has an application platform allowing the user to remotely interact with traps and analyze data collected from the traps. The smart trapping device gathers information about a visiting animal, ambient environment, and other pertinent information via sensors such as cameras and pressure pads. The artificial intelligence software is capable of identifying and assessing whether an animal is a target animal. If the animal is a target animal, the software may trigger the device to capture the animal. The application platform also has capabilities to alert a user of a capture or visitor and provide information to the user about the animal and the nature of the visit.
Description
SMART TRAP AND MONITORING DEVICE
BACKGROUND OF THE INVENTION
1. Field of the Invention
[0001] The present invention relates to a smart device for trapping animals with remote connection capabilities. Specifically, the present invention is an animal trap which may be accessed and monitored remotely with wireless, satellite, or other remote-connection capabilities. Still more specifically, the present invention relates to a smart trap which identifies types or species of animals and triggers the trapping device to capture said animals. The device then sends photos, videos, sounds, or other data to the user to determine if the correct animal has been captured, which the user may then selectively and remotely release, or choose to retrieve said animal.
2. Description of the Prior Art
[0002] There currently exist various types of animal traps like snares, box traps, corral traps, and other types of animal traps. Typically, traps are designed to target certain types or species of animals. For example, corral traps are designed to capture and contain hogs. The design allows a group of hogs to enter the corral, trapping the group to allow for removal or relocation. Snares may be suited for more individualistic animals which travel in certain predictable pathways like foxes, squirrels, chipmunks, or other such animals. Many other types of traps exist, such as deadfall traps, pit traps, foothold traps, and spring traps, with each having certain benefits and disadvantages.
[0003] In addition to being better suited for different types of animals, traps also may be better suited for various tasks. For example, some traps are designed for lethal use on certain animals, while other traps are designed to capture an animal, unharmed, for study, relocation, or other purposes. Additionally, with growing awareness of the treatment of animals, the need for humane trapping continues to grow. Many types of existing traps, like snares or foothold traps, are considered inhumane because they cause pain, suffering, and potential death for the animals. Also, these traps may or may not be selective enough in capturing a specific type or species of animal.
[0004] Typically, box or cage traps are considered humane and are widely used. These traps are designed and sized appropriately to be baited for a certain animal. When the animal enters the box or cage, the trap has a trigger which the animal may trip, resulting in a door, latch, cage, or other mechanism closing in and prohibiting the animal from escaping. These traps are considered more humane as the trap is far less likely to cause pain or damage to the animal during trapping and containment. These types of traps are typically used to trap animals such as foxes, racoons, skunks, and other such animals for relocation. These traps are typically able to be reused multiple times and for different types of animals.
[0005] Existing animal traps have limited use and applicability across different types and species of animals. What is needed is a humane trapping device which operates autonomously and allows a user to control the trap remotely. The trapping device should have the capability to identify certain traits of the animal and selectively trap an animal with desired traits. This identification of traits may be done using motion sensors, infrared sensors, cameras, pressure mats, or other forms of selectively identifying an animal or traits of an animal. Additionally, the device should be able to selectively engage or disengage the trapping function based on the information obtained through such sensors, cameras, or other information-gathering tools.
[0006] What is needed is a smart trapping device which can operate autonomously and use artificial intelligence (“Al”) software that can identify certain animal characteristics and selectively trap an animal. The Al should be able to identify whether a certain animal is the target animal or a non-target animal. The Al software should then be able to engage the trapping device to humanely trap the animal if it is the target animal. Alternatively, if the animal is not the target animal, the Al software should be able to leave the trap ready for future capture of a target animal. Further, if an animal is trapped the Al software should be able to alert a user and provide them with the media necessary (like a picture, video, sound, or other form of media) which allows the user to make a decision on whether or not to collect or release the animal.
SUMMARY OF THE INVENTION
[0006] The present invention relates to a smart trap device. The smart trap device has a containment area, a trapping mechanism, and a technology component. The containment area may be of varying size, shape, and materials, depending on the target animal. For example, many existing traps use a metal cage of sufficient size to trap animals such as racoons, skunks, and other similar-sized animals. The trapping mechanism may be a door, latch, covering, or certain type of movement of the containment area. For example, the trap may have a door which utilizes gravity to close once an animal has entered the containment area and triggers a sensor which allows the door to close. The sensor collects information about the animal, which is then analyzed by Al software to determine whether the animal is a potential target animal. Alternatively, the containment area may be raised with an opening, such that the opening is aligned with the sensor, sensors, or Al target area. When the animal trips the trigger and is identified as a target animal by the Al software, the containment area may drop onto the animal. The trapping mechanism may be customized depending on the type of target animal. When trapping only the target animal, as opposed to any animal visiting the trap, the smart trap reduces the negative impacts to the ecosystem by reducing the number of required human visits to the trap site, reduces or eliminates the chances of trapping a non-target animal, and provides less disruption to the environment.
[0007] The technology component has a connectivity feature, a sensing feature, a data capture feature, an analytical feature, and a power feature. The connectivity feature provides a remote control interface which allows the user to control and monitor the operation of the smart trap from a remote location. The monitoring and control feature of the present invention may be used to manage a singular trap or a multitude of traps. Additionally, the connectivity feature may allow for the interconnection of access and control of multiple traps to establish a web of data collection from each of the various traps within the system. The connectivity feature may be in the form of cellular, satellite, or other wireless method of remote connection. The sensing feature may be a motion sensor, infrared sensor, pressure mat, camera, microphone, other device, or any combination thereof that is used to detect an animal. The data capture feature may be in the form of a camera, microphone, or other method of collecting data such as a picture, sound, pressure, or other data. The analytical feature may be an Al feature which analyzes data captured by the sensing features to identify and act on the incoming data. The analytical feature is used to locate, identify, and make decisions based on the data received. The decision-making process of the analytical feature is used to determine whether the data will be saved or communicated to the user, or whether to activate the trap. The analytical feature of the technology component utilizes Al programming with customized Al models to determine whether the animal is the target animal, thus triggering the trapping feature to capture the animal. With the remote monitoring feature combined with the Al component, the user may assess when they need to return to the trap to retrieve the target animal or re-bait the trap, thus reducing the labor and maintenance requirement. The data may be analyzed through a custom user interface or data exports to inform several aspects of trapping and utilization of the present invention such as tracking progress of the trap(s), estimating animal populations, informing adaptive trapping strategies, evaluating trap placement, and improving Al functionality. The information may also be used to inform users when maintenance or service of the trap is required.
[0008] The sensing feature may be a type of sensor which is used to detect and identify an animal. The sensing feature may be a camera, pressure sensor, motion sensor, or other type of sensor. The input and data from the sensor collects information which may trigger the trap. The user may be provided with this information, such as a picture, video, sound, pressure mat reading, or other information collected from the sensor which allows the user to remotely control the trap. The user may review the data collected by the sensor(s). The data may be used to make a decision on whether or not to release the captured animal, trigger the capture mechanism, or utilize the information for other purposes. The information provided to the user is in the form of a wireless, satellite, or other remote connectivity methods. The user is not required to be within a certain proximity of the trap and so may remotely connect to the device. The remote connectivity to the device and the autonomous Al component allows for better utilization for remote trapping sites. Typically, remote trapping sites are difficult to monitor and optimize, as well as service and maintain.
[0009] In an embodiment, the device has a camera equipped to collect photographs, videos, and/or sounds of the trap and surrounding area. The camera is used to identify animals interacting with the trap to ensure the target animal is being trapped. Oftentimes, this includes various types of recognition software including Al software. The camera may be used to capture pictures, videos, and sounds of the target animal to provide the user with information related to the trapping of the animal. The location of the camera may be critical to the functionality and information capture. In some embodiments of the present invention, the camera may be located at the top of the trap with an overhead view of the inside of the trap and the trapped animal. The overhead view from the camera allows for the collection of information about the trapped animal. The consistent distance and angle allow for better estimation of the size and other characteristics of the animal.
[0010] The present invention utilizes Al software as a method of selectively collecting and analyzing data and utilizing this data for trapping purposes. The collection and analysis of the data may include capturing real-time information such as weather conditions, types and frequency of animals visiting the trap, activities during visitation, and any other collectible data. The data may then be analyzed by the Al software for a multitude of purposes, such as identifying the types, species, and number of animals visiting and the date, time, and patterns of when animals are visiting, and may be used to further improve the performance of the Al software. Specifically, the improved data collection of the present invention may include the exact timing of capture, impact of weather or other environmental conditions on captures, detection of other species, analysis of trapping strategies and trap placement, real-time analytics and reporting, integration with other databases, and the ability to export data. Additionally, this information may be further utilized by users to inform and develop adaptive real-time trapping strategies, track progress of the trap or system of traps, and create reports. The Al software may alert the user of certain conditions, characteristics, or other information. The Al component of the smart trap may be functional at a remote location, and communication with the data-collecting elements is completed via the connectivity elements as previously described. Alternatively, the Al component may operate from a computer located on, in, or in close proximity to the trap. In this method, the trap may have periods of interrupted connectivity, but the Al and computer portion of the smart trap may continue to function on a standalone basis while waiting for the connectivity to be reestablished.
Further, the Al component may be capable of making continuous improvements through data received, analysis of the data, application improvements, real-time response to analytics, and additional development of technology.
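The standalone-while-disconnected behavior described above can be sketched as a store-and-forward buffer: the onboard Al keeps classifying and queuing results, and queued reports are flushed once connectivity is reestablished. The class and method names below are illustrative assumptions:

```python
from collections import deque

class StoreAndForward:
    """Buffer outbound reports while connectivity is down (sketch)."""

    def __init__(self):
        self.queue = deque()  # reports held during an outage
        self.sent = []        # reports delivered to the remote platform
        self.online = False

    def report(self, event):
        if self.online:
            self.sent.append(event)   # transmit immediately
        else:
            self.queue.append(event)  # hold until reconnection

    def reconnect(self):
        self.online = True
        while self.queue:             # flush backlog in arrival order
            self.sent.append(self.queue.popleft())

link = StoreAndForward()
link.report({"label": "target", "confidence": 0.93})  # queued while offline
link.reconnect()                                      # backlog delivered
print(len(link.sent))
```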
[0011] An additional aspect of the smart trap is the utilization of various types of power. In an embodiment, the present invention is connected to an electrical grid through an outlet, providing continuous power. In another embodiment, the invention is connected to a deep-cycle battery, a generator, or other power source. In yet another embodiment, the invention is powered by solar power. The smart trap may be equipped to receive a solar panel to collect sufficient power to sustain the power output needed for the trap. The smart trap may also have a battery connected to the solar power source to store and utilize power reserves when the solar power is insufficient. In addition to the types of power described here, the present invention may also be configured to utilize other types of power such as hydro power, wind power, and other power sources.
BRIEF DESCRIPTION OF THE DRAWINGS
[0012] FIG. 1 is a side perspective view of the smart trap device with the trapping mechanism armed.
[0013] FIG. 2 is a front perspective view of the smart trap device with the trapping mechanism armed.
[0014] FIG. 3 is a side perspective view of the smart trap device with the trapping mechanism engaged to capture an animal.
[0015] FIG. 4 is a back perspective view of the smart trap device with the trapping mechanism engaged to capture an animal.
[0016] FIG. 5 is a depiction of a multitude of smart trap devices, forming a system of interconnected or interconnectable smart trap devices.
[0017] FIG. 6 is a right side perspective view of an embodiment of the smart trap device wherein the containment component is made of a fine mesh metal screen material.
[0018] FIG. 7 is a left side perspective view of an embodiment of the smart trap device wherein the containment component is made of a fine mesh metal screen material.
[0019] FIG. 8 is a front perspective view of an embodiment of the smart trap device wherein the containment component is made of a fine mesh metal screen material.
[0020] FIG. 9 is a depiction of the functional flow of the present invention.
[0021] FIG. 10 is a flowchart of the artificial intelligence functionality of the present invention.
[0022] FIG. 11 is a depiction of a multitude of traps used in a system and the communication flow between the individual traps, user, computer, and Al software.
[0023] FIG. 12 is a flowchart of the Al logic of the present invention.
DETAILED DESCRIPTION OF THE INVENTION
[0024] The present invention 10 is a smart trap that has a containment component 20, a trapping mechanism 40, and a technology component 60. The containment component 20 has a top 22, a bottom 24, a first end 26, a second end 28, a first side 30, and a second side 32. The trapping mechanism 40 has an arming component 42, a trigger 44, and a sealing component 46. The technology component 60 has a connectivity component 64, a media component 66, and an Al component 68.
[0025] The containment component 20 is designed to capture and retain a target animal 100. The first end 26 of the containment component 20 is opened to allow the animal 100 to enter the containment area. Once the animal 100 is in the containment area, the trapping mechanism 40 may be engaged to close and entrap the animal 100 within the containment area. The arming component 42 of the trapping mechanism 40 readies the trap 10 to be engaged as needed. The trigger or sensor 44 of the trapping mechanism 40 signals the sealing component 46 of the trapping mechanism 40 to close and secure the animal 100 within the containment component 20. The trigger 44 may be a sensor, wire, pedal, or other type of device which controls the engagement or disengagement of the trapping mechanism 40. The arming component 42 may be a latch, hook, or other method of retaining the sealing component 46 in a desired position. When an animal 100 enters the trap 10 through an opening, the animal 100 may engage the trigger 44. In an embodiment, the trigger 44 is activated by the Al software processing the sensor data. The engagement of the trigger 44 may prompt the arming component 42 to engage or disengage with the sealing component 46. In an embodiment, the arming component 42 may force the sealing component 46 shut to trap the animal 100. Alternatively, in another embodiment, the arming component 42 may release the sealing component 46, allowing gravity to shut the sealing component 46 to trap the animal 100.
[0025] The technology component 60 processes information received from the media component 66. The information received from the media component 66 is processed in a computer of the technology component. In an embodiment, the computer is a microcontroller or low-power, single-board computer. The computer may be housed in the technology component 60 of the trap 10. The media component 66 may be a camera, sensor, microphone, or other device used to collect information. The present invention 10 may have one or more media components 66 used to collect data. When a camera device is used, the location of the device may be from an end view, side view, top or bottom view. In the present embodiment, the camera view is from the top of the device to provide a top-view angle of the animal. The connectivity component 64 provides a method of sharing, transferring, or transmitting information remotely. The connectivity component 64 may utilize existing technology in the form of cellular technology, Wi-Fi technology, satellite communication, radio waves, or other existing methods of information transmission. The Al component 68 utilizes computer processing to make independent decisions based on the information or data collected. The Al component 68 of the present invention 10 processes information collected by the media components 66 to identify visiting animals. The Al component 68 may then engage the trigger 44 of the smart trap device 10 to close the trapping mechanism 40 if the animal has been identified as a target animal 100, or to allow the device 10 to remain open if the animal is not a target animal 100.
[0026] As depicted in FIGS. 1-4, the containment component 20 is designed to hold and retain a target animal 100. In the present embodiment, the containment component 20 is made of metal bars which run vertically and horizontally, forming a mesh, along each of the sides 30, 32, ends 26, 28, top 22 and bottom 24 to retain a desired animal 100. In the present embodiment, the trapping mechanism 40 is raised with the arming component 42 to create an opening. The trapping mechanism 40 holds the sealing component 46 in place for when a target animal 100 enters the trap device 10. Once a target animal 100 enters the trap device 10, the trigger 44 may be engaged to allow the sealing component 46 to close and trap the animal 100 within the containment component 20. As depicted in FIG. 1, the technology component 60 is housed and retained on the top side 22 of the device 10. The arming component 42 is connected to the sealing component 46 and the technology component 60, allowing the technology component 60 to control the movement of the sealing component 46 based on the data received from the media component(s) 66.
[0027] As depicted in FIG. 3, the sealing component 46 may be engaged to seal the containment component 20 when a target animal has entered the containment area. FIG. 3 depicts a target animal 100 trapped within the containment component 20. The sealing component 46 is configured to actuate, fall, or otherwise move to close off the first end 26 of the trap device 10. The sealing component 46 of FIG. 3 is suspended in a substantially horizontal position before being triggered. Once triggered, a distal end of the sealing component 46 rotates downward about a proximal end of the sealing component 46, sealing the opening of the containment component 20. The closure of the trap device 10 by the sealing component 46 seals off all sides of the trap device 10 to capture the animal 100.
[0028] As depicted in FIG. 4, the trap device 10 may be made of a mesh, glass, acrylic, plastic, or other materials used to retain and secure an animal. The containment area may be made of varying types of materials to best suit the environment in which the trap device 10 is being used or the target animal 100. The sealing component 46 of the device 10 in FIG. 4 may be made of a plastic material which drops to seal the containment component 20. The technology component 60 depicted in FIG. 4 is located on the top 22 of the device 10.
[0029] The Al component 68 of the present invention 10 monitors incoming data collected from sensors and media devices. The incoming data is processed and analyzed by the Al component 68 to determine if a visiting animal is a target animal 100. If the visiting animal is a target animal 100, the Al component 68 triggers the capture of the target animal 100. During capture, the Al component 68 continuously collects data from the animal and surrounding environment. The trapping mechanism 40 is engaged once the Al component 68 has identified a target animal 100. If a non-target animal visits the trap, the Al component 68 will not trigger the trap 10, allowing the non-target animal to leave the area while the trap device 10 remains armed.
Additionally, the computer component of the technology component 60 utilizes the connectivity component 64 to send information to the user or users. The computer may send pictures, videos, sounds, weather information, and other information to the user. In addition to the Al component 68 control of the trapping mechanism 40, the user may also remotely control the trapping mechanism 40 through the computer and connectivity component 64. The user may remotely engage the trap, disengage the trap, or rearm the trap, maximizing the trap uptime. The smart trap 10 may be remotely armed, rearmed, or scheduled to be armed. The computer of the present invention 10 is configured to collect data continuously, periodically, or selectably. Data may be collected at all times of use of the trap. Some information may be collected prior to contact with an animal or target animal. The computer may only collect data once a sensor or media component 66 has been triggered by an animal. Alternatively, the computer may continuously collect and store all information gathered as directed by the user. The technology component may also have a platform 70. The platform 70 may provide a method for the user to store data, display the data in a mobile application or web-based format, or other methods of data presentation. The platform 70 may also allow the user to control the functions of the trap, such as opening or closing the sealing component 46, sorting and accessing data, managing alerts, or performing other functions or commands.
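The arm, trigger, and release-and-rearm cycle described above can be sketched as a small state machine; the state names, method names, and 0.85 confidence threshold are illustrative assumptions, not part of the specification:

```python
from enum import Enum

class TrapState(Enum):
    DISARMED = "disarmed"
    ARMED = "armed"
    TRIGGERED = "triggered"

class SmartTrap:
    """Sketch of the remote arm/trigger/rearm cycle."""

    def __init__(self):
        self.state = TrapState.DISARMED

    def arm(self):
        # Remote or scheduled command: ready the trapping mechanism.
        if self.state is TrapState.DISARMED:
            self.state = TrapState.ARMED

    def on_classification(self, label, confidence, threshold=0.85):
        # Only an armed trap fires, and only for a confident target match.
        if (self.state is TrapState.ARMED
                and label == "target" and confidence >= threshold):
            self.state = TrapState.TRIGGERED

    def release_and_rearm(self):
        # Remote command: open the sealing component, then re-arm.
        if self.state is TrapState.TRIGGERED:
            self.state = TrapState.ARMED

trap = SmartTrap()
trap.arm()
trap.on_classification("other", 0.99)   # non-target: trap stays armed
trap.on_classification("target", 0.93)  # target: trap fires
print(trap.state.value)  # triggered
```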
[0030] FIG. 5 depicts a multitude of trap devices 10 used within a system. The computer of the present invention 10 may be used to facilitate data collection from multiple devices simultaneously. The sharing of information from the devices allows a user to obtain real-time data capture from a number of devices. The real-time data capture from the system of devices allows the AI component 68 and/or user to analyze various data points such as weather conditions, time of day, area of activity, and other data points to better inform trapping techniques, trapping locations, and animal behavior. Incoming data may be continuously analyzed and processed by the AI component 68, computer, or user through the connectivity component 64.
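The multi-trap analysis described above amounts to aggregating event streams from many devices. A minimal sketch, assuming a simple `(trap_id, hour, species)` tuple as the stand-in for each reported event (the patent does not specify a data format):

```python
from collections import defaultdict

def summarize_activity(events):
    """Count animal-activity events from many traps by hour of day.

    `events` is an iterable of (trap_id, hour, species) tuples — an
    assumed stand-in for the real-time data each device reports.
    Grouping by hour is one of the "time of day" analyses mentioned
    in the paragraph above.
    """
    by_hour = defaultdict(int)
    for trap_id, hour, species in events:
        by_hour[hour] += 1
    return dict(by_hour)

events = [("trap-1", 22, "feral_hog"),
          ("trap-2", 22, "feral_hog"),
          ("trap-1", 3, "raccoon")]
print(summarize_activity(events))  # {22: 2, 3: 1}
```

The same grouping pattern extends to weather conditions or location, informing trap placement decisions.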
[0031] The present invention 10 is powered by a power source. The power source may be a variety of existing sources of power such as electrical power, solar power, wind power, hydro power, or other forms of power. The present invention 10 may be powered by the power source continuously or may utilize a battery to store power for use when the power source is disconnected or not producing power. In some embodiments, the power source is a solar panel and a battery.
[0032] The present invention 10 may be utilized in addition to existing traps. For example, there currently exist mechanical traps which may be triggered once an animal enters the trap 10 and engages bait, a sensor, or other method of detecting an animal is present. The present invention 10 may be utilized in conjunction with such a trap 10 to allow a user to remotely connect to the trap 10 to identify whether the animal is a target animal 100. The user may selectively and remotely open and rearm the trap 10 as needed. The present invention 10 may utilize existing trapping technology to control the trapping mechanism 40, collect data, and provide real-time information via the connectivity component 64.
[0033] The analytics of the sensor data provides progress tracking, animal population estimates, informed and adaptive trapping strategies, evaluation of trap placement, and improvement of the AI. The smart trap device 10 reduces labor requirements, limits the impact to the ecosystem, increases trap uptime, increases the amount and quality of data collected, provides greater accessibility to remote locations, and allows continuous improvement through AI learning.
[0034] FIGS. 6-8 depict an embodiment of the present invention 200. The containment component 20 is made of a fine mesh material, designed to capture specific types of target animals 100. In the embodiment of FIGS. 6-8, the device 200 has an arming component 242 and a sealing component 246. The arming component 242 holds and retains the sealing component 246 while the trap 200 is armed and waiting for a target animal 100. When a target animal 100
enters the trap, the sealing component 246 is released or forced into a closed position, thereby trapping the target animal 100.
[0035] FIG. 9 is a flowchart of the functional flow of the present invention. The trap 10 is placed in a location desirable to capture the target animal 100. The trap 10 is armed, waiting for the sensor or trigger to detect a target animal. Once detected, the trap 10 is triggered to capture the animal. If the capture fails, the user may remotely reactivate the trap 10 into an armed position. If the capture is a non-target capture, the user may remotely open the sealing component 46 and return the trap 10 to an armed position. If the capture is a target capture, the user may collect and retrieve the target animal 100.
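The functional flow of FIG. 9 can be sketched as a small state machine. The state and event names below are assumptions chosen to mirror the flowchart, not terms from the patent:

```python
# Transition table following FIG. 9: an armed trap fires on detection;
# a failed or non-target capture returns the trap to armed; a target
# capture ends with retrieval. State/event names are illustrative.
TRANSITIONS = {
    ("armed", "detect_target"): "triggered",
    ("triggered", "capture_failed"): "armed",      # remote reactivation
    ("triggered", "non_target_release"): "armed",  # remotely open, then rearm
    ("triggered", "target_captured"): "retrieved",
}

def next_state(state, event):
    # Unknown (state, event) pairs leave the trap state unchanged.
    return TRANSITIONS.get((state, event), state)

print(next_state("armed", "detect_target"))  # triggered
```

Encoding the flow as a table keeps the remote rearm/release paths explicit and easy to extend.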
[0036] FIG. 10 is a flowchart of the typical user-experience functionality of the present invention 10. The user arms the trap 10. The sensors or triggers 44 detect an animal entering the trap. The AI component 68 determines whether the animal is a target animal 100. If the animal is determined to be a target animal 100, the AI component 68 engages the trigger 44 to close the trap, and the user is notified of the capture. If the AI component 68 determines the animal is not a target animal 100, the trap is not engaged, and the animal is free to exit the trap 10.
[0037] FIG. 11 is a flowchart and information flow of a multitude of traps used in a system. The traps 10 in the system provide data to the platform 70 in the form of image and sensor data, trap closures, or other event notifications (which include success or failure responses to user commands and trap activity, such as captures). The platform 70 provides sortable, readable data as a mobile application, allowing users to access information such as photos, capture data, trap status, alerts, and environmental data, and providing notification alerts. The mobile application allows the user to provide direct commands to the traps 10, such as remotely opening or closing the traps 10.
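The trap-to-platform events and user commands described above imply a simple message exchange. The field names and JSON encoding below are assumptions for illustration; the patent does not specify a wire format:

```python
import json

# Hypothetical shapes for an event reported by a trap and a command
# sent back from the mobile application (field names are assumptions).
event = {"trap_id": "trap-7", "type": "capture",
         "species": "feral_hog", "photo": "capture.jpg"}
command = {"trap_id": "trap-7", "action": "open"}

# Serialize for transmission over the connectivity component, then
# decode on the platform side.
payload = json.dumps(event)
decoded = json.loads(payload)
print(decoded["type"])  # capture
```

Keeping events and commands as small structured messages makes the "success or failure responses to user commands" straightforward to correlate by `trap_id`.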
[0038] FIG. 12 is a flowchart of the artificial intelligence logic of the present invention 10. The sensors 44 collect data when an animal enters the trap. The AI model assesses a plurality of features and data points of the animal. The AI model then produces an AI confidence score. The AI confidence score is then used to determine whether the animal is a target animal 100 or a non-target animal. If the AI confidence score meets the established threshold, the trap 10 is triggered to close. If the AI confidence score does not meet the established threshold, the trap 10 remains open.
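The threshold decision of FIG. 12 reduces to a single comparison. The 0.85 default threshold below is an illustrative assumption; the patent leaves the "established threshold" unspecified:

```python
def trap_decision(confidence, threshold=0.85):
    """Apply the FIG. 12 logic: close the trap only when the AI
    confidence score meets the established threshold; otherwise the
    trap remains open. The default threshold is an assumption."""
    return "close" if confidence >= threshold else "remain_open"

print(trap_decision(0.91))  # close
print(trap_decision(0.40))  # remain_open
```

Raising the threshold trades missed captures for fewer non-target closures, so it is a natural tuning knob per deployment.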
[0038] The computer of the present invention 10 has a processor coupled to the sensors and to the camera. The processor receives data from the sensors and images from the camera. The processor also sends and receives information to and from controllers that are coupled to the environment control devices through the communication device. The environment control devices may include, but are not limited to, devices that may allow for control of the trap 10 features, such as the adjustment of the camera, opening or closing of the trap mechanism, and other features.
[0039] The processor is a representation of a component of the device employed to carry out the method of the present invention 10, which is to begin the process for collecting information and controlling the trapping device. The method includes the step of collecting data from the sensors and the camera, which collected data is selectable by the user. The method also includes the step of collecting information from one or more operators about other selectable conditions in the environment. The method further includes the step of training one or more computer programs or AI models carried out by the processor based on the combination of the sensed information and identification and trapping of target animals. That trained functionality is employed to actuate one or more of the control devices. The one or more computer programs further learn from that activity to determine resultant sensed information that is iteratively employed to resolve whether further steps are required, including the modification of prior actuation steps.
[0040] The processor in the form of one or more computing devices combines physical hardware structures with software that may include firmware and middleware for the purpose of executing instructions that produce the actions described herein. It is to be understood that the computing device or devices suitable for performing the functions of the system to instantiate artificial intelligence functionality as desired include, but are not limited to, desktop computers, laptops, tablets, microcontrollers, single and multi-board computers, cloud computing resources, and mobile devices including smartphones, for example. It is to be understood that a computing device described herein may be any type of device having a processor capable of carrying out instructions associated with one or more computer applications.
The devices may contain or be connected to one or more databases of other devices wherein the one or more databases include information related to the invention. For example, the database may include a library of information associated with one or more of the sensors and information about actions performed
by the one or more devices. The one or more databases may be populated and updated with information by authorized users and attached functions.
[0041] The functions of the invention described herein with respect to the operations of the sensors and/or the devices may be described in the general context of computer-executable instructions, such as program modules, being executed by a computing device. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The present invention 10 can be implemented in distributed computing environments where tasks are performed by remote processing devices that are linked through one or more data transmission media including through a communication device. In a distributed computing environment, program function modules and other data may be located in both local and remote device storage media including memory storage devices.
[0042] The processor, interactive drives, memory storage devices, databases and peripherals, such as signal exchange components, of a particular device may be interconnected through one or more electrical buses or cloud services. The one or more buses may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus. The interactive drives include one or more interfaces to couple to an AI-based apparatus, which may be or includes computer processing hardware and programming. The interactive drives are configured to exchange information with the AI apparatus, including delivery of instructions designed to ensure actuation functions are performed.
[0043] Each of the devices 10 of the system of the present invention 10 may include one or more different computer readable media. Computer readable media can be any available media that can be accessed by the processor and includes both volatile and non-volatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may be computer storage media and/or communication media. Computer storage media include volatile and non-volatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data
structures, program modules or other data. Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information, and which can be accessed by the computer system.
[0044] Each of the devices may further include computer storage media in the form of volatile and/or non-volatile memory such as Read Only Memory (ROM) and Random Access Memory (RAM). RAM typically contains data and/or program modules that are accessible to and/or operated on by the processor. That is, RAM may include application programs, such as the functions of the present invention 10, and information in the form of data. The devices may also include other removable/non-removable, volatile/non-volatile computer storage and access media. For example, a device may include a hard disk drive or solid state drive to read from and/or write to non-removable, non-volatile magnetic media, a magnetic disk drive to read from and/or write to a removable, non-volatile magnetic disk, and an optical disk drive to read from and/or write to a removable, non-volatile optical disk, such as a CD-ROM or other optical media. Other removable/non-removable, volatile/non-volatile computer storage media that can be used in the devices to perform the functional steps associated with the system and method of the present invention 10 include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like.
[0045] The drives and their associated computer storage media described above provide storage of computer readable instructions, data structures, program modules and other data for the processor. A user may enter commands and information into the processor through input devices such as keyboards and pointing devices, such as a mouse, a trackball, a touch pad or a touch screen. Other input devices may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are connected to the processor through the system bus, or other bus structures, such as a parallel port or a universal serial bus (USB), but are not limited thereto.
A monitor or other type of display device is also connected to the processor through the system bus or other bus arrangement.
[0046] The processor is configured and arranged to perform the functions and steps described herein embodied in computer instructions stored and accessed in any one or more of the manners described. The functions and steps may be implemented, individually or in combination, as a
computer program product tangibly embodied as computer-readable signals on a computer-readable medium, such as any one or more of the computer-readable media described. Such computer program product may include computer-readable signals tangibly embodied on the computer-readable medium, where such signals define instructions, for example, as part of one or more programs that, as a result of being executed by the processor, instruct the processor to perform one or more of the functions or acts described herein, and/or various examples, variations and combinations thereof. Such instructions may be written in any of a plurality of programming languages, for example, JavaScript, Java, Python, Visual Basic, C, C++, XML, HTML, and the like, or any of a variety of combinations thereof. Furthermore, all such programming may be integrated to eventual delivery of information and computed results via web pages delivered over the internet, intranets, 3G, 4G, 5G or evolving networks to computing devices including those in the mobile environment, for example, smartphones such as the iPhone, tablets such as the iPad, and the like, or any variety of combinations thereof.
[0047] All the data aggregated and stored in the database or databases may be managed under an RDBMS, for example Oracle, MySQL, Access, PostgreSQL, and the like, or any of a variety of combinations thereof. The RDBMS may interface with any web-based or program-driven applications written in any compatible programming languages including PHP, HTML, XML, Java, AJAX, and the like, or any of a variety of combinations thereof. The computer-readable medium on which such instructions are stored may reside on one or more of the components described above and may be distributed across one or more such components.
The method implemented through the system described herein includes the step of establishing desired AI architectures through computer programming corresponding to the sensing and actuation steps described herein.
[0048] The system is programmable and controllable through a control station, which may be a physical station, a dashboard representation on a computing device, a mobile application, a web browser, or any combination of such. At a minimum, the control station includes three primary control operation types: ranges of sensed values and actuation operations, switches, and images. The ranges allow for various settings of the present invention 10 to be set remotely, as well as show the current settings of the trapping device.
[0049] The onboard AI continually receives and processes data from the sensors to provide information and reporting to the user, as well as improve trapping functions. The AI functionality may be located within the trap device 10 itself, providing continued functionality, learning, and data collection regardless of connectivity. FIG. 20 depicts the flow of communication between the users, platform, mobile application, computers, AI software, and traps. The communication may be in various forms such as data files, SMS notifications, user commands, API calls, and other forms of information communication. FIG. 21 is a flowchart of the AI logic. The flowchart depicts the continuous data collection and learning of the AI software of the present device.
[0050] The AI component 68 may be established through a Convolutional Neural Network (CNN) or other processing system that "learns" the target animal characteristics, employs that learned information to activate or not activate the trap 10 and, if that was a wrong decision, learns from it to gain a better understanding of the target and the required action so as not to make the same mistake again. The AI "learning" function will typically proceed through the following steps: (1) collect training data (images, sounds, etc.); (2) label the training data (specific to the present invention: target, empty, other); (3) run the data through a convolutional neural network, generally leveraging a pre-trained CNN and then "transferring that learning" to the model and context given in the training data; (4) generate a model that is specifically optimized for the microcontroller used on the device; (5) the device can then run the model against sensor data to "classify" the data in real time with a confidence score, which is used to determine if a target is present.
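Step (5), running the trained model on-device, can be sketched without any particular framework. Here the trained CNN is stood in for by a plain callable returning class probabilities; the label set follows the target/empty/other labeling described above, while everything else is an assumption for illustration:

```python
def classify(model, sample, labels=("target", "empty", "other")):
    """Run the (stand-in) trained model against one sensor sample and
    return the most probable label with its confidence score."""
    probs = model(sample)
    best = max(range(len(labels)), key=lambda i: probs[i])
    return labels[best], probs[best]

# Hypothetical model output: 92% confident the frame shows a target animal.
fake_model = lambda sample: [0.92, 0.05, 0.03]
label, confidence = classify(fake_model, sample=None)
print(label, confidence)  # target 0.92
```

In a real deployment the callable would wrap an inference runtime optimized for the device's microcontroller, but the classify-then-threshold shape stays the same.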
[0051] The system generates custom models, optimized for each target based on our data collection approach. The data collection approach is to utilize targets in the trap 10 and take large amounts of pictures in varying conditions, with various objects in the trap, with and without objects located in the trap, with various lighting conditions, as well as random objects and potential bycatch. This provides a custom model to identify target animals. The computer processing and AI framework is constantly improving the model by sending collected images to our AI training pipeline for continuous improvement and better classification.
[0052] The present invention 10 has been described with reference to specific examples and configurations. It is intended to be limited only by the description set out in the claims and equivalents thereof.
Claims
1. An animal trapping device, the device comprising: a containment component; a trapping mechanism; a technology component; and a power source; wherein the technology component has one or more sensors, a computer, and an artificial intelligence component; wherein the one or more sensors collect data which is stored by the computer; wherein the collected data is analyzed by the artificial intelligence component; and wherein the artificial intelligence component may trigger the computer to actuate the trapping mechanism.
2. The device of Claim 1 wherein the containment component is formed of a metal cage.
3. The device of Claim 1 wherein the trapping mechanism is a gravity-controlled door.
4. The device of Claim 1 wherein the trapping mechanism is the containment area lowering onto a target animal in response to a trigger.
5. The device of Claim 1 wherein the power source is solar power.
6. The device of Claim 1 wherein the power source includes a battery.
7. The device of Claim 1 wherein the one or more sensors include an infrared sensor, a motion sensor, or a pressure mat.
8. The device of Claim 1 wherein the technology component has a camera or microphone.
9. The device of Claim 1 wherein the computer is configured to collect data from the one or more sensors.
10. The device of Claim 1 wherein the artificial intelligence component analyzes data collected from the one or more sensors.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202363583860P | 2023-09-19 | 2023-09-19 | |
| US63/583,860 | 2023-09-19 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2025064675A1 true WO2025064675A1 (en) | 2025-03-27 |
Family
ID=95072136
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2024/047486 Pending WO2025064675A1 (en) | 2023-09-19 | 2024-09-19 | Smart trap and monitoring device |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025064675A1 (en) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110308137A1 (en) * | 2010-06-16 | 2011-12-22 | Ugiansky Bobby D | Wall-Less Trap Systems and Methods |
| US20190166823A1 (en) * | 2017-12-04 | 2019-06-06 | Caldera Services LLC | Selective Action Animal Trap |
| WO2020037377A1 (en) * | 2018-08-24 | 2020-02-27 | OutofBox Solutions Tech Pty Ltd | A detection system |
| US20200267515A1 (en) * | 2017-11-07 | 2020-08-20 | Pica Product Development, Llc | Systems, Methods and Devices for Remote Trap Monitoring |
| US20220142144A1 (en) * | 2020-11-10 | 2022-05-12 | Graham Patterson | System and Method for Capturing a Target Animal |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 24869159; Country of ref document: EP; Kind code of ref document: A1 |