CN115648237B - Intelligent companion robot - Google Patents


Info

Publication number
CN115648237B
CN115648237B · Application CN202211223619.XA
Authority
CN
China
Prior art keywords
pet
intelligent
robot
camera
companion
Prior art date
Legal status
Active
Application number
CN202211223619.XA
Other languages
Chinese (zh)
Other versions
CN115648237A (en)
Inventor
骆润卿
张双彪
陈晨
佘怡欣
游凯卉
Current Assignee
Beijing Information Science and Technology University
Original Assignee
Beijing Information Science and Technology University
Priority date
Filing date
Publication date
Application filed by Beijing Information Science and Technology University
Priority to CN202211223619.XA
Publication of CN115648237A
Application granted
Publication of CN115648237B
Legal status: Active

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an intelligent pet-companion robot comprising a chassis and a machine body. A moving device is mounted on the chassis, and a rotatable camera, a laser radar and a two-way intercom speaker are arranged outside the machine body; the camera collects image data and the laser radar collects radar data. A processor and a wireless communication module are arranged inside the machine body. The wireless communication module sends the image data and/or audio data to a client of the intelligent pet-companion robot, and the two-way intercom speaker enables voice intercom between the user and the pet. The processor generates a two-dimensional map from the radar data, determines the current position of the robot using a Monte Carlo localization algorithm, performs global path planning on the two-dimensional map using the A-star algorithm to generate a moving path, and controls the moving device to move the robot along that path to the vicinity of the pet. The intelligent pet-companion robot provided by the invention can thus better monitor a pet when the owner is not at home.

Description

Intelligent companion robot
Technical Field
The embodiments of the invention relate to the technical field of robots, and in particular to an intelligent pet-companion robot.
Background
Keeping a pet not only relieves feelings of loneliness and gives people a sense of companionship, but also brings the achievement and happiness of raising the pet. As residents' incomes rise and the pace of life quickens, more and more people keep pets. A pet accompanies its owner, but the pet also needs the owner's care. Office workers, and pet owners who often take short trips, frequently cannot care for their pets well. Accordingly, pet-companion robots that help absent owners look after their pets are increasingly entering the market.
The pet-companion robot disclosed in Chinese patent application CN112775979A, entitled "Control method of pet companion robot, pet companion robot and chip", uses an infrared temperature detection sensor to scan a circular region centered on the robot, with the sensor's scanning distance as the radius. This circular region is a preset area. The preset area is scanned in real time and a scanning result is obtained; when the result satisfies the preset conditions that a movable object exists and the temperature of that object lies within a preset interval, a pet is judged to have been detected in the preset area, and the pet is then fed. However, because the area scanned by the infrared temperature detection sensor is limited, intelligent monitoring of the pet is difficult to achieve if the pet is outside the preset area.
Disclosure of Invention
The embodiments of the invention provide an intelligent pet-companion robot for better monitoring a pet when the owner is not at home.
The embodiments of the invention provide an intelligent pet-companion robot comprising a chassis and a machine body, wherein the chassis carries the machine body;
a moving device is mounted on the chassis to realize movement of the intelligent pet-companion robot;
a rotatable camera, a laser radar and a two-way intercom speaker are arranged outside the machine body; the camera is used for collecting image data, the laser radar is used for collecting radar data, and the two-way intercom speaker comprises a sound pickup and a loudspeaker;
a processor and a wireless communication module are arranged inside the machine body; the camera, the laser radar, the two-way intercom speaker and the wireless communication module are all connected to the processor;
the wireless communication module is used for sending the image data collected by the camera and/or the audio data collected by the sound pickup to a client of the intelligent pet-companion robot;
the two-way intercom speaker is used for realizing voice intercom between the user and the pet;
the processor is configured to:
generate a two-dimensional map from the received radar data using the Gmapping algorithm, and determine the current position of the intelligent pet-companion robot using a Monte Carlo localization algorithm;
determine the current position of the pet when the pet is identified within the camera's field of view from the image data received from the sensor;
perform global path planning on the two-dimensional map using the A-star algorithm, taking the current position of the robot as the start and the current position of the pet as the target end point, so as to generate a moving path;
and control the moving device to move the intelligent pet-companion robot along the moving path to the vicinity of the pet.
In one embodiment, the processor is further configured to:
when the pet is determined not to be within the camera's field of view from the image data received from the sensor, control the sound pickup to collect the pet's audio data in response to a received pet-seeking instruction;
and determine the direction of arrival and the distance from the audio data using the generalized cross-correlation with phase transform (GCC-PHAT) algorithm, and determine the current position of the pet from the determined direction of arrival and distance.
In one embodiment, the processor is further configured to:
the microphone is controlled to play a prerecorded sound signal of the pet owner or a sound signal of the pet owner in a real-time voice call so as to guide the pet to make a sound or approach the intelligent pet companion robot.
In one embodiment, the processor is further configured to:
when the pet is determined not to be within the camera's field of view from the image data received from the sensor, traverse all areas of the two-dimensional map except the obstacles using a complete coverage path planning algorithm;
during the traversal, perform real-time identification on the image data collected by the camera;
if the pet is identified, stop the traversal, determine the pet's current position, and mark the pet's current position and the current time on the two-dimensional map.
In one embodiment, the processor is further configured to:
generate a probability distribution map of the pet's positions in each time period from the positions marked during multiple traversals and the corresponding times;
when a pet-seeking instruction is received, determine, in order according to the probability distribution map, each position in the time period corresponding to the current time as the target end point, and navigate to it.
In one embodiment, a display device is mounted on the exterior of the machine body for displaying a prerecorded video of the pet owner, or the owner's video in a real-time video call.
In one embodiment, a horizontally rotating gimbal is provided on top of the machine body, and the camera is mounted on the gimbal.
In one embodiment, the moving device comprises 4 steering wheels and 4 driving wheels, the bottoms of which are arranged at equal heights, realizing omnidirectional movement.
In one embodiment, the machine body is provided with a food storage device for storing pet food, and the top of the machine body carries a self-locking food-storage button for controlling the ejection and retraction of the storage device.
In one embodiment, a food bowl is arranged at the bottom of the chassis; the bowl is extendably connected to the chassis and communicates with the food storage device, and in response to a food-dispensing instruction, a predetermined amount of pet food is dispensed from the storage device into the bowl, and the bowl extends out.
In summary, the intelligent pet-companion robot provided by the embodiments of the invention comprises a chassis carrying a machine body. A moving device is mounted on the chassis to move the robot. A rotatable camera, a laser radar and a two-way intercom speaker are arranged outside the machine body; the camera collects image data, the laser radar collects radar data, and the intercom speaker comprises a sound pickup and a loudspeaker. A processor and a wireless communication module are arranged inside the machine body, and the camera, laser radar, intercom speaker and wireless communication module are all connected to the processor. The wireless communication module sends the image data collected by the camera and/or the audio data collected by the sound pickup to the robot's client, and the intercom speaker enables voice intercom between the user and the pet. The processor generates a two-dimensional map from the received radar data, determines the robot's current position with a Monte Carlo localization algorithm, determines the pet's current position when the pet is identified within the camera's field of view from the image data received from the sensor, performs global path planning on the two-dimensional map with the A-star algorithm, taking the robot's position as the start and the pet's position as the target end point, to generate a moving path, and controls the moving device to move the robot along that path to the vicinity of the pet.
By realizing voice intercom between the user and the pet through the two-way intercom speaker, the robot provides better emotional companionship for the pet; by building a map with the laser radar, locating the pet, and performing global path planning with the A-star algorithm, the robot can move accurately to the pet's vicinity and provide better companion service. In summary, the intelligent pet-companion robot provided by the embodiments of the invention can better monitor a pet when the owner is not at home.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
Fig. 1 is a schematic structural diagram of an intelligent pet-companion robot according to an embodiment of the present invention.
Specific embodiments of the present invention have been shown by way of the above drawings and will be described in more detail below. The drawings and the written description are not intended to limit the scope of the inventive concepts in any way, but rather to illustrate the inventive concepts to those skilled in the art by reference to the specific embodiments.
Detailed Description
The application will be described in further detail below with reference to the drawings by means of specific embodiments, wherein like elements in different embodiments are given like reference numbers. In the following embodiments, numerous specific details are set forth to provide a better understanding of the application. However, one skilled in the art will readily recognize that some of these features may be omitted, or replaced by other elements, materials or methods, in different situations. In some instances, operations related to the application are not shown or described in the specification, to avoid obscuring its core portions; moreover, a detailed description of such operations may be unnecessary for persons skilled in the art, who can fully understand them from the description together with their general knowledge.
Furthermore, the features, operations or characteristics described in the specification may be combined in any suitable manner in various embodiments. Likewise, the steps or acts in the described methods may be reordered or modified in a manner apparent to those skilled in the art. Thus, the orders in the description and drawings serve only to describe particular embodiments clearly and do not imply a required order, unless otherwise indicated.
The numbering of the components itself, e.g. "first", "second", etc., is used herein merely to distinguish between the described objects and does not have any sequential or technical meaning. The term "coupled" as used herein includes both direct and indirect coupling (coupling), unless otherwise indicated.
Example 1
Fig. 1 is a schematic structural diagram of an intelligent pet-companion robot according to an embodiment of the present invention. As shown in fig. 1, the intelligent pet-companion robot provided in this embodiment may include a chassis 1 and a body 2. Wherein the chassis 1 is used for carrying the machine body 2.
A moving device 101 is mounted on the chassis to realize movement of the intelligent pet-companion robot. In an alternative embodiment, the moving device may comprise 4 steering wheels and 4 driving wheels whose bottoms are arranged at equal heights, realizing omnidirectional movement. Each steering wheel has a steering degree of freedom and a driving degree of freedom, of which the steering degree of freedom is actively controlled while the driving degree of freedom follows passively; each driving wheel likewise has both degrees of freedom, of which the driving degree of freedom is actively controlled while the steering degree of freedom follows passively. The driving wheels and steering wheels may be arranged symmetrically about the central axis of the chassis. Such a moving device lets the robot perform straight, sideways and diagonal movement, turning in place and other motions, and is simple to control.
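The wheel layout described above can be driven by a standard inverse-kinematics rule for omnidirectional chassis: the velocity at each wheel's contact point is the body velocity plus the rotational contribution at that wheel's offset from the center. A minimal sketch of that rule (the function name and wheel coordinates are illustrative, not taken from the patent):

```python
import math

def swerve_wheel_commands(vx, vy, omega, wheel_positions):
    """For wheels at (x, y) offsets from the chassis center, compute each
    wheel's steering angle (rad, relative to the chassis x-axis) and ground
    speed for a desired body velocity (vx, vy) and yaw rate omega."""
    commands = []
    for (x, y) in wheel_positions:
        # Rigid-body velocity at the wheel contact point.
        wx = vx - omega * y
        wy = vy + omega * x
        commands.append((math.atan2(wy, wx), math.hypot(wx, wy)))
    return commands
```

For pure translation every wheel points the same way at the same speed; for turning in place every wheel turns tangent to a circle around the chassis center, which is what makes in-situ steering possible with this layout.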
A rotatable camera 201, a laser radar 202 and a two-way intercom speaker 203 are arranged outside the machine body. The camera collects image data, the laser radar collects radar data, and the intercom speaker comprises a sound pickup and a loudspeaker. In an alternative embodiment, a horizontally rotating gimbal can be arranged on top of the machine body, with the camera mounted on the gimbal, so that the camera can turn through 360 degrees.
A processor 204 and a wireless communication module 205 are arranged inside the machine body; the camera, laser radar, intercom speaker and wireless communication module are all connected to the processor. The wireless communication module sends the image data collected by the camera and/or the audio data collected by the sound pickup to the client of the intelligent pet-companion robot, so that the user can watch the pet's image or hear its sound in real time through the client and monitor the pet better. Through the wireless communication module, the user can also make a voice call from the client; the two-way intercom speaker then realizes voice intercom between user and pet, so that the user hears the pet in real time and the pet hears its owner, providing better mental companionship for the pet.
The processor is used for generating a two-dimensional map from the received radar data using the Gmapping algorithm, determining the current position of the intelligent pet-companion robot using a Monte Carlo localization algorithm, determining the current position of the pet when the pet is identified within the camera's field of view from the image data received from the sensor, performing global path planning on the two-dimensional map with the A-star algorithm, taking the robot's current position as the start and the pet's current position as the target end point, to generate a moving path, and controlling the moving device to move the robot along that path to the vicinity of the pet.
A pet usually moves around the various indoor spaces and has a fairly large range of activity. To locate the pet accurately, this embodiment first constructs an indoor two-dimensional map of the pet's activity space from the radar data collected by the laser radar, using the Gmapping algorithm to obtain a precise indoor map. The current position of the robot is then determined with a Monte Carlo localization algorithm. Global path planning is performed on the two-dimensional map to avoid obstacles precisely and find a globally optimal moving path, and the robot is moved along the determined path to the vicinity of the pet for close monitoring, voice interaction and the like. It should be noted that the Gmapping, Monte Carlo and A-star algorithms in this embodiment are all prior-art algorithms and are not described again here.
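The global planning step can be illustrated with a textbook A-star search over an occupancy grid of the kind Gmapping produces. This is a generic sketch of the prior-art algorithm the patent refers to, not the patent's own implementation:

```python
import heapq
import itertools

def a_star(grid, start, goal):
    """Shortest path on a 2-D occupancy grid (0 = free, 1 = obstacle),
    4-connected moves, Manhattan-distance heuristic. Cells are (row, col)."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    tie = itertools.count()            # tie-breaker so the heap never compares nodes
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, best_g = {}, {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:           # already expanded with an equal-or-better cost
            continue
        came_from[cur] = parent
        if cur == goal:                # walk parents back to the start
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), next(tie), ng, (nr, nc), cur))
    return None                        # no path exists
```

Because the Manhattan heuristic never overestimates the remaining cost on a 4-connected grid, the first time the goal is popped the path is globally optimal, which is the "globally optimal moving path" property relied on above.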
The intelligent pet-companion robot provided by this embodiment realizes real-time monitoring of the pet by sending the image data collected by the camera and/or the audio data collected by the sound pickup to the client through the wireless communication module; provides better emotional companionship by realizing voice intercom between the user and the pet through the two-way intercom speaker; and, by building a map with the laser radar, locating the pet, and performing global path planning with the A-star algorithm, can move accurately to the pet's vicinity and provide better companion service. In summary, the intelligent pet-companion robot provided by this embodiment can better monitor a pet when the owner is not at home.
Example 2
The processor can judge, by image recognition on the image data collected by the camera, whether the pet is within the camera's field of view. When the pet has remained outside the camera's field of view for a preset duration, for example 30 minutes, a pet-seeking instruction can be generated. Alternatively, the user can send a pet-seeking instruction to the robot through the client of the intelligent pet-companion robot.
When the pet is not within the camera's field of view, the intelligent pet-companion robot provided by this embodiment can, building on the first embodiment, also locate the pet through audio data, for rapid and accurate positioning. Specifically, the processor is further used for controlling the sound pickup to collect the pet's audio data, in response to a received pet-seeking instruction, when the pet is determined not to be within the camera's field of view from the image data received from the sensor; determining the direction of arrival (DOA) and the distance from the audio data using the generalized cross-correlation with phase transform (GCC-PHAT) algorithm; and determining the pet's current position from the determined direction of arrival and distance. The pet's sound is collected by the sound pickup, and its current position is determined from the direction and distance of the sound.
When the pet makes no sound, for example while asleep, its sound cannot be obtained through the sound pickup. In that case the processor can also control the loudspeaker to play a prerecorded sound signal of the pet owner, or the owner's voice in a real-time voice call, to guide the pet to make a sound or approach the robot, so that audio data can be collected and the pet finally located.
Building on the first embodiment, the intelligent pet-companion robot provided by this embodiment also collects the pet's audio data through the sound pickup, determines the direction of arrival and the distance from that data, and finally determines the pet's current position, so that the pet can be located rapidly and accurately even when it is outside the camera's field of view.
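The core of GCC-PHAT is to whiten the cross-power spectrum of two signals so that only phase (i.e. delay) information remains, then locate the peak of its inverse transform. The following self-contained sketch (naive DFT, a single signal pair, sample-level delay) illustrates that idea only; a real robot would use an FFT over frames from a microphone array and convert the sample delay to an angle using the array geometry and sampling rate:

```python
import cmath

def _dft(x, inverse=False):
    """Naive O(N^2) discrete Fourier transform, enough for a short frame."""
    n = len(x)
    sign = 1 if inverse else -1
    out = [sum(x[k] * cmath.exp(sign * 2j * cmath.pi * i * k / n) for k in range(n))
           for i in range(n)]
    return [v / n for v in out] if inverse else out

def gcc_phat_delay(sig, ref):
    """Estimate the delay (in samples) of `sig` relative to `ref` with
    GCC-PHAT: normalize the cross-power spectrum to unit magnitude (phase
    transform), inverse-transform, and pick the peak of the correlation."""
    n = len(sig)
    S, R = _dft(sig), _dft(ref)
    cross = [s * r.conjugate() for s, r in zip(S, R)]
    phat = [c / abs(c) if abs(c) > 1e-12 else 0j for c in cross]
    corr = [v.real for v in _dft(phat, inverse=True)]
    peak = max(range(n), key=lambda i: corr[i])
    return peak if peak <= n // 2 else peak - n   # fold wrap-around into negative lags
```

The phase transform makes the correlation peak sharp regardless of the signal's spectrum, which is why GCC-PHAT is a common choice for time-difference-of-arrival estimation in reverberant rooms.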
Example 3
When the pet is not within the camera's field of view, besides locating it by sound, or after sound-source localization fails, the pet can also be located by traversal based on the previously obtained indoor two-dimensional map. The processor in the intelligent pet-companion robot is further used for traversing all areas of the two-dimensional map except the obstacles using a complete coverage path planning (CCPP) algorithm when the pet is determined not to be within the camera's field of view from the image data received from the sensor, performing real-time identification on the image data collected by the camera during the traversal, and, if the pet is identified, stopping the traversal, determining the pet's current position, and marking the pet's current position and the current time on the two-dimensional map.
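One of the simplest complete-coverage strategies is a boustrophedon ("lawn-mower") sweep over the free cells of the grid map. The sketch below only orders the free cells; a practical CCPP implementation would additionally connect disjoint free segments with a point-to-point planner such as A-star. Names are illustrative:

```python
def coverage_path(grid):
    """Boustrophedon sweep over an occupancy grid (0 = free, 1 = obstacle):
    visit every free cell row by row, alternating the sweep direction on
    each row so consecutive rows join at the same end. Returns visit order."""
    path = []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else range(len(row) - 1, -1, -1)
        for c in cols:
            if row[c] == 0:           # skip obstacle cells
                path.append((r, c))
    return path
```

Because every free cell appears exactly once in the order, running the camera's recognition at each visited cell guarantees the whole mapped free space is inspected if the pet is not found earlier.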
It will be appreciated that a pet's times and places of activity have a certain regularity. After the pet has been located by multiple traversals, the processor can further generate a probability distribution map of the pet's positions in each time period from the positions marked during those traversals and the corresponding times; when a pet-seeking instruction is received, each position in the time period corresponding to the current time is determined in turn as the target end point according to the probability distribution map, and navigation proceeds. For example, the 24 hours of a day may be divided into 12 time periods. The probability distribution map records, for each period, the positions at which the pet appeared in past traversals and the number of times each position appeared; positions that appear more often have a higher probability of containing the pet. For example, if, when a pet-seeking instruction is received, the probability distribution map gives the pet's possible positions in the period corresponding to the current time as A (10 times), B (68 times), C (50 times) and D (34 times), position B is determined as the target end point and navigated to first. If the pet is identified at B, the search ends; if not, position C is determined as the target end point, navigation continues, and so on.
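The bookkeeping behind this probability distribution map can be as simple as a per-period counter of marked positions, with candidate target end points ordered by how often the pet was seen there. A minimal sketch (the class and method names are illustrative), reproducing the A/B/C/D example above:

```python
from collections import Counter, defaultdict

class PetLocationModel:
    """Accumulates positions marked during traversals into per-period counts
    and orders search targets by how often the pet was observed at each."""

    def __init__(self, hours_per_period=2):
        self.hours_per_period = hours_per_period
        self.counts = defaultdict(Counter)   # period index -> Counter of positions

    def mark(self, position, hour):
        """Record one sighting of the pet at `position` during hour `hour`."""
        self.counts[hour // self.hours_per_period][position] += 1

    def search_order(self, hour):
        """Candidate target end points for the current time, most likely first."""
        period = hour // self.hours_per_period
        return [pos for pos, _ in self.counts[period].most_common()]
```

Navigation then simply walks `search_order(...)` front to back, stopping as soon as the pet is identified at a candidate position.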
With the intelligent pet-companion robot of this embodiment, the pet's position can be determined accurately by traversal of the two-dimensional map when the pet is outside the camera's field of view. Building a time-position probability map of the pet from multiple traversals and preferring high-probability positions as target end points speeds up the localization process and saves time.
Example 4
On the basis of any of the above embodiments, the intelligent pet-companion robot provided by this embodiment can realize not only real-time monitoring of the pet but also intelligent feeding. The machine body is provided with a food storage device for storing pet food, and the top of the body carries a self-locking food-storage button that controls the ejection and retraction of the storage device. A food bowl is arranged at the bottom of the chassis; it is extendably connected to the chassis and can communicate with the food storage device, for example through a pipe with a valve, through which food is dispensed at feeding time. In response to a food-dispensing instruction, a predetermined amount of pet food is dispensed from the storage device into the bowl, and the bowl extends out. The storage device in this embodiment can hold far more pet food than the bowl, so one filling supports many feedings. The food-dispensing instruction can be generated at a preset frequency, for example every 6 hours, or sent to the robot by the user through the client.
Example 5
Existing pet-companion robots focus more on how to feed intelligently. For example, in Chinese patent application CN112775979A, "Control method of pet companion robot, pet companion robot and chip", an infrared temperature detection sensor scans a circular area centered on the robot, with the sensor's scanning distance as the radius; this circular area is the preset area. The robot scans the preset area with the infrared sensor and obtains temperature data for each object there, which serve as the scanning result. The preset area is scanned in real time; when the scanning result satisfies the preset conditions that a movable object exists and its temperature lies within a preset interval, a pet is judged to have been detected in the preset area, and the image pickup device is started. The camera shoots the preset area, the captured data are image-processed, and the pet's type is determined from the processing result. The feed type and feed amount are then determined from the pet type, and feed is finally dispensed accordingly. Accurate feeding is thus realized mainly by determining the pet's type. Moreover, the preset area scanned by the infrared temperature detection sensor is limited, and if the pet is outside it, even the intelligent feeding effect is difficult to achieve.
In Chinese patent application CN113080094A, "Method, device and computer equipment for accompanying pet", a target amount of exercise is determined for the pet and a corresponding exercise scheme is formed from it; the pet-companion robot interacts with the pet according to the scheme, a wearable device records the pet's actual amount of exercise, a feeding scheme is formed from the actual amount, and a feeding device dispenses food accordingly. To form the exercise scheme, the robot determines an activity scene graph for the pet and plans an activity path and activity mode matched to the pet's target amount of exercise, or the user sets the robot's activity path in the scene. The robot attracts the pet in a set manner and moves along the path according to the pet's interaction, until the pet's amount of exercise meets the target. The robot carries a camera and sensors, can recognize the pet and the surrounding environment, and can sense the pet's interactions, such as touching or approaching. With its binocular camera the robot can avoid obstacles, so that it operates normally and is not blocked while interacting with the pet. Here the pet is fed scientifically, mainly according to its actual amount of exercise.
A pet needs not only human care but also human companionship. Besides feeding, pets have emotional needs, which existing pet-companion robots evidently fail to recognize, remaining at the stage of feeding machines.
The intelligent pet-companion robot provided by this embodiment is further provided with a display device outside the machine body, for displaying a prerecorded video of the pet owner or the owner's video in a real-time video call, while the loudspeaker plays a prerecorded sound signal of the owner or the owner's voice in a real-time voice call. For example, a prerecorded video of the owner may be played on the display device at a preset time (for instance, around when the owner usually leaves work), and/or the loudspeaker may play the owner's prerecorded voice, so that the pet can feel the owner's companionship through sound and/or video even when the owner is not at home. Through the wireless communication module, the user can hold a real-time voice or video call with the robot from the client; playing the owner's voice through the loudspeaker and/or the owner's video on the display device realizes better remote interaction between owner and pet and provides better emotional companionship.
Example six
The application is further illustrated by a specific example. When no pet is within the range of the camera, the intelligent pet-companion robot determines the pet's indoor position using sensors such as the laser radar, autonomously avoids obstacles and navigates to the vicinity of the pet, so that the pet can be better monitored when the owner is not at home. The robot provided by this embodiment comprises a body, a rotatable camera and a chassis. A laser radar, a grain storage port and a grain storage port self-locking button are arranged outside the body; a sensor module, a controller and a core board are arranged inside the body; and a two-way communication horn is arranged at the lower part of the body. The chassis has 4 steering wheels, the controller has a wireless communication element, and a food basin is arranged at the bottom of the chassis. The sensor module receives the data acquired by the sensing devices and forwards it to the core board when required. The core board receives information sent by the client, processes data, executes algorithms, stores information, and outputs instructions to the controller. The controller, through its wireless communication element, receives instructions from the core board and drives the components of the robot to perform the corresponding operations. A horizontally rotating gimbal is arranged on top of the body; the camera is mounted on the gimbal and can turn through 360°. The chassis has 4 driving wheels whose bottoms are arranged at equal heights.
The two-way communication horn, comprising a sound pickup and a loudspeaker, is arranged at the front end of the body and connected to the wireless communication element, so that the user can interact with the pet remotely through the client. The laser radar is arranged outside the body. The core board runs a Robot Operating System (ROS). The laser radar transmits the acquired data to the sensor module; from the data received via the sensor module, the core board builds a two-dimensional map with the Gmapping algorithm, determines the robot's position with a Monte Carlo localization algorithm, and performs global path planning to a known target destination with the A-star algorithm.
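The Monte Carlo localization step mentioned above can be illustrated with a minimal particle-filter sketch. This is a deliberately simplified 1-D example, not the robot's actual implementation: the looped-corridor length, door positions, noise levels and sensor model below are all invented for illustration.

```python
import random

RING = 100.0                     # hypothetical 100 m looped corridor
DOORS = [10.0, 40.0, 70.0]       # assumed landmark (door) positions

def sees_door(x):
    """Noise-free sensor model: 1 if a door is within 2 m of position x."""
    return int(any(min(abs(x - d), RING - abs(x - d)) < 2.0 for d in DOORS))

def mcl_step(particles, motion, z):
    """One predict / weight / resample cycle of Monte Carlo localization."""
    # Predict: apply the odometry reading plus Gaussian motion noise
    moved = [(p + motion + random.gauss(0.0, 0.5)) % RING for p in particles]
    # Weight: favor particles whose predicted sensor reading matches z
    weights = [0.9 if sees_door(p) == z else 0.1 for p in moved]
    # Resample: draw a new particle set proportional to the weights
    return random.choices(moved, weights=weights, k=len(moved))
```

After repeated cycles the particles concentrate around positions consistent with the measurements; on the robot the same predict/weight/resample cycle runs in 2-D over the Gmapping occupancy grid with a lidar likelihood model.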
When a pet is within the range of the camera, the core board obtains the camera data through the sensor module and applies image recognition to determine the pet's position. If the core board then receives an instruction from the client, such as food dispensing or voice interaction, it sets that position as the target destination and applies the A-star algorithm to autonomously avoid obstacles and navigate to the vicinity of the pet, so that the robot can feed the pet, hold a voice interaction, or view the pet at close range with the camera.
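The A-star global planning described above can be sketched as a textbook A* search over a 4-connected occupancy grid. This is a generic illustration, not the exact ROS planner; the grid encoding and unit step cost are assumptions of the example.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free, 1 = obstacle).
    Returns the cell path from start to goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    heur = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan distance
    frontier = [(heur(start), 0, start, [start])]   # (f, g, cell, path)
    best_g = {}
    while frontier:
        _, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if best_g.get(cell, float("inf")) <= g:
            continue                                 # stale queue entry, skip
        best_g[cell] = g
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(frontier, (g + 1 + heur(nxt), g + 1, nxt, path + [nxt]))
    return None
```

In use, the start cell is the robot's Monte Carlo position estimate and the goal cell is the recognized pet position, both projected onto the Gmapping map.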
When the pet is not within the range of the camera and the core board receives a pet-seeking instruction from the client, the robot performs sound source localization on the pet sounds obtained by the sound pickup. If the sound pickup does not capture any pet sound, the core board sends an instruction to the controller, and the controller controls the loudspeaker to play a recording of the owner or the owner's real-time speech to induce the pet to make a sound or approach the robot. If the pet is outside the camera's range but does make a sound, the sound pickup captures it and sends it to the core board through the sensor module; the core board applies the Generalized Cross-Correlation with Phase Transform (GCC-PHAT) algorithm to estimate the direction of arrival (DOA) and the distance, thereby determining the target destination, and then applies the A-star algorithm to autonomously avoid obstacles and navigate to the vicinity of the pet, after which feeding, voice interaction, or close-range camera viewing can be performed.
If sound source localization fails, the core board traverses all areas of the map except obstacles using a full-coverage path planning algorithm, continuously processing the camera data during the traversal. If the pet is identified (i.e., it appears within the camera's range during the traversal), the core board stops the traversal and marks the pet's position and the current time on the map. After many traversal searches, the core board forms a probability distribution map of the pet's position at a given time. Later, when a pet-seeking instruction is received, the core board first takes the pet's most probable positions in turn as endpoints according to the probability distribution map and applies the A-star algorithm to autonomously avoid obstacles and navigate to each of them. If no pet is identified within the camera's range at any of these positions, the robot falls back to searching for the pet by traversal.
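The sighting log and probability ranking described above can be sketched as follows. The hour-of-day binning, the class name and the cell labels are illustrative choices for the example, not details specified by the embodiment.

```python
from collections import defaultdict
from datetime import datetime

class PetPriorMap:
    """Accumulates pet sightings into per-time-bin location counts and
    ranks candidate search endpoints for the current time of day."""

    def __init__(self):
        # time bin -> map cell -> number of recorded sightings
        self.counts = defaultdict(lambda: defaultdict(int))

    @staticmethod
    def time_bin(t: datetime) -> int:
        return t.hour            # one-hour bins; any bin width would do

    def record_sighting(self, t: datetime, cell) -> None:
        self.counts[self.time_bin(t)][cell] += 1

    def candidate_endpoints(self, t: datetime):
        """Cells for the current time bin, most frequently seen first."""
        bucket = self.counts[self.time_bin(t)]
        return sorted(bucket, key=bucket.get, reverse=True)
```

When a pet-seeking instruction arrives, the candidates would be tried in order as A-star goals; an empty list means no prior for this time bin, so the robot falls back to the full-coverage traversal.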
A grain storage port is arranged at the rear of the body, with a grain storage port self-locking button on top: pressing the button locks the grain storage port open, and pressing it again retracts it. A food basin is arranged on the chassis, connected to the body and fitted with a self-locking switch; it can be extended and retracted by remote signal: the core board receives the signal via the sensor module and sends an instruction to the controller, and the controller moves the food basin in or out.
In a specific implementation, the user presses the grain storage port self-locking button so that the grain storage port pops out, fills it with an appropriate amount of pet food, and presses the button again to retract it. When the user opens the corresponding software on a mobile client such as a phone or tablet, at work or anywhere with signal coverage, the wireless communication element begins to send and receive signals, and the user can remotely set the robot to dispense food at fixed times and in fixed amounts, or tap a feeding button to dispense food immediately. During dispensing, the core board locks or retracts the food basin according to the client's instructions. On first use, the user needs to carry the robot around the home manually so that the core board can build an indoor map from the data acquired by the laser radar.
The user can check on the pet through the mobile client. When no pet is within the camera's range, the user can send a pet-seeking instruction to the core board. The core board first uses the pet sound obtained by the sound pickup, applies the GCC-PHAT algorithm for DOA and distance estimation to determine the target destination, and applies the A-star algorithm to autonomously avoid obstacles and navigate to the vicinity of the pet. If the sound pickup captures no pet sound, the core board controls the loudspeaker of the two-way communication horn to play the owner's recording or real-time speech to induce the pet to make a sound. If sound source localization fails, the core board traverses all areas of the map except obstacles with the full-coverage path planning algorithm, continuously processing the camera data during the traversal; if the pet is identified within the camera's range, its position and the current time are marked on the map. After many traversal searches, the core board forms a probability distribution map of the pet's position at a given time; later, on receiving a pet-seeking instruction, it first takes the pet's most probable positions in turn as endpoints according to the probability distribution map and applies the A-star algorithm to navigate to them, falling back to the traversal search if no pet is identified within the camera's range. The user can also talk with the pet through the two-way communication horn, or record messages to be played at scheduled times, for example to call the pet to eat.
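The full-coverage traversal used as the final fallback can be sketched as a simple boustrophedon (serpentine) sweep over the free cells of the grid map. Real coverage planners additionally decompose the map into obstacle-free regions, so this is only a minimal illustration under that simplifying assumption.

```python
def coverage_path(grid):
    """Serpentine sweep over a grid map: 0 = free cell, 1 = obstacle.
    Visits every free cell once, alternating sweep direction per row."""
    path = []
    for r, row in enumerate(grid):
        cols = range(len(row)) if r % 2 == 0 else reversed(range(len(row)))
        path.extend((r, c) for c in cols if row[c] == 0)
    return path
```

During the sweep the camera frames are processed continuously, and the traversal stops as soon as the pet is recognized.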
The robot provided by this embodiment has, on top of the body, a camera that can turn through 360° to acquire video of the pet, and the core board transmits the video data through the wireless communication element to clients such as a mobile phone APP, so that the owner can watch the pet in real time even when away from home. The two-way communication horn is connected to the wireless communication element, enabling intercom between the user and the pet; the user can also record messages on the client and set their playing times, for example to call the pet to eat. A laser radar is also provided: after receiving the surrounding environment information acquired by the laser radar, the core board can complete map building. When the user does not find the pet while viewing the video on the client, the user can send a pet-seeking instruction through the client; on receiving it, the core board determines the target destination (i.e., the pet's position) by sound source localization, by preferentially searching the positions where the pet is most likely to appear as learned by the robot, and by traversal search, and then applies the A-star algorithm to autonomously avoid obstacles and navigate to the vicinity of the pet. A grain storage port and its self-locking button are provided: pressing the button locks the grain storage port open and pressing it again retracts it, making it convenient for the carer to load pet food.
A food basin whose extension and retraction can be controlled is provided: when a client such as a mobile phone APP sends the corresponding instruction, the core board controls the food basin through the controller so that it locks out or retracts. The robot can thus move flexibly while also achieving a good remote feeding effect.
The indoor positioning and navigation scheme may also use Bluetooth positioning: iBeacon beacons are installed indoors, and the position is calculated from the Received Signal Strength Indicator (RSSI) values of the signals received from at least three beacons, as required by the positioning algorithm. The current position of the Bluetooth receiver is computed by a built-in positioning algorithm interacting with a map engine database, enabling real-time navigation. Ultra-Wideband (UWB) indoor positioning technology may also be used.
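The RSSI-based positioning described above can be sketched in two steps: a log-distance path-loss model converts each beacon's RSSI into a range, and the three range circles are then solved by linearization. The 1 m reference power and path-loss exponent below are typical assumed values, not measured constants.

```python
def rssi_to_distance(rssi, tx_power=-59.0, n=2.0):
    """Log-distance path-loss model: tx_power is the RSSI measured at 1 m
    and n is the path-loss exponent (both assumed values here)."""
    return 10 ** ((tx_power - rssi) / (10 * n))

def trilaterate(p1, d1, p2, d2, p3, d3):
    """Receiver position from three beacon positions and ranges.
    Subtracting the first circle equation from the other two gives a
    linear 2x2 system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # zero iff the three beacons are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

With noisy RSSI the three circles rarely intersect in a point, so a practical system would use more beacons and a least-squares or filtered estimate rather than this exact solve.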
According to the invention, a laser radar is installed; after receiving the surrounding environment information acquired by the laser radar, the core board can complete map building. When the user does not find the pet while viewing the video on the client, the user can send a pet-seeking instruction through a client such as a mobile phone APP; on receiving it, the core board determines the target destination (i.e., the pet's position) by sound source localization, by preferentially searching the positions where the pet is most likely to appear as learned through machine learning, and by traversal search, and then applies the A-star algorithm to autonomously avoid obstacles and navigate to the vicinity of the pet.
The various embodiments in this disclosure are described in a progressive manner; for identical or similar parts, the embodiments may be referred to one another, and each embodiment focuses on its differences from the others.
The scope of the present disclosure is not limited to the above-described embodiments, and it is apparent that various modifications and variations can be made to the present disclosure by those skilled in the art without departing from the scope and spirit of the disclosure. Such modifications and variations are intended to be included herein within the scope of the following claims and their equivalents.

Claims (5)

1. An intelligent pet-companion robot, characterized by comprising a chassis and a body, wherein the chassis is used for bearing the body;
the chassis is provided with a moving device for realizing movement of the intelligent pet-companion robot;
a rotatable camera, a laser radar and a two-way communication horn are arranged outside the body, and a processor and a wireless communication module are arranged inside the body;
the camera is used for collecting image data, the laser radar is used for collecting radar data, and the two-way communication horn comprises a sound pickup and a loudspeaker;
the camera, the laser radar, the two-way communication horn and the wireless communication module are all connected with the processor;
the wireless communication module is used for sending the image data collected by the camera and/or the audio data collected by the sound pickup to a client of the intelligent pet-companion robot;
the two-way communication horn is used for realizing intercom between a user and the pet;
The processor is configured to:
generating a two-dimensional map by using the Gmapping algorithm according to the received radar data, and determining the current position of the intelligent pet-companion robot by using a Monte Carlo algorithm;
when a pet is identified within the range of the camera according to the image data received from the sensor, determining the current position of the pet;
taking the current position of the intelligent pet-companion robot as a starting point and the current position of the pet as a target endpoint, performing global path planning based on the two-dimensional map by using the A-star algorithm to generate a moving path; and
controlling the moving device to move the intelligent pet-companion robot along the moving path to the vicinity of the pet;
the processor is further configured to:
when it is determined from the image data received from the sensor that the pet is not within the range of the camera, traversing all areas of the two-dimensional map except obstacles by using a full-coverage path planning algorithm;
during the traversal, performing real-time recognition on the image data acquired by the camera;
if the pet is identified, stopping the traversal, determining the current position of the pet, and marking the pet's current position and the current time on the two-dimensional map;
generating a probability distribution map of the pet's positions in each time period according to the positions and corresponding times marked over multiple traversals; and
when a pet-seeking instruction is received, sequentially taking each position in the time period corresponding to the current time as the target endpoint according to the probability distribution map and navigating to it, so that accurate pet seeking is achieved without installing any additional device on the pet;
a display device is arranged outside the body and is used for displaying a prerecorded video signal of the pet owner at a preset time point and/or displaying a video signal of the pet owner in a real-time video call, so as to improve interaction between the pet and the owner;
the processor is further configured to:
when it is determined from the image data received from the sensor that the pet is not within the range of the camera, in response to a received pet-seeking instruction, controlling the sound pickup to collect audio data of the pet;
determining the direction of arrival and the distance by using a generalized cross-correlation with phase transform algorithm according to the audio data, and determining the current position of the pet from the determined direction of arrival and distance;
the processor is further configured to:
control the loudspeaker to play a prerecorded sound signal of the pet owner or a sound signal of the pet owner in a real-time voice call, so as to guide the pet to make a sound or approach the intelligent pet-companion robot.
2. The intelligent pet-companion robot according to claim 1, wherein a horizontally rotating gimbal is arranged on the top of the body, and the camera is mounted on the gimbal.
3. The intelligent pet-companion robot according to claim 1, wherein the moving device comprises 4 steering wheels and 4 driving wheels, the bottoms of the driving wheels being arranged at equal heights, for realizing omnidirectional movement.
4. The intelligent pet-companion robot according to claim 1, wherein the body is provided with a grain storage device for storing pet food, and the top of the body is provided with a grain storage self-locking button for controlling the ejection and retraction of the grain storage device.
5. The intelligent pet-companion robot according to claim 4, wherein a food basin is arranged at the bottom of the chassis and is connected with the chassis in an extendable manner, the food basin communicates with the grain storage device, and in response to a food dispensing instruction, a predetermined amount of pet food is dispensed from the grain storage device into the food basin and the food basin is extended.
CN202211223619.XA 2022-10-08 2022-10-08 Intelligent companion robot Active CN115648237B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211223619.XA CN115648237B (en) 2022-10-08 2022-10-08 Intelligent companion robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211223619.XA CN115648237B (en) 2022-10-08 2022-10-08 Intelligent companion robot

Publications (2)

Publication Number Publication Date
CN115648237A CN115648237A (en) 2023-01-31
CN115648237B true CN115648237B (en) 2024-12-13

Family

ID=84985288

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211223619.XA Active CN115648237B (en) 2022-10-08 2022-10-08 Intelligent companion robot

Country Status (1)

Country Link
CN (1) CN115648237B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118737385A (en) * 2023-03-30 2024-10-01 两氢一氧(杭州)数字科技有限公司 Pet health management method, device and pet companion robot
CN118181323A (en) * 2024-04-29 2024-06-14 山东新一代信息产业技术研究院有限公司 Pet nurse robot and application method thereof

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108366343A (en) * 2018-03-20 2018-08-03 珠海市微半导体有限公司 The method that intelligent robot monitors pet
CN212260115U (en) * 2020-02-18 2021-01-01 广州佳可电子科技有限公司 Intelligent pet caring robot

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9070148B2 (en) * 2012-03-30 2015-06-30 Nakia Geller Gold and precious metal buying machine and method
RU127980U1 (en) * 2012-08-10 2013-05-10 Общество с ограниченной ответственностью "Научно-производственное предприятие Связь-Комплекс М" DEVICE FOR SATELLITE TRACKING OF DOG LOCATION (OPTIONS) AND WATERPROOF DESIGN OF THE ELECTRONIC MODULE PLACED ON THE DOG (OPTIONS)
DE102014205703A1 (en) * 2014-03-27 2015-10-01 Robert Bosch Gmbh Method and device for automatically scheduling and / or controlling a task
KR20170107341A (en) * 2016-03-15 2017-09-25 엘지전자 주식회사 Mobile robot and method for controlling the same
CN105856260A (en) * 2016-06-24 2016-08-17 深圳市鑫益嘉科技股份有限公司 On-call robot
CN206728878U (en) * 2017-05-18 2017-12-12 中航航空电子系统股份有限公司北京技术研发中心 A kind of pet accompanies robot device
CN107283435B (en) * 2017-06-15 2020-10-16 重庆柚瓣科技有限公司 Specific information collection system of endowment robot
CN107659608A (en) * 2017-07-24 2018-02-02 北京小豆儿机器人科技有限公司 A kind of emotional affection based on endowment robot shows loving care for system
CN112775979B (en) * 2019-11-08 2022-06-10 珠海一微半导体股份有限公司 Control method of pet accompanying robot, pet accompanying robot and chip
CN113080094B (en) * 2020-01-09 2023-02-10 广东顺德雷舜信息科技有限公司 Pet companion method, pet companion device and computer equipment
CN111597942B (en) * 2020-05-08 2023-04-18 上海达显智能科技有限公司 Smart pet training and accompanying method, device, equipment and storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108366343A (en) * 2018-03-20 2018-08-03 珠海市微半导体有限公司 The method that intelligent robot monitors pet
CN212260115U (en) * 2020-02-18 2021-01-01 广州佳可电子科技有限公司 Intelligent pet caring robot

Also Published As

Publication number Publication date
CN115648237A (en) 2023-01-31

Similar Documents

Publication Publication Date Title
US20210260773A1 (en) Systems and methods to control an autonomous mobile robot
US20230380383A1 (en) Animal wearable devices, systems, and methods
AU2019208265B2 (en) Moving robot, method for controlling the same, and terminal
JP4460528B2 (en) IDENTIFICATION OBJECT IDENTIFICATION DEVICE AND ROBOT HAVING THE SAME
CN109998421B (en) Mobile cleaning robot assembly and durable mapping
US20210100160A1 (en) Moving robot and method of controlling the same
CN115648237B (en) Intelligent companion robot
US9517559B2 (en) Robot control system, robot control method and output control method
CN112739244A (en) Mobile Robot Cleaning System
US10638028B2 (en) Apparatus, method, recording medium, and system for capturing coordinated images of a target
EP2369436B1 (en) Robot apparatus, information providing method carried out by the robot apparatus and computer storage media
EP2068275A2 (en) Communication robot
KR20200015877A (en) Moving robot and contorlling method thereof
US11260533B2 (en) Robot and robot system comprising same
CN113787517B (en) Self-moving robot control method, device, equipment and readable storage medium
CN105979442A (en) Noise suppression method and device and mobile device
JP7120254B2 (en) Information processing device, information processing method, and program
JP2018030223A (en) Searching robot
CN106172059A (en) Pet feeding robot
WO2018108176A1 (en) Robot video call control method, device and terminal
JP2020502675A (en) Navigation and self-locating method for autonomously moving processing device
WO2025092478A1 (en) Intelligent locomotion device control method based on charging pile and related device
US11986959B2 (en) Information processing device, action decision method and program
US20200238531A1 (en) Artificial intelligence moving robot and method for controlling the same
JP2009131914A (en) Robot control system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant