
WO2025037529A1 - Control device and control method - Google Patents


Info

Publication number
WO2025037529A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving body
autonomous moving
control unit
autonomous mobile
autonomous
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
PCT/JP2024/026975
Other languages
French (fr)
Japanese (ja)
Inventor
佑弥 宗像
達馬 櫻井
祐介 川部
喬俊 狩野
藍 舘石
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Publication of WO2025037529A1


Classifications

    • A — HUMAN NECESSITIES
    • A63 — SPORTS; GAMES; AMUSEMENTS
    • A63H — TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H11/00 — Self-movable toy figures
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 — Controls for manipulators
    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B25 — HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J — MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J5/00 — Manipulators mounted on wheels or on carriages

Definitions

  • This technology relates to a control device and a control method, and in particular to a control device and a control method that enable diversification of operations related to charging an autonomous mobile object.
  • The robot described in Patent Document 1 depends only on its charge level for its choice of actions, and can only select from pre-prepared actions. The range of actions that this robot can perform while charging is therefore limited, making it difficult to differentiate it from other robots.
  • The control device of one aspect of the present technology includes an operation control unit that controls the operation of an autonomous mobile body during charging based on the personality of the autonomous mobile body and the remaining charge.
  • In the control method of one aspect of the present technology, a control device controls the operation of an autonomous moving body during charging based on the autonomous moving body's personality and remaining charge.
  • In one aspect of the present technology, the operation of an autonomous mobile body during charging is controlled based on the personality of the autonomous mobile body and the remaining charge.
  • FIG. 1 is a block diagram showing an embodiment of an information processing system to which the present technology is applied.
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of an autonomous moving body.
  • FIG. 3 illustrates a configuration example of an actuator equipped in an autonomous moving body.
  • FIG. 4 is a diagram for explaining functions of a display provided in an autonomous moving body.
  • FIG. 5 is a diagram illustrating an example of the operation of an autonomous moving body.
  • FIG. 6 is a schematic diagram of an autonomous moving body in a charging state.
  • FIG. 7 is a block diagram showing an example of a functional configuration of an autonomous moving body and a charging stand.
  • FIG. 8 is a block diagram showing an example of the functional configuration of a main control unit of the autonomous moving body.
  • FIG. 9 is a flowchart for explaining a charging support process of an autonomous moving body.
  • FIG. 10 is a table showing examples of motions that the autonomous moving body executes on the charging stand.
  • FIG. 11 is a table showing an example of characteristics of an autonomous moving body's behavior on a charging stand with respect to the remaining charge and the innate personality of the autonomous moving body.
  • FIG. 12 shows an example of a method for correcting the motion of an autonomous moving body with respect to its innate personality when the remaining charge is less than 10%.
  • FIG. 13 shows an example of a method for correcting the motion of an autonomous moving body with respect to its innate personality when the remaining charge is 26% or more.
  • FIG. 14 shows examples of motions of an autonomous moving body on a charging stand specific to each innate personality.
  • FIG. 15 is a table showing an example of characteristics of an autonomous moving body's behavior on a charging stand with respect to the remaining charge and acquired personality.
  • FIG. 16 shows an example of a method for correcting the motion of an autonomous moving body with respect to its acquired personality when the remaining charge is less than 10%.
  • FIG. 17 shows an example of a method for correcting the motion of an autonomous moving body with respect to its acquired personality when the remaining charge is 26% or more.
  • FIG. 18 shows examples of motions of an autonomous moving body on a charging stand specific to each acquired personality.
  • FIG. 19 is a flowchart for explaining a first embodiment of a charging stand return process of an autonomous moving body.
  • FIG. 20 is a flowchart for explaining a first embodiment of support processing for another autonomous moving body by an autonomous moving body.
  • FIG. 21 is a flowchart for explaining a second embodiment of support processing for another autonomous moving body by an autonomous moving body.
  • FIG. 22 is a diagram illustrating a wireless power supply unit of an autonomous moving body.
  • FIG. 23 is a diagram illustrating an example of a method for supplying power between autonomous moving bodies.
  • FIG. 24 is a flowchart for explaining a second embodiment of a charging stand return process of an autonomous moving body.
  • FIG. 25 illustrates an example of the configuration of a computer.
  • FIG. 1 is a block diagram showing an embodiment of an information processing system 1 to which the present technology is applied.
  • the information processing system 1 includes autonomous mobile units 11-1 to 11-n, information processing terminals 12-1 to 12-n, and an information processing server 13.
  • Hereinafter, when there is no need to distinguish between the autonomous mobile units 11-1 to 11-n, they will simply be referred to as the autonomous mobile unit 11.
  • Similarly, when there is no need to distinguish between the information processing terminals 12-1 to 12-n, they will simply be referred to as the information processing terminal 12.
  • Communication is possible between each autonomous mobile body 11 and the information processing server 13, between each information processing terminal 12 and the information processing server 13, between each autonomous mobile body 11 and each information processing terminal 12, between each autonomous mobile body 11, and between each information processing terminal 12, via the network 21.
  • direct communication is also possible between each autonomous mobile body 11 and each information processing terminal 12, between each autonomous mobile body 11, and between each information processing terminal 12, without going through the network 21.
  • the autonomous mobile unit 11 is an information processing device that recognizes its own and its surrounding situations based on collected sensor data, etc., and autonomously selects and executes various actions according to the situation. Unlike a robot that simply performs actions according to the user's instructions, one of the features of the autonomous mobile unit 11 is that it autonomously executes appropriate actions according to the situation.
  • the autonomous mobile body 11 can, for example, perform user recognition and object recognition based on captured images, and perform various autonomous actions according to the recognized user, object, etc.
  • the autonomous mobile body 11 can also, for example, perform voice recognition based on the user's speech, and perform actions based on the user's instructions, etc.
  • the autonomous mobile body 11 performs pattern recognition learning to acquire the ability to recognize users and objects.
  • the autonomous mobile body 11 can perform pattern recognition learning relating to objects, etc., not only by supervised learning based on given learning data, but also by dynamically collecting learning data based on instructions from a user, etc.
  • the autonomous moving body 11 can also be disciplined by the user.
  • Discipline of the autonomous moving body 11 is broader than discipline in the general sense, such as teaching the autonomous moving body 11 rules and prohibitions and having it memorize them; it refers to changes in the autonomous moving body 11 that the user can sense as a result of the user's interaction with it.
  • The shape, capabilities, desire levels, and the like of the autonomous mobile body 11 can be designed appropriately according to its purpose and role.
  • the autonomous mobile body 11 is composed of an autonomous mobile robot that moves autonomously within a space and performs various operations.
  • the autonomous mobile body 11 is composed of an autonomous mobile robot that has a shape and operational capabilities that mimic those of an animal such as a human or a dog.
  • the autonomous mobile body 11 is composed of a vehicle or other device that has the ability to communicate with a user.
  • each autonomous mobile body 11 has its own unique personality.
  • the personality of the autonomous mobile body 11 is, for example, the internal personality of the autonomous mobile body 11, and is expressed by the behavior of the autonomous mobile body 11.
  • the personality of the autonomous mobile body 11 includes characteristics (e.g., personality, disposition) and features expressed by the behavior of the autonomous mobile body 11.
  • the behavior of the autonomous mobile body 11 is expressed not only by the movement of each part of the autonomous mobile body 11, but also by facial expressions, voice, and the like.
  • the behavior of the autonomous mobile body 11 includes visual, auditory, and tactile movements and changes that are manifested externally.
  • the personality of the autonomous moving body 11 includes innate personality and acquired personality.
  • the innate personality is a personality that is set before the autonomous moving body 11 starts operating, and is represented by, for example, various parameters that are set in advance before the autonomous moving body 11 is shipped. For example, one of multiple types of innate personality is set in advance for each autonomous moving body 11.
  • the acquired personality is a personality acquired through experiences after the autonomous mobile body 11 is put into operation and through interactions with the user, such as training, and is represented, for example, by various parameters that are set or changed after the autonomous mobile body 11 is put into operation. For example, multiple autonomous mobile bodies 11 with the same innate personality will come to have different personalities due to acquired personality.
  • the information processing terminal 12 communicates with the information processing server 13 via the network 21 or directly with the autonomous mobile body 11 to collect various data related to the autonomous mobile body 11, present the data to the user, or give instructions to the autonomous mobile body 11.
  • the information processing server 13 collects various data from each autonomous mobile body 11 and each information processing terminal 12, provides various data to each autonomous mobile body 11 and each information processing terminal 12, and controls the operation of each autonomous mobile body 11.
  • the information processing server 13 can also perform pattern recognition learning and processing corresponding to user discipline, similar to the autonomous mobile body 11, based on the data collected from each autonomous mobile body 11 and each information processing terminal 12.
  • the information processing server 13 supplies the above-mentioned applications and various data related to each autonomous mobile body 11 to each information processing terminal 12.
  • Network 21 may be composed of, for example, public line networks such as the Internet, telephone line networks, and satellite communication networks, as well as various LANs (Local Area Networks) including Ethernet (registered trademark), and WANs (Wide Area Networks).
  • Network 21 may also include dedicated line networks such as IP-VPN (Internet Protocol-Virtual Private Network).
  • Network 21 may also include wireless communication networks such as Wi-Fi (registered trademark) and Bluetooth (registered trademark).
  • the configuration of the information processing system 1 can be flexibly changed depending on the specifications, operation, etc.
  • the autonomous mobile body 11 may communicate information with various external devices in addition to the information processing terminal 12 and the information processing server 13.
  • the above external devices can include, for example, servers that transmit weather, news, and other service information, and various home appliances owned by the user.
  • the autonomous mobile bodies 11 and the information processing terminals 12 do not necessarily have to have a one-to-one relationship, and may have, for example, a many-to-many, many-to-one, or one-to-many relationship.
  • FIG. 2 is a diagram showing an example of the hardware configuration of the autonomous mobile unit 11.
  • the autonomous mobile unit 11 is a dog-like quadruped robot equipped with a head, a body, four legs, and a tail.
  • the autonomous moving body 11 is equipped with two displays, a display 51L and a display 51R, on its head.
  • Hereinafter, when there is no need to distinguish between the display 51L and the display 51R, they will simply be referred to as the display 51.
  • the autonomous moving body 11 also includes various sensors.
  • the autonomous moving body 11 includes a microphone 52, a camera 53, a ToF (Time Of Flight) sensor 54, a human presence sensor 55, a distance measurement sensor 56, a touch sensor 57, an illuminance sensor 58, a sole button 59, and an inertial sensor 60.
  • the autonomous mobile body 11 is equipped with, for example, four microphones 52 on its head.
  • Each microphone 52 collects surrounding sounds including, for example, the user's speech and surrounding environmental sounds. Furthermore, by providing multiple microphones 52, surrounding sounds can be collected with high sensitivity and sound source localization becomes possible.
  • the autonomous mobile body 11 is equipped with two wide-angle cameras 53, for example, at the nose and waist, which capture images of the autonomous mobile body 11's surroundings.
  • the camera 53 located at the nose captures images within the autonomous mobile body 11's forward field of view (i.e., the dog's field of view).
  • the camera 53 located at the waist captures images of the surroundings centered around the upper part of the autonomous mobile body 11.
  • the autonomous mobile body 11 can extract feature points of the ceiling, for example, based on images captured by the camera 53 located at the waist, and achieve SLAM (Simultaneous Localization and Mapping).
  • the ToF sensor 54 is provided, for example, at the tip of the nose and detects the distance to an object that is in front of the head.
  • the ToF sensor 54 enables the autonomous mobile body 11 to detect the distance to various objects with high accuracy, and to realize operations according to the relative position to objects, including the user, obstacles, etc.
  • the human presence sensor 55 is placed, for example, on the chest and detects the presence of the user or a pet kept by the user. By detecting an animal object in front of the autonomous mobile body 11 using the human presence sensor 55, the autonomous mobile body 11 can realize various actions toward the animal object, such as actions corresponding to emotions such as interest, fear, or surprise.
  • the distance sensor 56 is placed, for example, on the chest and detects the condition of the floor surface in front of the autonomous mobile body 11.
  • the distance sensor 56 allows the autonomous mobile body 11 to accurately detect the distance to an object present on the floor surface in front of it, and to realize operations according to the relative position of the object.
  • the touch sensor 57 is arranged in areas where the user is likely to touch the autonomous moving body 11, such as the top of the head, under the chin, and on the back, and detects contact (touch) by the user.
  • the touch sensor 57 is composed of, for example, a capacitive or pressure-sensitive touch sensor.
  • the autonomous moving body 11 can detect contact actions by the user, such as touching, stroking, tapping, and pushing, using the touch sensor 57, and can perform an action according to the contact action.
  • By arranging the touch sensors 57 in a line or a plane on each part, it becomes possible to detect the position touched within each part.
  • the illuminance sensor 58 is located, for example, on the back of the head at the base of the tail, and detects the illuminance of the space in which the autonomous mobile body 11 is located.
  • the autonomous mobile body 11 can detect the surrounding brightness using the illuminance sensor 58 and perform operations according to the brightness.
  • the sole buttons 59 are, for example, located on the areas corresponding to the paw pads of each of the four legs, and detect whether the bottom surfaces of the legs of the autonomous mobile body 11 are in contact with the floor.
  • The sole buttons 59 enable the autonomous mobile body 11 to detect contact or non-contact with the floor surface and, for example, to know when it has been picked up by a user.
  • The inertial sensors 60 are disposed, for example, in the head and torso, respectively, and detect physical quantities such as the speed, acceleration, and rotation of the head and torso.
  • the inertial sensor 60 is composed of a six-axis sensor that detects acceleration and angular velocity on the X-axis, Y-axis, and Z-axis.
  • the autonomous mobile body 11 can detect the movement of the head and torso with high accuracy using the inertial sensor 60, and realize operation control according to the situation.
  • the configuration of the sensors equipped in the autonomous mobile body 11 can be flexibly changed depending on the specifications, operation, etc.
  • The autonomous mobile body 11 may further include various sensors and devices, such as a temperature sensor, a geomagnetic sensor, and a GNSS (Global Navigation Satellite System) signal receiver.
  • FIG. 3 shows an example of the configuration of an actuator 71 provided in the autonomous mobile body 11.
  • The autonomous mobile body 11 has, for example, two rotational degrees of freedom each in the ears and the tail and one in the mouth, in addition to those of the head, legs, and hips, for a total of 22 rotational degrees of freedom.
  • the autonomous mobile body 11 has three degrees of freedom in its head, allowing it to perform both nodding and tilting its head.
  • the autonomous mobile body 11 can reproduce the swinging motion of its hips using the actuator 71 in its hips, allowing it to achieve natural and flexible movements that are closer to those of a real dog.
  • the autonomous mobile body 11 may achieve the above 22 degrees of rotational freedom by combining, for example, a single-axis actuator and a two-axis actuator.
  • single-axis actuators may be used in the elbows and knees of the legs, and two-axis actuators may be used in the shoulders and the base of the thighs.
  • the autonomous mobile body 11 is equipped with two displays, 51R and 51L, which correspond to the right and left eyes, respectively.
  • Each display 51 has a function to visually express the eye movements and emotions of the autonomous mobile body 11.
  • each display 51 can express the movements of the eyeballs, pupils, and eyelids according to emotions and actions, thereby producing natural expressions and movements similar to those of real animals such as dogs, and can express the line of sight and emotions of the autonomous mobile body 11 with high precision and flexibility.
  • the user can intuitively grasp the state of the autonomous mobile body 11 from the eyeball movements displayed on the display 51.
  • each display 51 is realized, for example, by two independent OLEDs (Organic Light Emitting Diodes).
  • By using these displays, a more natural appearance can be realized than when a pair of eyeballs is represented by a single flat display or when the two eyeballs are represented by two independent flat displays.
  • the autonomous mobile body 11 can reproduce movements and emotional expressions that are closer to those of real living creatures by controlling the movements of the joints and eyeballs with high precision and flexibility, as shown in Figure 5.
  • the autonomous moving body 11 is charged while placed on a platform-shaped charging stand 101. As described below, the autonomous moving body 11 performs operations on the charging stand 101 based on its personality and remaining charge.
  • Figures 5 and 6 show a simplified external structure of the autonomous moving body 11.
  • Fig. 7 illustrates only the configuration necessary for the processing described later; configurations not needed for that processing are omitted as appropriate.
  • The autonomous mobile body 11 includes a display 51, an input unit 121, a main control unit 122, a memory 123, a pupil display control unit 124, a mouth drive control unit 125, a mouth drive unit 126, a neck drive control unit 127, a neck drive unit 128, a leg drive control unit 129, a leg drive unit 130, a tail drive control unit 131, a tail drive unit 132, an audio control unit 133, a speaker 134, a wireless communication module 135, a power supply control unit 136, and a rechargeable battery 137.
  • the charging stand 101 includes a charging control unit 201, a charging circuit 202, and a display unit 203.
  • the autonomous moving body 11 and the charging stand 101 are connected via a connection connector 102.
  • the input unit 121 includes the microphone 52, camera 53, ToF sensor 54, human presence sensor 55, distance measurement sensor 56, touch sensor 57, illuminance sensor 58, sole button 59, and inertial sensor 60 described above, and has the function of collecting various sensor data related to the user and the surrounding conditions.
  • the input unit 121 also includes input devices such as switches and buttons. The input unit 121 supplies the collected sensor data and input data input via the input devices to the main control unit 122.
  • the main control unit 122 includes a processor such as a CPU (Central Processing Unit), and performs various types of information processing and controls each part of the autonomous mobile body 11.
  • the main control unit 122 recognizes the situation in which the autonomous mobile body 11 is located based on data supplied from each part of the autonomous mobile body 11.
  • The main control unit 122 controls the operation of the autonomous mobile body 11 by controlling the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the audio control unit 133 based on the recognition results of the situation in which the autonomous mobile body 11 is placed.
  • the main control unit 122 executes learning processes related to the behavior of the autonomous mobile body 11 and recognition of the surrounding conditions, etc., based on data supplied from each part of the autonomous mobile body 11.
  • the main control unit 122 controls the power supply of the autonomous moving body 11 by controlling the power supply control unit 136.
  • Memory 123 includes, for example, non-volatile memory and volatile memory, and stores various programs and data.
  • the pupil display control unit 124 controls the display 51 to control the movement of the left and right eyes displayed on the display 51.
  • the mouth drive control unit 125 controls the mouth movement of the autonomous mobile body 11 by controlling the mouth drive unit 126.
  • the mouth drive control unit 125 supplies drive data (hereinafter referred to as mouth drive data) indicating the movement angle, movement speed, etc. of the actuator 71 provided in the mouth drive unit 126 to the main control unit 122.
  • the mouth drive unit 126 includes an actuator 71 that drives the mouth of the autonomous mobile body 11.
  • the pupil display control unit 124 and the mouth drive control unit 125 control the movement of the eyes and mouth of the autonomous mobile body 11, thereby changing the facial expression of the autonomous mobile body 11.
  • the neck drive control unit 127 controls the neck movement of the autonomous mobile body 11 by controlling the neck drive unit 128.
  • the neck drive control unit 127 supplies drive data (hereinafter referred to as neck drive data) indicating the movement angle, movement speed, etc. of the actuator 71 provided in the neck drive unit 128 to the main control unit 122.
  • the neck drive unit 128 includes an actuator 71 that drives the neck joint of the autonomous mobile body 11.
  • the leg drive control unit 129 controls the movement of each leg of the autonomous mobile body 11 by controlling the leg drive unit 130.
  • the leg drive control unit 129 supplies drive data (hereinafter referred to as leg drive data) indicating the movement angle, movement speed, etc. of the actuator 71 provided in the leg drive unit 130 to the main control unit 122.
  • the leg drive unit 130 includes actuators 71 that drive the joints of each leg of the autonomous mobile body 11.
  • the tail drive control unit 131 controls the tail movement of the autonomous mobile body 11 by controlling the tail drive unit 132.
  • the tail drive control unit 131 supplies drive data (hereinafter referred to as tail drive data) indicating the operating angle, operating speed, etc. of the actuator 71 equipped in the tail drive unit 132 to the main control unit 122.
  • the tail drive unit 132 includes an actuator 71 that drives the tail of the autonomous mobile body 11.
  • the audio control unit 133 generates and processes audio data corresponding to the audio output by the autonomous mobile body 11, and controls the characteristics and output timing of the audio.
  • the sounds output by the autonomous mobile body 11 include, for example, sounds by which the autonomous mobile body 11 communicates with the user and expresses its state or emotion, operation sounds accompanying the operation of the autonomous mobile body 11, and performance sounds for enhancing the performance of the autonomous mobile body 11.
  • the sounds by which the autonomous mobile body 11 communicates with the user and expresses its state or emotion include, for example, cries, conversation sounds, talking in its sleep, etc.
  • Operation sounds include, for example, cries, footsteps, etc.
  • Performance sounds include, for example, sound effects, music, etc.
  • the sounds output by the autonomous mobile body 11 include, for example, sounds that are output or change in response to external stimuli (hereinafter referred to as stimulus-responsive sounds), and sounds that are output or change in accordance with (linked to) the operation of the autonomous mobile body 11.
  • Stimulus-responsive sounds include, for example, cries, conversation sounds, talking in one's sleep, etc.
  • Sounds that are output or change in accordance with the operation of the autonomous mobile body 11 include, for example, operation sounds and performance sounds.
  • The characteristics of the sound to be controlled include, for example, the type of sound (e.g., a cry, conversation, etc.), its content, its features (e.g., pitch, volume, timbre, etc.), and its quality.
  • the content of the sound includes the content of the conversation.
  • the speaker 134 outputs the various sounds described above based on the audio data supplied from the audio control unit 133.
  • the wireless communication module 135 communicates with other autonomous mobile bodies 11, the information processing terminal 12, and the information processing server 13, either via the network 21 or without the network 21, and transmits and receives various types of data.
  • the wireless communication module 135 supplies the received data to the main control unit 122, and obtains data to be transmitted from the main control unit 122.
  • the communication method of the wireless communication module 135 is not particularly limited and can be flexibly changed according to the specifications and operation.
  • the power supply control unit 136 controls the supply of power stored in the rechargeable battery 137 to each unit of the autonomous mobile body 11.
  • the power supply control unit 136 detects the remaining charge of the rechargeable battery and supplies remaining charge data indicating the detection result to the main control unit 122.
  • the charging control unit 201 controls the charging of the rechargeable battery 137 of the autonomous moving body 11 by the charging circuit 202.
  • the charging circuit 202 charges the rechargeable battery 137 of the autonomous mobile unit 11 under the control of the charging control unit 201.
  • the display unit 203 includes, for example, an LED (Light Emitting Diode) and displays the charging status of the rechargeable battery 137 of the autonomous mobile unit 11.
  • Fig. 8 shows an example of the functional configuration of the main control unit 122 in Fig. 7.
  • the main control unit 122 includes a recognition unit 151, a learning unit 152, and an operation control unit 153.
  • The recognition unit 151 recognizes the situation in which the autonomous mobile body 11 is placed, based on the sensor data and input data supplied from the input unit 121, the received data supplied from the wireless communication module 135, the mouth drive data supplied from the mouth drive control unit 125, the neck drive data supplied from the neck drive control unit 127, the leg drive data supplied from the leg drive control unit 129, and the tail drive data supplied from the tail drive control unit 131.
  • the situation in which the autonomous mobile body 11 is placed includes, for example, the situation of itself and the surroundings.
  • the situation of itself includes, for example, the state and movement of the autonomous mobile body 11.
  • the surrounding situation includes, for example, the state, movement, and instructions of people in the vicinity such as a user, the state and movement of living things in the vicinity such as pets, the state and movement of surrounding objects, time, place, and the surrounding environment.
  • the surrounding objects include, for example, other autonomous mobile bodies.
  • the recognition unit 151 performs, for example, person identification, facial expression and gaze recognition, emotion recognition, object recognition, action recognition, spatial region recognition, color recognition, shape recognition, marker recognition, obstacle recognition, step recognition, brightness recognition, temperature recognition, voice recognition, word understanding, position estimation, posture estimation, etc.
  • the recognition unit 151 also has a function of estimating and understanding the situation based on the various recognized information. For example, the recognition unit 151 recognizes stimuli given to the autonomous mobile body 11 from the outside and the person who gave the stimuli.
  • the stimuli to be recognized include, for example, visual stimuli, auditory stimuli, and tactile stimuli. At this time, the recognition unit 151 may make a comprehensive estimation of the situation using knowledge stored in advance.
  • the recognition unit 151 supplies data indicating the result of the recognition or estimation of the situation (hereinafter referred to as situation data) to the learning unit 152 and the operation control unit 153. In addition, the recognition unit 151 registers the situation data in the behavior history data stored in the memory 123.
  • the behavior history data is data that indicates the history of the behavior of the autonomous mobile body 11.
  • the behavior history data includes, for example, items such as the date and time when the behavior started, the date and time when the behavior ended, the trigger for performing the behavior, the location where the behavior was instructed (if a location was instructed), the situation when the behavior was performed, and whether the behavior was completed (whether the behavior was performed to the end).
  • When an action is executed as a result of, for example, a user instruction, the content of that instruction is registered as the trigger for the action.
  • When an action is executed as a result of, for example, a specific situation occurring, the content of that situation is registered as the trigger.
  • When an action is executed with respect to, for example, an object indicated by the user or a recognized object, the type of that object is registered.
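  • As a concrete illustration, the behavior history data described above could be represented as follows. This is a minimal Python sketch; the class and field names (BehaviorRecord, trigger, and so on) are illustrative assumptions, not identifiers from this application.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Optional

    @dataclass
    class BehaviorRecord:
        """One entry of the behavior history data (hypothetical field names)."""
        started_at: datetime            # date and time when the behavior started
        ended_at: Optional[datetime]    # date and time when the behavior ended
        trigger: str                    # user instruction, situation, or object type
        location: Optional[str]         # location where the behavior was instructed, if any
        situation: dict = field(default_factory=dict)  # situation when the behavior was performed
        completed: bool = False         # whether the behavior was performed to the end

    # Example: an action triggered by a user instruction.
    record = BehaviorRecord(
        started_at=datetime.now(),
        ended_at=None,
        trigger="user_instruction:shake_hands",
        location="living_room",
        situation={"owner_present": True},
    )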
  • The learning unit 152 learns the situation, the behavior, and the effect of the behavior on the environment based on one or more of the sensor data and input data supplied from the input unit 121, the received data supplied from the wireless communication module 135, the mouth drive data supplied from the mouth drive control unit 125, the neck drive data supplied from the neck drive control unit 127, the leg drive data supplied from the leg drive control unit 129, the tail drive data supplied from the tail drive control unit 131, the situation data supplied from the recognition unit 151, and the data related to the behavior of the autonomous mobile body 11 supplied from the operation control unit 153.
  • the learning unit 152 performs the above-mentioned pattern recognition learning, and learns behavior patterns corresponding to the user's discipline.
  • the learning unit 152 changes the personality of the autonomous mobile body 11, particularly the acquired personality, by performing a learning process based on experience and discipline after the start of operation.
  • the learning unit 152 realizes the above learning by using a machine learning algorithm such as deep learning.
  • the learning algorithm adopted by the learning unit 152 is not limited to the above example, and can be designed as appropriate.
  • the learning unit 152 supplies data indicating the learning results (hereinafter referred to as learning result data) to the operation control unit 153 and stores the data in the memory 123.
  • the operation control unit 153 controls the operation of the autonomous mobile body 11 based on the recognized or estimated situation and the learning result data.
  • the operation control unit 153 supplies data on the behavior of the autonomous mobile body 11 to the learning unit 152 and registers the data in the behavior history data stored in the memory 123.
  • the operation control unit 153 controls the internal state of the autonomous mobile body 11 based on the recognized or estimated situation and the learning result data. For example, the operation control unit 153 controls the state transition of the internal state of the autonomous mobile body 11.
  • the internal state of the autonomous mobile body 11 is an internal state that is not visible to the outside of the autonomous mobile body 11, and is set based on at least one of the autonomous mobile body 11's behavior, physical condition, emotions, age, remaining charge, etc., for example.
  • the physical condition of the autonomous mobile body 11 includes, for example, hunger level.
  • the hunger level is set based on, for example, the time that has elapsed since the autonomous mobile body 11 took the action of eating food.
  • the age of the autonomous mobile body 11 is set based on, for example, the date of purchase of the autonomous mobile body 11, or the time that has elapsed since the power was first turned on, or the total operating time of the autonomous mobile body 11.
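  • To make the internal-state examples above concrete, here is a minimal sketch of how the hunger level and age could be derived from elapsed time; the function names, thresholds, and formulas are assumptions for illustration only.

    from datetime import datetime

    # Minimal sketch of deriving parts of the internal state described above.
    # Function names, thresholds, and formulas are illustrative assumptions.

    def hunger_level(last_fed_at: datetime, now: datetime) -> float:
        """Hunger grows with the time elapsed since the 'eating food' action (0.0 to 1.0)."""
        hours = (now - last_fed_at).total_seconds() / 3600.0
        return min(hours / 12.0, 1.0)  # assume fully hungry after 12 hours

    def age_in_days(first_powered_on: datetime, now: datetime) -> int:
        """Age based on the time elapsed since the power was first turned on."""
        return (now - first_powered_on).days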
  • the operation control unit 153 controls the operation of the autonomous mobile body 11 by controlling the pupil display control unit 124, mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, tail drive control unit 131, and audio control unit 133 based on at least one of the recognized or estimated situation, the learning result data, and the internal state of the autonomous mobile body 11. For example, the operation control unit 153 executes rotation control of each actuator 71, display control of the display 51, audio output control from the speaker 134, etc.
  • the actions of the autonomous mobile body 11 include, for example, actions necessary for the operation of the autonomous mobile body 11, as well as actions expressing will or emotion, and performances. Hereinafter, the latter actions will be referred to as motions.
  • Hereinafter, operations that the autonomous mobile body 11 executes for various purposes will be referred to as actions.
  • An action may include only the actions necessary for the operation of the autonomous mobile body 11, or may include one or more types of motion.
  • the purpose that is the target of an action is not particularly limited. For example, it may include not only specific purposes such as moving to a target location or transporting a specific object, but also abstract purposes such as expressing will or emotion.
  • Data for realizing a motion (hereinafter referred to as motion data) is, for example, created in advance using an authoring tool and stored in the memory 123 when the autonomous moving body 11 is manufactured.
  • Alternatively, the motion data is downloaded to the autonomous moving body 11 from the information processing terminal 12 or the information processing server 13.
  • the motion data includes, for example, data continuously describing in a time series the movement of the eyes of the display 51, the target joint angles and joint movement speeds of each joint of each drive unit (each actuator 71) of the autonomous mobile body 11, and control values such as the type and volume of the sound to be output.
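  • As an illustration of the time-series motion data just described, the following Python sketch shows one possible representation; all class and field names are hypothetical, since the actual data format is not specified in this document.

    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class MotionKeyframe:
        """One time step of a motion (hypothetical representation)."""
        t: float                          # time offset in seconds
        target_angles: Dict[str, float]   # target joint angle per actuator, in degrees
        joint_speeds: Dict[str, float]    # joint movement speed per actuator
        eye_pattern: str                  # eye animation shown on the displays 51
        sound: str                        # type of sound to output ("" for none)
        volume: float                     # output volume

    @dataclass
    class Motion:
        name: str                         # e.g. "yawning", "looking_around"
        keyframes: List[MotionKeyframe]   # continuous time-series description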
  • The operation control unit 153 controls the pupil display control unit 124, mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, tail drive control unit 131, and audio control unit 133 based on the motion data, thereby causing the autonomous mobile body 11 to execute a motion.
  • The operation control unit 153 can cause the autonomous mobile body 11 to take any posture by instructing the mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, and tail drive control unit 131 on the target joint angles of each drive unit between successive motions.
  • In step S1, the autonomous mobile body 11 performs autonomous operation under the control of the operation control unit 153. That is, the autonomous mobile body 11 performs various operations autonomously.
  • The content of the operation is not particularly limited.
  • In step S2, the operation control unit 153 determines whether charging is necessary. Specifically, the operation control unit 153 detects the remaining charge of the autonomous moving body 11 based on the remaining charge data from the power supply control unit 136. If the remaining charge is equal to or greater than the required charging level, the operation control unit 153 determines that charging is not necessary, and the process returns to step S1.
  • The required charging level can be set to any value, for example, 26%.
  • Thereafter, the processes of steps S1 and S2 are repeated until it is determined in step S2 that charging is necessary.
  • If the remaining charge is less than the required charging level in step S2, the operation control unit 153 determines that charging is necessary, and the process proceeds to step S3.
  • In step S3, the autonomous mobile body 11 returns to the charging stand 101 under the control of the operation control unit 153. In other words, the autonomous mobile body 11 moves to the charging stand 101.
  • In step S4, the autonomous mobile body 11 starts charging on the charging stand 101 under the control of the operation control unit 153.
  • In step S5, the operation control unit 153 determines whether or not to stop operation on the charging stand 101. Specifically, the operation control unit 153 detects the remaining charge of the autonomous mobile body 11 based on the remaining charge data from the power supply control unit 136. If the remaining charge is less than the operation stop level, the operation control unit 153 determines not to stop operation on the charging stand 101, and the process proceeds to step S6.
  • The operation stop level can be set to any value greater than the required charging level, for example, 40%.
  • In step S6, the autonomous mobile body 11 wakes up on the charging stand 101 under the control of the operation control unit 153 and performs an operation according to its remaining charge and personality.
  • Thereafter, the processes of steps S5 and S6 are repeated until it is determined in step S5 that operation on the charging stand 101 is to be stopped.
  • That is, the autonomous moving body 11 operates (behaves) in accordance with its remaining charge and personality while charging, until the remaining charge reaches or exceeds the operation stop level.
  • If the remaining charge is equal to or greater than the operation stop level in step S5, the operation control unit 153 determines to stop operation on the charging stand 101, and the process proceeds to step S7.
  • In step S7, the autonomous mobile body 11 goes to sleep on the charging stand 101 under the control of the operation control unit 153.
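  • The flow of steps S1 to S7 can be summarized in code. The following Python sketch assumes a hypothetical `body` object whose method names (act_autonomously, return_to_charging_stand, and so on) are illustrative; the 26% and 40% thresholds are the example values given above.

    REQUIRED_CHARGING_LEVEL = 26   # % (example value from the description)
    OPERATION_STOP_LEVEL = 40      # % (example value from the description)

    def charging_support_process(body):
        """Sketch of steps S1-S7; the methods on `body` are assumptions."""
        # S1/S2: act autonomously until the remaining charge drops below
        # the required charging level.
        while body.remaining_charge() >= REQUIRED_CHARGING_LEVEL:
            body.act_autonomously()

        body.return_to_charging_stand()   # S3
        body.start_charging()             # S4

        # S5/S6: while charging, keep operating according to the remaining
        # charge and personality until the operation stop level is reached.
        while body.remaining_charge() < OPERATION_STOP_LEVEL:
            body.perform_personality_motion()

        body.sleep_on_charging_stand()    # S7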
  • FIG. 10 shows an example of a motion that the autonomous moving body 11 performs on the charging stand 101.
  • The motions that the autonomous mobile body 11 performs on the charging stand 101 include, for example, “flapping legs,” “looking around,” “yawning,” “smelling,” “tilting head,” “burping,” “sneezing,” “shaking hands,” etc.
  • “Flapping legs” is a motion that includes, for example, throwing out the front legs and moving them up and down.
  • “Looking around” is a motion that includes, for example, turning the head to look around.
  • “Yawning” is a motion that includes turning the head while opening the mouth.
  • “Smelling” is a motion that includes, for example, lifting the head and moving the tip of the nose.
  • “Tilting head” is a motion that includes moving the head from side to side.
  • “Burping” is a motion that includes, for example, opening the mouth to burp and then shaking the head from side to side.
  • “Sneezing” is a motion that includes, for example, bobbing the head up and down while moving the neck back, and then opening the mouth while moving the neck forcefully forward.
  • “Shaking hands” is a motion that includes, for example, raising one front leg high and then lowering it diagonally forward.
  • The operation control unit 153 appropriately selects motions to execute from the above motions and causes the autonomous mobile body 11 to execute them in sequence. For example, the probability that each motion will be selected is set in advance. Note that, as described below, the probability that each motion is selected during charging is changed based on the personality and remaining charge of the autonomous mobile body 11.
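  • A minimal sketch of this probability-based selection follows; the base probabilities and the correction dictionary are illustrative assumptions, with the multipliers standing in for the personality- and charge-based changes described below.

    import random

    # Preset selection probabilities for the motions in the table above
    # (the values themselves are illustrative assumptions).
    base_probabilities = {
        "flapping_legs": 0.15, "looking_around": 0.15, "yawning": 0.15,
        "smelling": 0.15, "tilting_head": 0.10, "burping": 0.10,
        "sneezing": 0.10, "shaking_hands": 0.10,
    }

    def select_motion(correction: dict) -> str:
        """Pick the next motion; `correction` holds per-motion multipliers
        derived from personality and remaining charge, e.g. {"looking_around": 1.2}."""
        weights = {m: p * correction.get(m, 1.0) for m, p in base_probabilities.items()}
        motions, w = zip(*weights.items())
        return random.choices(motions, weights=w, k=1)[0]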
  • The characteristics of the autonomous moving body 11's operation on the charging stand 101 change depending on, for example, the personality of the autonomous moving body 11 and the remaining charge.
  • FIG. 11 shows an example of the characteristics of the autonomous mobile body 11's behavior on the charging stand 101 in relation to the remaining charge and innate personality.
  • the innate personality is classified into four types: “insensitive,” “loyal and sensitive,” “intelligent and cautious,” and “active and sociable.”
  • an insensitive autonomous moving body 11 is not significantly affected by the remaining charge.
  • the operation of an insensitive autonomous moving body 11 on the charging stand 101 does not change significantly whether the remaining charge is high or low.
  • whether the remaining charge is high or low is determined based on a predetermined threshold.
  • This threshold may be variable, and if variable, may be set by the user.
  • the owner is, for example, a user who owns the autonomous mobile unit 11.
  • An intelligent and cautious autonomous mobile body 11 will behave cautiously and not move much when its remaining charge is high.
  • An intelligent and cautious autonomous mobile body 11 will move slowly and make smaller movements when its remaining charge is low.
  • An active and sociable autonomous mobile body 11 will move more agilely and make larger movements when its remaining charge is high.
  • An active and sociable autonomous mobile body 11 will continue to play when its owner or other individuals (other autonomous mobile bodies 11) are present, even when its remaining charge is low.
  • each motion of the autonomous moving body 11 is corrected based on, for example, the remaining charge and the innate personality.
  • FIGS. 12 and 13 show examples of methods for correcting each motion of the autonomous moving body 11 in order to realize the characteristics of the operation of FIG. 11.
  • FIG. 12 shows an example of a method of correcting motion by the pupil display control unit 124, mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, tail drive control unit 131, and operation control unit 153 of the autonomous moving body 11 when the remaining charge is less than 10%.
  • the mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, and tail drive control unit 131 correct the joint movement speed (velocity) to 0.9 times the motion data. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of the mouth, neck, each leg, and tail is slowed down.
  • the mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, and tail drive control unit 131 correct the joint movement speed to 0.9 times the motion data. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of the mouth, neck, each leg, and tail becomes slower.
  • If the owner or another individual is nearby, the mouth drive control unit 125, neck drive control unit 127, and leg drive control unit 129 correct the joint movement speed and target joint angle (amplitude) to 1.1 times the values in the motion data. Other control values of the motion data are not corrected. Therefore, if the owner or another individual is nearby, the movement of the mouth, neck, and each leg becomes faster and larger in each motion.
  • If the owner or another individual is not nearby, the control values of the motion data are not corrected, and each motion is executed as is according to the motion data, resulting in standard movement.
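  • The 0.9× and 1.1× corrections described above amount to scaling control values in the motion data. The following sketch, reusing the hypothetical Motion structure from the earlier sketch, shows one way to apply such multiplicative corrections; it is an illustration under those assumptions, not the actual implementation.

    def correct_motion(motion, speed_factor: float = 1.0, amplitude_factor: float = 1.0):
        """Scale the joint movement speed and target joint angle (amplitude) of
        every keyframe, e.g. speed_factor=0.9 when the remaining charge is below
        10%, or 1.1 for both factors when the owner is nearby."""
        for kf in motion.keyframes:
            kf.joint_speeds = {j: v * speed_factor for j, v in kf.joint_speeds.items()}
            kf.target_angles = {j: a * amplitude_factor for j, a in kf.target_angles.items()}
        return motion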
  • FIG. 13 shows an example of a method of correcting motion by the pupil display control unit 124, mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, tail drive control unit 131, and operation control unit 153 of the autonomous moving body 11 when the remaining charge is 26% or more.
  • The operation control unit 153 corrects the probability of selecting the motion of shaking its head to look for its owner to 1.2 times its normal value. Other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 more frequently performs the motion of shaking its head to look for its owner.
  • The operation control unit 153 corrects the probability that the autonomous mobile body 11 will do nothing to 1.2 times its normal value. Other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 spends more time doing nothing.
  • the mouth drive control unit 125, neck drive control unit 127, and leg drive control unit 129 correct the joint movement speed and target joint angle to 1.1 times the motion data. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of the mouth, neck, and each leg becomes faster and larger.
  • the autonomous mobile body 11 performs a motion specific to each innate personality on the charging stand 101.
  • FIG. 14 shows an example of the motion of the autonomous mobile body 11 on the charging stand 101 that is specific to each innate personality.
  • The insensitive autonomous mobile body 11 performs a motion that includes slowly moving its head up and down on the charging stand 101, as if dozing off.
  • When the loyal and sensitive autonomous mobile body 11 finds its owner from the charging stand 101, it performs a motion that includes raising both hands (front legs) and barking.
  • The intelligent and cautious autonomous mobile body 11 performs a motion on the charging stand 101 that includes turning toward the direction of a sound, then lowering its head and lying prone.
  • The active and sociable autonomous mobile body 11 performs motions on the charging stand 101 that include swinging its head from side to side and flapping its front and back legs.
  • FIG. 15 shows an example of the characteristics of the autonomous mobile body 11's behavior on the charging stand 101 in relation to the remaining charge and acquired personality.
  • The acquired personality is classified into four types: “cute,” “spoiled,” “shy,” and “wild.”
  • the cute autonomous mobile body 11 has a strong desire to express emotions and a whimsical personality. For example, when the cute autonomous mobile body 11 has a high remaining charge, it wags its head or tail to show that it is playful. On the other hand, for example, when the cute autonomous mobile body 11 has a low remaining charge, it moves its body slowly but moves its tail a lot. Also, when the cute autonomous mobile body 11 has a low remaining charge, it looks down and has a dissatisfied look in its eyes.
  • the spoiled autonomous mobile body 11 has a personality that has a strong desire to communicate. For example, when the spoiled autonomous mobile body 11 has a large remaining charge, it moves its head more and more to search for its owner. On the other hand, when the spoiled autonomous mobile body 11 has a small remaining charge, it waits quietly to be charged and only moves more when it finds its owner.
  • a shy autonomous mobile unit 11 has a personality with a strong desire to explore and a weak desire to move. For example, when the shy autonomous mobile unit 11 has a large remaining charge, it will look around the room. On the other hand, when the shy autonomous mobile unit 11 has a small remaining charge, it will quietly wait to be charged. Furthermore, when the shy autonomous mobile unit 11 has a small remaining charge, it will move slowly and tend to lie down.
  • The wild autonomous mobile body 11, for example, has a personality with a strong desire to exercise. For example, when its remaining charge is high, it wants to move its arms and legs and get off the charging stand 101. On the other hand, when its remaining charge is low, it wants to play but runs out of energy. Also, when its remaining charge is low, it may assume a tired posture but will still move its legs actively.
  • each motion of the autonomous moving body 11 is corrected based on, for example, the remaining charge and acquired personality.
  • FIGS. 16 and 17 show examples of methods for correcting each motion of the autonomous moving body 11 in order to realize the characteristics of the operation of FIG. 15.
  • FIG. 16 shows an example of a method of correcting motion by the pupil display control unit 124, mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, tail drive control unit 131, and operation control unit 153 of the autonomous moving body 11 when the remaining charge is less than 10%.
  • the pupil display control unit 124 controls the display 51 so that the eyes appear dissatisfied.
  • The neck drive control unit 127 and leg drive control unit 129 correct the joint movement speed to 0.9 times the values in the motion data, and the tail drive control unit 131 corrects the joint movement speed to 1.1 times. Other control values of the motion data are not corrected. Therefore, in each motion, the autonomous moving body 11 has dissatisfied eyes, the movement of the neck and each leg slows down, and the movement of the tail speeds up.
  • the mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, and tail drive control unit 131 correct the joint movement speed to 1.1 times the motion data. Other control values of the motion data are not corrected. Therefore, if the owner is nearby, the movement of the mouth, neck, legs, and tail will be faster in each motion. On the other hand, if the owner is not nearby, the control values of the motion data are not corrected. Therefore, if the owner is not nearby, each motion is executed as is according to the motion data, resulting in standard movements.
  • The mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, and tail drive control unit 131 correct the joint movement speed to 0.9 times the values in the motion data. The neck drive control unit 127 also controls the neck drive unit 128 so that the body assumes a head-down posture between motions. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of the mouth, neck, legs, and tail slows down, and the body assumes a head-down posture between motions.
  • the pupil display control unit 124 controls the display 51 so that the eyes appear dissatisfied.
  • the neck drive control unit 127 controls the neck drive unit 128 so that the body assumes a posture with its head lowered between motions.
  • the leg drive control unit 129 corrects the joint movement speed and target joint angle to 1.1 times the motion data, and controls the leg drive unit 130 so that the angle in the leg opening direction between motions becomes 1.1 times. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of each leg becomes faster and larger, and the body assumes a posture with its head lowered and its legs widely opened between motions.
  • FIG. 17 shows an example of a method of correcting motion by the pupil display control unit 124, mouth drive control unit 125, neck drive control unit 127, leg drive control unit 129, tail drive control unit 131, and operation control unit 153 of the autonomous moving body 11 when the remaining charge is 26% or more.
  • the neck drive control unit 127 and tail drive control unit 131 correct the joint movement speed and target joint angle to 1.1 times the motion data. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of the neck and tail becomes faster and larger.
  • The operation control unit 153 corrects the probability of selecting the motion of shaking its head to look for its owner to 1.2 times its normal value. Other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 frequently performs the motion of shaking its head to look for its owner.
  • The operation control unit 153 corrects the probability of selecting the motion of moving its head around to 1.2 times its normal value. Other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 moves its head around more often.
  • the mouth drive control unit 125, neck drive control unit 127, and leg drive control unit 129 correct the joint movement speed and target joint angle to 1.1 times the motion data. Other control values of the motion data are not corrected. Therefore, in each motion, the movement of the mouth, neck, and each leg becomes faster and larger.
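The corrections above amount to multiplying selected control values in the motion data (joint movement speeds, target joint angles, selection probabilities) by personality- and charge-dependent factors. The following is a minimal Python sketch of that scaling scheme, assuming a simple keyframe-style motion format; the class, table, and key names are illustrative assumptions, not the implementation described in this document.

    from dataclasses import dataclass

    @dataclass
    class MotionStep:
        joint: str            # e.g. "neck", "tail", "leg_front_left"
        target_angle: float   # degrees, taken from the motion data
        speed: float          # degrees/second, taken from the motion data

    # (speed factor, angle factor) per (charge state, joint); 1.0 means
    # "use the motion data as-is". Values follow the examples above.
    CORRECTIONS = {
        ("low_charge", "neck"): (0.9, 1.0),   # slower neck when charge < 10%
        ("low_charge", "tail"): (1.1, 1.0),   # faster tail when charge < 10%
        ("high_charge", "neck"): (1.1, 1.1),  # faster and larger when >= 26%
        ("high_charge", "tail"): (1.1, 1.1),
    }

    def corrected(step: MotionStep, charge_state: str) -> MotionStep:
        speed_k, angle_k = CORRECTIONS.get((charge_state, step.joint), (1.0, 1.0))
        return MotionStep(step.joint, step.target_angle * angle_k, step.speed * speed_k)

In such a scheme, each personality simply contributes a different correction table, so adding a personality or a charge band does not require changing the motion data itself.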
  • the autonomous mobile body 11 performs a motion specific to each acquired personality on the charging stand 101.
  • FIG. 18 shows an example of the motion of the autonomous mobile body 11 on the charging stand 101 that is specific to each acquired personality.
  • the cute autonomous mobile body 11 performs a motion that includes swinging its head from side to side in rhythm while moving its ears and tail widely on the charging stand 101.
  • when the spoiled autonomous mobile body 11 finds its owner while on the charging stand 101, it executes a motion that includes beckoning by moving one of its front legs up and down.
  • a shy autonomous mobile body 11 performs a motion that includes spreading its front and back legs wide, lowering its head, and lying limp on the charging stand 101.
  • a wild autonomous mobile body 11 may perform motions on the charging stand 101, including quickly moving its front legs and tail up and down to show that it wants to play.
  • the operation control unit 153 controls the ratio of operations based on innate personality and operations based on acquired personality while the autonomous moving body 11 is charging.
  • the operation control unit 153 applies weights to the corrections based on the innate personality and the corrections based on the acquired personality for the motion and posture of the autonomous moving body 11 on the charging stand 101, and controls the weighting values.
  • the corrected target joint angle and joint movement speed Mt for each motion of the autonomous moving body 11 are calculated using the following formula (1):

      Mt = Mn + α × ΔMc + (1 − α) × ΔMa   ... (1)

  • Mn indicates the target joint angle and joint movement speed of the motion of the autonomous mobile body 11 before correction (normal).
  • ΔMc indicates the correction amount of the target joint angle and joint movement speed of the motion of the autonomous mobile body 11 based on the innate personality.
  • ΔMa indicates the correction amount of the target joint angle and joint movement speed of the motion of the autonomous mobile body 11 based on the acquired personality.
  • α is a coefficient (weight) in the range from 0 to 1.
  • the target joint angle Pt of the posture between motions of the autonomous moving body 11 after correction is calculated using the following formula (2):

      Pt = Pn + α × ΔPc + (1 − α) × ΔPa   ... (2)

  • Pn indicates the target joint angle of the posture of the autonomous mobile body 11 before correction (normal).
  • ΔPc indicates the correction amount of the target joint angle of the posture of the autonomous mobile body 11 based on the innate personality.
  • ΔPa indicates the correction amount of the target joint angle of the posture of the autonomous mobile body 11 based on the acquired personality.
  • the coefficient α is set to 1 at the time of shipment and approaches 0 as time passes after the autonomous mobile body 11 starts operating. This makes it possible to change the personality of the autonomous mobile body 11 so that it is initially strongly influenced by its innate personality and, as it lives with its owner longer, becomes more influenced by its acquired personality based on its experience and its relationships with users such as the owner.
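Read together with the definitions above, formulas (1) and (2) blend the innate and acquired correction amounts with the weight α, which decays from 1 toward 0 over the life of the autonomous mobile body 11. The sketch below illustrates the blend; the exponential decay schedule and its time constant are assumptions for illustration only, since the text states only that α starts at 1 and approaches 0 over time.

    import math

    def alpha(days_in_operation: float, time_constant_days: float = 180.0) -> float:
        # Weight of the innate personality: 1 at shipment, approaching 0 over time.
        return math.exp(-days_in_operation / time_constant_days)

    def blended(normal: float, innate_delta: float, acquired_delta: float, a: float) -> float:
        # Formulas (1)/(2): corrected = normal + a * innate + (1 - a) * acquired.
        return normal + a * innate_delta + (1.0 - a) * acquired_delta

    # Example: a target joint angle of 30 degrees, an innate correction of +3
    # degrees, and an acquired correction of -2 degrees, after one year.
    a = alpha(days_in_operation=365.0)
    print(blended(30.0, 3.0, -2.0, a))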
  • the required charging level used to determine whether or not to return to the charging stand 101 may be changed based on, for example, the acquired personality.
  • the cute autonomous moving body 11 has a volatile personality, so its required charging level changes randomly within a predetermined range (for example, within a range of ±5%) each time.
  • the spoiled autonomous mobile body 11's required charging level drops to 20% owing to its spoiled nature, and it continues to play until the limit.
  • a shy autonomous moving body 11, being reserved, has its required charging level rise to 28% and returns to the charging stand 101 early.
  • a wild autonomous mobile body 11 has an active personality, so its required charging level drops to 20% and it continues to play until it reaches its limit. (An illustrative mapping is sketched below.)
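For illustration only, the personality-dependent required charging level could be implemented as below. The 25% baseline used for the cute personality's random fluctuation and for the default case is an assumption; the text gives only the ±5% range and the 20%/28% values for the other personalities.

    import random

    def required_charge_level(personality: str) -> float:
        # Charge percentage below which the robot returns to the charging stand.
        if personality == "cute":
            # Volatile: varies randomly within +/-5% of an assumed 25% baseline.
            return 25.0 + random.uniform(-5.0, 5.0)
        if personality in ("spoiled", "wild"):
            return 20.0   # plays until the limit
        if personality == "shy":
            return 28.0   # returns to the charging stand early
        return 25.0       # assumed default baseline

    def needs_charging(remaining_pct: float, personality: str) -> bool:
        return remaining_pct < required_charge_level(personality)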
  • the actions related to charging the autonomous mobile body 11 are diversified, and the individuality of the autonomous mobile body 11 is expressed. This allows each owner to get a real sense that their autonomous mobile body 11 is different from other autonomous mobile bodies and that it is their own. In addition, by varying the actions of the autonomous mobile body 11 while charging, it is possible to prevent users such as owners from getting bored.
  • when returning (moving) to the charging stand 101, the autonomous mobile body 11 can execute an operation in cooperation with another autonomous mobile body 11 by sharing charging-related information with it.
  • referring to Figs. 19 to 24, examples of processes in which the autonomous mobile body 11 executes an operation in cooperation with another autonomous mobile body 11 when returning to the charging stand 101 will be described.
  • hereinafter, the autonomous mobile body 11 that executes the charging stand return process will be referred to as the autonomous mobile body 11A.
  • in step S101, autonomous operation is performed in the same manner as in step S1 of FIG. 9.
  • in step S102, similarly to step S2 of FIG. 9, it is determined whether charging is necessary. If it is determined that charging is not necessary, the process returns to step S101.
  • thereafter, steps S101 and S102 are repeated until it is determined in step S102 that charging is necessary.
  • if it is determined in step S102 that charging is necessary, the process proceeds to step S103.
  • in step S103, the autonomous mobile body 11A starts returning to the charging stand 101 under the control of the operation control unit 153. In other words, the autonomous mobile body 11A starts moving in the direction of the charging stand 101.
  • in step S104, the recognition unit 151 determines whether or not the autonomous mobile body 11A has returned to the charging stand 101 based on the sensor data from the input unit 121. If it is determined that it has not returned to the charging stand 101, the process proceeds to step S105.
  • in step S105, the operation control unit 153 determines whether or not there is enough charge remaining to enable movement. Specifically, the operation control unit 153 detects the remaining charge of the autonomous mobile body 11A based on the remaining charge data from the power supply control unit 136. If the remaining charge is equal to or greater than the operable level, the operation control unit 153 determines that there is enough charge remaining to enable movement, and the process returns to step S104.
  • the operable level can be set to any value.
  • the operable level may be changed based on, for example, the distance to the charging stand 101.
  • thereafter, steps S104 and S105 are repeated until it is determined in step S104 that the autonomous mobile body 11A has returned to the charging stand 101, or until it is determined in step S105 that there is not enough charge left to move.
  • on the other hand, if the remaining charge is less than the operable level in step S105, the operation control unit 153 determines that there is not enough charge left to move, i.e., that it is difficult to return (move) to the charging stand 101, and the process proceeds to step S106.
  • in step S106, the operation control unit 153 notifies the surrounding autonomous mobile bodies 11 of the insufficient remaining charge. Specifically, the operation control unit 153 generates information for reporting the insufficient remaining charge (hereinafter referred to as insufficient remaining charge information) and transmits it to the surrounding autonomous mobile bodies 11 via the wireless communication module 135.
  • in step S107, the autonomous mobile body 11A waits in a prone position under the control of the operation control unit 153.
  • if it is determined in step S104 that the autonomous mobile body 11A has returned to the charging stand 101, steps S105 to S107 are skipped and the charging stand return process ends. (This flow is sketched below.)
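A compact sketch of the FIG. 19 flow (steps S101 to S107) follows. Every method on robot is a hypothetical stand-in for the recognition unit 151, the operation control unit 153, and the power supply control unit 136; this is a reading aid, not the implementation.

    def charging_stand_return(robot) -> None:
        while not robot.needs_charging():                 # S101/S102
            robot.act_autonomously()
        robot.start_moving_to_charging_stand()            # S103
        while not robot.is_on_charging_stand():           # S104
            if robot.remaining_charge() < robot.operable_level():  # S105
                robot.broadcast_low_charge_notice()       # S106: notify nearby robots
                robot.wait_prone()                        # S107: lie down and wait
                return
        # Reached the charging stand; the return process ends and charging begins.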
  • the other-autonomous-mobile-body support process shown in FIG. 20 is executed when at least one of the autonomous mobile bodies 11A and 11B does not have the ability to share power.
  • in step S131, autonomous operation is performed in the same manner as in step S1 of FIG. 9.
  • in step S132, the recognition unit 151 determines whether or not there is an autonomous mobile body 11 with insufficient remaining charge in the vicinity. If the recognition unit 151 has not received the insufficient remaining charge information transmitted in step S106 of FIG. 19, it determines that there is no autonomous mobile body 11 with insufficient remaining charge in the vicinity, and the process returns to step S131.
  • thereafter, steps S131 and S132 are repeated until it is determined that an autonomous mobile body 11 with insufficient remaining charge is present in the vicinity.
  • on the other hand, if the recognition unit 151 receives insufficient remaining charge information from the autonomous mobile body 11A via the wireless communication module 135 in step S132, it determines that there is an autonomous mobile body 11 (the autonomous mobile body 11A) with insufficient remaining charge in the vicinity, that is, that there is an autonomous mobile body 11 in the vicinity that has difficulty returning (moving) to the charging stand 101 due to insufficient remaining charge, and the process proceeds to step S133.
  • in step S133, the recognition unit 151 determines whether or not the owner is nearby based on sensing data from the input unit 121. If it is determined that the owner is nearby, the process proceeds to step S134.
  • in step S134, the autonomous mobile body 11B moves to the owner's side under the control of the operation control unit 153.
  • on the other hand, if it is determined in step S133 that the owner is not nearby, step S134 is skipped and the process proceeds to step S135.
  • in step S135, the autonomous mobile body 11B turns toward the autonomous mobile body 11 (the autonomous mobile body 11A) that is running low on charge and barks. This notifies the owner or the people around the autonomous mobile body 11B of the presence of the autonomous mobile body 11A, which is running low on charge. (A sketch of this flow follows.)
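The first support process (FIG. 20, steps S131 to S135) can be summarized as below: a nearby robot relays a stranded robot's low-charge notice to people by moving to the owner, if present, and barking toward the stranded robot. Method names are illustrative assumptions.

    def support_without_power_sharing(robot) -> None:
        while True:
            robot.act_autonomously()                      # S131
            notice = robot.receive_low_charge_notice()    # S132
            if notice is None:
                continue                                  # keep operating
            if robot.owner_is_nearby():                   # S133
                robot.move_to_owner()                     # S134
            robot.turn_toward(notice.sender)              # S135
            robot.bark()                                  # alerts the owner or people nearby
            break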
  • the other-autonomous-mobile-body support process shown in FIG. 21 is executed when both the autonomous mobile body 11A and the autonomous mobile body 11B have a function for sharing power. For example, as shown in FIG. 22, it is executed when both the autonomous mobile body 11A and the autonomous mobile body 11B have a wireless power supply unit 302 at the tip of a front leg 301.
  • in steps S161 and S162, the same processing as in steps S131 and S132 in FIG. 20 is performed.
  • in step S163, the operation control unit 153 determines whether or not it is possible to share power based on information from the power supply control unit 136. For example, if the remaining charge is less than a predetermined threshold, the operation control unit 153 determines that it is not possible to share power, and the process proceeds to step S164.
  • in steps S164 through S166, the same processing as in steps S133 through S135 of FIG. 20 is performed.
  • on the other hand, if the remaining charge is equal to or greater than the predetermined threshold in step S163, the operation control unit 153 determines that power can be shared, and the process proceeds to step S167.
  • in step S167, the autonomous mobile body 11B goes to the autonomous mobile body 11 that is running low on charge and shares some of its power with it.
  • specifically, the autonomous mobile body 11B moves to the location of the autonomous mobile body 11A under the control of the operation control unit 153.
  • the autonomous mobile body 11B then places the wireless power supply unit 302B at the tip of its front leg 301B over the wireless power supply unit 302A on the front leg 301A of the autonomous mobile body 11A, and wirelessly supplies power from the wireless power supply unit 302B to the wireless power supply unit 302A. This allows the autonomous mobile body 11A to obtain the power necessary to return to the charging stand 101 and to return there on its own. (This power-sharing flow is sketched below.)
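The second support process (FIG. 21, steps S161 to S167) adds the power-sharing branch. A hedged sketch follows; the 50% donation threshold and the transferred amount are assumptions, since the text speaks only of a predetermined threshold and of enough power to reach the charging stand.

    SHARE_THRESHOLD_PCT = 50.0  # assumed minimum charge for donating power

    def support_with_power_sharing(robot, notice) -> None:
        if robot.remaining_charge() < SHARE_THRESHOLD_PCT:  # S163: cannot share
            if robot.owner_is_nearby():                     # S164
                robot.move_to_owner()                       # S165
            robot.turn_toward(notice.sender)                # S166
            robot.bark()
            return
        robot.move_to(notice.sender_position)               # S167: go to the stranded robot
        robot.align_leg_power_units(notice.sender)          # 302B placed over 302A
        robot.transfer_power(notice.sender, amount_pct=10)  # assumed donation amount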
  • the charging stand return process shown in FIG. 24 is executed when the autonomous mobile body 11A and another autonomous mobile body 11 have a function of sharing the charging stand 101 depending on the remaining charge.
  • in steps S201 to S203, the same processing as in steps S101 to S103 in FIG. 19 is performed.
  • in step S204, the recognition unit 151 determines whether or not there is another autonomous mobile body 11 returning to the same charging stand 101 based on the sensor data from the input unit 121 and information received from other autonomous mobile bodies 11 via the wireless communication module 135. If it is determined that there is no other autonomous mobile body 11 returning to the same charging stand 101, the process proceeds to step S205.
  • in step S205, similarly to step S104 in FIG. 19, it is determined whether or not the autonomous mobile body 11A has returned to the charging stand 101. If it is determined that it has not returned to the charging stand 101, the process returns to step S204.
  • thereafter, steps S204 and S205 are repeated until it is determined in step S204 that there is another autonomous mobile body 11 returning to the same charging stand 101, or until it is determined in step S205 that the autonomous mobile body 11A has returned to the charging stand 101.
  • if it is determined in step S204 that there is another autonomous mobile body 11 returning to the same charging stand 101, the process proceeds to step S206.
  • in step S206, the autonomous mobile body 11A communicates with the other autonomous mobile body 11 to share their remaining charges.
  • specifically, the recognition unit 151 communicates via the wireless communication module 135 with the autonomous mobile body 11 that is returning to the same charging stand 101 (hereinafter referred to as the autonomous mobile body 11C), and the two share information regarding their remaining charges with each other.
  • in step S207, the recognition unit 151 determines whether the other autonomous mobile body 11 (the autonomous mobile body 11C) has a lower remaining charge than the autonomous mobile body 11A itself. That is, the recognition unit 151 compares the remaining charge of the autonomous mobile body 11A with that of the autonomous mobile body 11C, and if it determines that the autonomous mobile body 11C has the lower remaining charge, the process proceeds to step S208.
  • in step S208, the autonomous mobile body 11A, under the control of the operation control unit 153, makes a motion as if yielding the charging stand 101.
  • in step S209, the recognition unit 151 determines whether or not there is another available charging stand 101 based on the sensor data from the input unit 121 and information received from other autonomous mobile bodies 11 via the wireless communication module 135. If it is determined that there is no other available charging stand 101, the process proceeds to step S210.
  • in step S210, the autonomous mobile body 11A waits in a prone position under the control of the operation control unit 153.
  • thereafter, steps S209 and S210 are repeated until it is determined in step S209 that there is another available charging stand 101.
  • if it is determined in step S209 that there is another available charging stand 101, the process returns to step S203, and the subsequent processing is performed. This also includes the case where the charging stand 101 that was yielded to the autonomous mobile body 11C becomes available as a result of the autonomous mobile body 11C finishing charging.
  • on the other hand, if it is determined in step S207 that the other autonomous mobile body 11 (the autonomous mobile body 11C) has a greater remaining charge than the autonomous mobile body 11A itself, the process proceeds to step S211.
  • in step S211, the autonomous mobile body 11A performs a motion expressing gratitude for being yielded the charging stand 101.
  • specifically, the autonomous mobile body 11C performs a motion as if yielding the charging stand 101 by processing similar to that of step S208 described above.
  • in response, the autonomous mobile body 11A, under the control of the operation control unit 153, performs a motion expressing gratitude for the charging stand 101 being yielded to it.
  • note that the autonomous mobile body 11 that is to yield the charging stand 101 may be determined based on criteria other than the remaining charge.
  • for example, an autonomous mobile body 11 with a lower priority may yield the charging stand 101 to an autonomous mobile body 11 with a higher priority.
  • also, for example, an autonomous mobile body 11 that is farther from the charging stand 101 may yield the charging stand 101 to an autonomous mobile body 11 that is closer to it.
  • if it is determined in step S205 that the autonomous mobile body 11A has returned to the charging stand 101, the charging stand return process ends. (The stand-sharing part of this flow is sketched below.)
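The stand-sharing part of the FIG. 24 flow (steps S204 to S211) reduces to a comparison of remaining charges: the robot with more charge yields and looks for another stand. A sketch under the same hypothetical robot API as above:

    def negotiate_charging_stand(robot) -> None:
        other = robot.find_robot_heading_to_same_stand()       # S204
        if other is None:
            return                                             # proceed to the stand as usual
        mine, theirs = robot.exchange_remaining_charge(other)  # S206
        if theirs < mine:                                      # S207: the other robot is lower
            robot.motion_yield_stand()                         # S208
            while robot.find_free_charging_stand() is None:    # S209
                robot.wait_prone()                             # S210
            robot.start_moving_to_charging_stand()             # back to S203
        else:
            robot.motion_thank_for_stand()                     # S211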
  • in this way, when the autonomous mobile body 11 returns to the charging stand 101, its operation changes in response to the remaining charge and the state of other autonomous mobile bodies 11, which makes the autonomous mobile body 11 seem alive.
  • furthermore, when the autonomous mobile body 11 cooperates with or supports other autonomous mobile bodies 11, it comes to be perceived as intelligent.
  • the classification of the personality of the autonomous mobile body 11 can be changed as appropriate.
  • the types of personality of the autonomous mobile body 11 can be increased or decreased.
  • also, for example, a level (e.g., high, medium, low) may be set for each personality.
  • the innate personality of the autonomous mobile body 11 can be made common to all individuals, and only the acquired personality can be changed for each individual.
  • the external personality of the autonomous moving body 11 may be changed in a similar manner.
  • the appearance of the autonomous moving body 11 may be changed in accordance with a change in the internal personality.
  • the operation of the autonomous moving body 11 while charging can be changed as appropriate.
  • the types of motions that the autonomous moving body 11 performs while charging can be increased or decreased.
  • the method of correcting each motion and the amount of correction can be changed as appropriate based on the characteristics of the autonomous moving body 11 and the remaining charge.
  • the form and charging method of the charging device that charges the autonomous moving body 11 can be changed as appropriate.
  • the charging method for the autonomous moving body 11 may be either wireless or wired.
  • the information processing terminal 12 or the information processing server 13 may execute part of the processing of the autonomous mobile body 11 described above.
  • the information processing terminal 12 or the information processing server 13 may execute all or part of the processing of the main control unit 122 of the autonomous mobile body 11 to remotely control the autonomous mobile body 11.
  • the information processing terminal 12 or the information processing server 13 may control the operation of the autonomous mobile body 11 during charging based on the personality and remaining charge of the autonomous mobile body 11.
  • the information processing terminal 12 or the information processing server 13 may learn the acquired personality of the autonomous mobile body 11.
  • this technology can also be applied to entertainment robots, such as pet-type robots, that can express the individuality of each individual unit.
  • FIG. 25 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes using a program.
  • in the computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected via a bus 1004.
  • an input/output interface 1005 is further connected to the bus 1004. Connected to the input/output interface 1005 are an input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010.
  • the input unit 1006 includes an input switch, a button, a microphone, an image sensor, etc.
  • the output unit 1007 includes a display, a speaker, etc.
  • the storage unit 1008 includes a hard disk, a non-volatile memory, etc.
  • the communication unit 1009 includes a network interface, etc.
  • the drive 1010 drives removable media 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • the CPU 1001 loads a program recorded in the storage unit 1008, for example, into the RAM 1003 via the input/output interface 1005 and the bus 1004, and executes the program, thereby performing the above-mentioned series of processes.
  • the program executed by the computer 1000 can be provided by being recorded on a removable medium 1011 such as a package medium, for example.
  • the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 1008 via the input/output interface 1005 by inserting the removable medium 1011 into the drive 1010.
  • the program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008.
  • the program can be pre-installed in the ROM 1002 or storage unit 1008.
  • the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
  • a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in a single housing, are both systems.
  • this technology can be configured as cloud computing, in which a single function is shared and processed collaboratively by multiple devices over a network.
  • each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
  • furthermore, when a single step includes multiple processes, the processes included in that single step can be executed by a single device or shared and executed by multiple devices.
  • a control device comprising: an operation control unit that controls an operation of an autonomous moving body during charging based on an individuality of the autonomous moving body and a remaining charge amount of the autonomous moving body.
  • the control device according to (1) further comprising a learning unit that changes a personality of the autonomous moving body based on an experience of the autonomous moving body and an interaction with a user.
  • the personality of the autonomous moving body includes an innate personality and an acquired personality, The control device according to (2), wherein the learning unit changes the acquired personality based on an experience of the autonomous moving body and an interaction with a user.
  • the operation control unit controls a ratio of the operation based on the innate personality and the operation based on the acquired personality during charging of the autonomous moving body.
  • 1 Information processing system, 11-1 to 11-n Autonomous mobile body, 12-1 to 12-n Information processing terminal, 13 Information processing server, 51L, 51R Display, 71 Actuator, 101 Charging stand, 121 Input unit, 122 Main control unit, 123 Memory, 124 Pupil display control unit, 125 Mouth drive control unit, 126 Mouth drive unit, 127 Neck drive control unit, 128 Neck drive unit, 129 Leg drive control unit, 130 Leg drive unit, 131 Tail drive control unit, 132 Tail drive unit, 133 Voice control unit, 134 Speaker, 135 Wireless communication module, 136 Power supply control unit, 137 Rechargeable battery, 151 Recognition unit, 152 Learning unit, 153 Operation control unit

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Toys (AREA)

Abstract

The present technology pertains to a control device and a control method capable of diversifying operations related to charging of an autonomous mobile body. The control device is provided with an operation control unit that controls the operation of the autonomous mobile body during charging, on the basis of the individuality and the remaining charge amount of the autonomous mobile body. The present technology can be applied to, for example, a control device that controls the operation of a pet-type robot.

Description

Control device and control method

This technology relates to a control device and a control method, and in particular to a control device and a control method that enable diversification of operations related to charging an autonomous mobile object.

Conventionally, robots have been proposed that perform specific actions depending on the charge level while charging (see, for example, Patent Document 1).

International Publication No. 2000/038295

However, the robot described in Patent Document 1 depends only on the charge level for its choice of actions, and can only select from pre-prepared actions. Therefore, the range of actions that the robot described in Patent Document 1 can perform while charging is limited, making it difficult to differentiate it from other robots.

This technology was developed in light of these circumstances, and makes it possible to diversify the charging-related operations of autonomous mobile objects such as pet robots.

The control device of one aspect of the present technology includes an operation control unit that controls the operation of the autonomous mobile body during charging based on the personality of the autonomous mobile body and the remaining charge.

In the control method of one aspect of the present technology, a control device controls the operation of an autonomous moving body during charging based on the autonomous moving body's personality and remaining charge.

In one aspect of this technology, the operation of an autonomous mobile body during charging is controlled based on the personality of the autonomous mobile body and the remaining charge.

FIG. 1 is a block diagram showing an embodiment of an information processing system to which the present technology is applied.
FIG. 2 is a diagram illustrating an example of the hardware configuration of an autonomous moving body.
FIG. 3 illustrates a configuration example of the actuators provided in an autonomous moving body.
FIG. 4 is a diagram for explaining the functions of the displays provided in an autonomous moving body.
FIG. 5 is a diagram illustrating an example of the operation of an autonomous moving body.
FIG. 6 schematically shows an autonomous moving body during charging.
FIG. 7 is a block diagram showing an example of the functional configuration of an autonomous moving body and a charging stand.
FIG. 8 is a block diagram showing an example of the functional configuration of the main control unit of an autonomous moving body.
FIG. 9 is a flowchart for explaining the charging handling process of an autonomous moving body.
FIG. 10 is a table showing examples of motions that an autonomous moving body executes on the charging stand.
FIG. 11 is a table showing an example of the characteristics of an autonomous moving body's operation on the charging stand with respect to the remaining charge and the innate personality.
FIG. 12 shows an example of a method for correcting the motion of an autonomous moving body for each innate personality when the remaining charge is less than 10%.
FIG. 13 shows an example of a method for correcting the motion of an autonomous moving body for each innate personality when the remaining charge is 26% or more.
FIG. 14 shows examples of motions of an autonomous moving body on the charging stand that are specific to each innate personality.
FIG. 15 is a table showing an example of the characteristics of an autonomous moving body's operation on the charging stand with respect to the remaining charge and the acquired personality.
FIG. 16 shows an example of a method for correcting the motion of an autonomous moving body for each acquired personality when the remaining charge is less than 10%.
FIG. 17 shows an example of a method for correcting the motion of an autonomous moving body for each acquired personality when the remaining charge is 26% or more.
FIG. 18 shows examples of motions of an autonomous moving body on the charging stand that are specific to each acquired personality.
FIG. 19 is a flowchart for explaining a first embodiment of the charging stand return process of an autonomous moving body.
FIG. 20 is a flowchart for explaining a first embodiment of the other-autonomous-moving-body support process of an autonomous moving body.
FIG. 21 is a flowchart for explaining a second embodiment of the other-autonomous-moving-body support process of an autonomous moving body.
FIG. 22 is a diagram schematically illustrating the wireless power supply unit of an autonomous moving body.
FIG. 23 is a diagram schematically illustrating an example of a method of supplying power between autonomous moving bodies.
FIG. 24 is a flowchart for explaining a second embodiment of the charging stand return process of an autonomous moving body.
FIG. 25 is a diagram illustrating an example of the configuration of a computer.

Hereinafter, an embodiment of the present technology will be described. The description will be given in the following order.

1. Embodiment
2. Modifications
3. Others

<<1. Embodiment>>

An embodiment of the present technology will be described with reference to FIG. 1 to FIG. 24.

<Configuration example of information processing system 1>

FIG. 1 is a block diagram showing an embodiment of an information processing system 1 to which the present technology is applied.

The information processing system 1 includes autonomous mobile bodies 11-1 to 11-n, information processing terminals 12-1 to 12-n, and an information processing server 13.

Note that, hereinafter, when there is no need to distinguish between the autonomous mobile bodies 11-1 to 11-n, they are simply referred to as the autonomous mobile body 11. Likewise, when there is no need to distinguish between the information processing terminals 12-1 to 12-n, they are simply referred to as the information processing terminal 12.

Communication via the network 21 is possible between each autonomous mobile body 11 and the information processing server 13, between each information processing terminal 12 and the information processing server 13, between each autonomous mobile body 11 and each information processing terminal 12, between the autonomous mobile bodies 11, and between the information processing terminals 12. In addition, direct communication, without going through the network 21, is also possible between each autonomous mobile body 11 and each information processing terminal 12, between the autonomous mobile bodies 11, and between the information processing terminals 12.

The autonomous mobile body 11 is an information processing device that recognizes its own and its surrounding situations based on collected sensor data and the like, and autonomously selects and executes various actions according to the situation. Unlike a robot that simply performs actions according to the user's instructions, one of the features of the autonomous mobile body 11 is that it autonomously executes appropriate actions according to the situation.

The autonomous mobile body 11 can, for example, perform user recognition and object recognition based on captured images, and perform various autonomous actions according to the recognized user, object, and so on. The autonomous mobile body 11 can also, for example, perform voice recognition based on the user's speech and act based on the user's instructions.

Furthermore, the autonomous mobile body 11 performs pattern recognition learning to acquire the ability to recognize users and objects. In this case, the autonomous mobile body 11 can perform pattern recognition learning relating to objects and the like not only by supervised learning based on given learning data, but also by dynamically collecting learning data based on instruction from a user or the like.

The autonomous mobile body 11 can also be disciplined by the user. Here, discipline of the autonomous mobile body 11 is broader than general discipline, such as teaching it rules and prohibitions and having it memorize them, and refers to changes in the autonomous mobile body 11 that the user can sense as a result of the user's interaction with it.

The shape, capabilities, desires, and other attributes of the autonomous mobile body 11 can be designed appropriately according to its purpose and role. For example, the autonomous mobile body 11 is an autonomous mobile robot that moves autonomously within a space and performs various operations. Specifically, for example, the autonomous mobile body 11 is an autonomous mobile robot that has a shape and operational capabilities mimicking those of an animal such as a human or a dog. Alternatively, for example, the autonomous mobile body 11 is a vehicle or another device that has the ability to communicate with a user.

Furthermore, each autonomous mobile body 11 has its own unique personality. The personality of the autonomous mobile body 11 is, for example, its internal personality, and is expressed by its behavior. In other words, the personality of the autonomous mobile body 11 includes characteristics (e.g., character, disposition) and features expressed by its behavior. Here, the behavior of the autonomous mobile body 11 is expressed not only by the movement of each of its parts, but also by facial expressions, voice, and the like. In other words, the behavior of the autonomous mobile body 11 includes visual, auditory, and tactile movements and changes that are manifested externally.

Furthermore, the personality of the autonomous mobile body 11 includes an innate personality and an acquired personality.

The innate personality is a personality that is set before the autonomous mobile body 11 starts operating, and is represented by, for example, various parameters that are set in advance before the autonomous mobile body 11 is shipped. For example, one of multiple types of innate personality is set in advance for each autonomous mobile body 11.

The acquired personality is a personality acquired through experiences after the autonomous mobile body 11 is put into operation and through interactions with the user, such as training, and is represented, for example, by various parameters that are set or changed after the autonomous mobile body 11 is put into operation. For example, multiple autonomous mobile bodies 11 with the same innate personality will come to have different personalities due to their acquired personalities.

The information processing terminal 12 may be, for example, a smartphone, a tablet terminal, or a PC (personal computer), and is used by the user of the autonomous mobile body 11. The information processing terminal 12 executes a specific application program (hereinafter simply referred to as an application) to realize various functions. For example, the information processing terminal 12 manages and customizes the autonomous mobile body 11 by executing a specific application.

For example, the information processing terminal 12 communicates with the information processing server 13 via the network 21 or directly with the autonomous mobile body 11 to collect various data related to the autonomous mobile body 11, present the data to the user, or give instructions to the autonomous mobile body 11.

The information processing server 13, for example, collects various data from each autonomous mobile body 11 and each information processing terminal 12, provides various data to each autonomous mobile body 11 and each information processing terminal 12, and controls the operation of each autonomous mobile body 11. For example, the information processing server 13 can also perform pattern recognition learning and processing corresponding to user discipline, similarly to the autonomous mobile body 11, based on the data collected from each autonomous mobile body 11 and each information processing terminal 12. Furthermore, for example, the information processing server 13 supplies the above-mentioned applications and various data related to each autonomous mobile body 11 to each information processing terminal 12.

The network 21 may be composed of, for example, public line networks such as the Internet, telephone line networks, and satellite communication networks, as well as various LANs (Local Area Networks) including Ethernet (registered trademark), and WANs (Wide Area Networks). The network 21 may also include dedicated line networks such as IP-VPN (Internet Protocol-Virtual Private Network). The network 21 may further include wireless communication networks such as Wi-Fi (registered trademark) and Bluetooth (registered trademark).

The configuration of the information processing system 1 can be flexibly changed depending on the specifications, operation, and so on. For example, the autonomous mobile body 11 may communicate information with various external devices in addition to the information processing terminal 12 and the information processing server 13. The above external devices can include, for example, servers that transmit weather, news, and other service information, and various home appliances owned by the user.

Furthermore, for example, the autonomous mobile bodies 11 and the information processing terminals 12 do not necessarily have to have a one-to-one relationship, and may have, for example, a many-to-many, many-to-one, or one-to-many relationship. For example, it is possible for one user to check data related to multiple autonomous mobile bodies 11 using one information processing terminal 12, or to check data related to one autonomous mobile body 11 using multiple information processing terminals 12.

<Hardware Configuration Example of the Autonomous Moving Body 11>

Next, an example of the hardware configuration of the autonomous mobile body 11 will be described. In the following, a case will be described in which the autonomous mobile body 11 is a dog-type quadruped robot.

FIG. 2 is a diagram showing an example of the hardware configuration of the autonomous mobile body 11. The autonomous mobile body 11 is a dog-type quadruped robot having a head, a body, four legs, and a tail.

The autonomous mobile body 11 is equipped with two displays, a display 51L and a display 51R, on its head. In the following, when there is no need to distinguish between the display 51L and the display 51R, they will simply be referred to as the display 51.

The autonomous mobile body 11 also includes various sensors. For example, the autonomous mobile body 11 includes microphones 52, cameras 53, a ToF (Time Of Flight) sensor 54, a human presence sensor 55, a distance measurement sensor 56, touch sensors 57, an illuminance sensor 58, sole buttons 59, and inertial sensors 60.

The autonomous mobile body 11 is equipped with, for example, four microphones 52 on its head. Each microphone 52 collects surrounding sounds, including the user's speech and the surrounding environmental sounds. Providing multiple microphones 52 also makes it possible to collect sounds generated in the surroundings with high sensitivity and to localize sound sources.

The autonomous mobile body 11 is equipped with two wide-angle cameras 53, for example, at the tip of the nose and at the waist, which capture images of the surroundings of the autonomous mobile body 11. For example, the camera 53 located at the tip of the nose captures images within the forward field of view of the autonomous mobile body 11 (i.e., the dog's field of view). The camera 53 located at the waist captures images of the surroundings centered above the autonomous mobile body 11. The autonomous mobile body 11 can, for example, extract feature points of the ceiling based on images captured by the camera 53 located at the waist, and realize SLAM (Simultaneous Localization and Mapping).

The ToF sensor 54 is provided, for example, at the tip of the nose, and detects the distance to an object in front of the head. The ToF sensor 54 enables the autonomous mobile body 11 to detect the distance to various objects with high accuracy, and to realize operations according to its relative position to objects, including the user, and obstacles.

The human presence sensor 55 is placed, for example, on the chest and detects the presence of the user or a pet kept by the user. By detecting an animal object in front of it with the human presence sensor 55, the autonomous mobile body 11 can realize various actions toward that animal object, for example, actions corresponding to emotions such as interest, fear, or surprise.

The distance measurement sensor 56 is placed, for example, on the chest and detects the condition of the floor surface in front of the autonomous mobile body 11. The distance measurement sensor 56 allows the autonomous mobile body 11 to accurately detect the distance to an object present on the floor surface in front of it, and to realize operations according to the relative position of that object.

The touch sensors 57 are arranged in areas where the user is likely to touch the autonomous mobile body 11, such as the top of the head, under the chin, and on the back, and detect contact (touch) by the user. The touch sensors 57 are composed of, for example, capacitive or pressure-sensitive touch sensors. The autonomous mobile body 11 can detect contact actions by the user, such as touching, stroking, tapping, and pushing, using the touch sensors 57, and can perform an action according to the contact action. In addition, by arranging the touch sensors 57 in a line or a plane on each part, it becomes possible to detect the position touched within each part.

The illuminance sensor 58 is located, for example, on the back of the head at the base of the tail, and detects the illuminance of the space in which the autonomous mobile body 11 is located. The autonomous mobile body 11 can detect the surrounding brightness using the illuminance sensor 58 and perform operations according to the brightness.

The sole buttons 59 are, for example, located in the areas corresponding to the paw pads of the four legs, and detect whether the bottom surfaces of the legs of the autonomous mobile body 11 are in contact with the floor. The sole buttons 59 enable the autonomous mobile body 11 to detect contact or non-contact with the floor surface and, for example, to know when it has been picked up by a user.

The inertial sensors 60 are, for example, disposed in the head and the torso, respectively, and detect physical quantities such as the speed, acceleration, and rotation of the head and torso. For example, each inertial sensor 60 is a six-axis sensor that detects acceleration and angular velocity on the X-axis, Y-axis, and Z-axis. The autonomous mobile body 11 can detect the movement of the head and torso with high accuracy using the inertial sensors 60, and realize operation control according to the situation.

Note that the configuration of the sensors provided in the autonomous mobile body 11 can be flexibly changed depending on the specifications, operation, and so on. For example, in addition to the above configuration, the autonomous mobile body 11 may further include a temperature sensor, a geomagnetic sensor, and various communication devices including a GNSS (Global Navigation Satellite System) signal receiver.

Next, an example of the configuration of the joints of the autonomous mobile body 11 will be described with reference to FIG. 3. FIG. 3 shows an example of the configuration of the actuators 71 provided in the autonomous mobile body 11. In addition to the rotation points shown in FIG. 3, the autonomous mobile body 11 has two degrees of rotational freedom each in the ears and the tail, and one in the mouth, for a total of 22 degrees of rotational freedom.

For example, the autonomous mobile body 11 has three degrees of freedom in its head, allowing it to both nod and tilt its head. In addition, the autonomous mobile body 11 can reproduce the swinging motion of the hips using the actuator 71 provided in its waist, achieving natural and flexible movements closer to those of a real dog.

Note that the autonomous mobile body 11 may achieve the above 22 degrees of rotational freedom by combining, for example, single-axis actuators and two-axis actuators. For example, single-axis actuators may be used in the elbows and knees of the legs, and two-axis actuators in the shoulders and the bases of the thighs.

Next, the functions of the displays 51 provided in the autonomous mobile body 11 will be described with reference to FIG. 4.

The autonomous mobile body 11 is equipped with two displays, 51R and 51L, which correspond to the right and left eyes, respectively. Each display 51 has a function to visually express the eye movements and emotions of the autonomous mobile body 11. For example, each display 51 can express the movements of the eyeball, pupil, and eyelids according to emotions and actions, thereby producing natural expressions and movements similar to those of a real animal such as a dog, and can express the line of sight and emotions of the autonomous mobile body 11 with high precision and flexibility. In addition, the user can intuitively grasp the state of the autonomous mobile body 11 from the eyeball movements displayed on the displays 51.

The displays 51 are realized, for example, by two independent OLEDs (Organic Light Emitting Diodes). Using OLEDs makes it possible to reproduce the curved surface of an eyeball. As a result, a more natural appearance can be realized compared to the case where a pair of eyeballs is represented by a single flat display or where the two eyeballs are each represented by an independent flat display.

With the above configuration, the autonomous mobile body 11 can control the movements of its joints and eyeballs with high precision and flexibility, as shown in FIG. 5, and thereby reproduce movements and emotional expressions closer to those of a real living creature.

Also, as shown in FIG. 6, the autonomous mobile body 11 is charged while placed on the platform-shaped charging stand 101. As described later, the autonomous mobile body 11 performs operations on the charging stand 101 based on its personality and remaining charge.

Note that FIGS. 5 and 6 show a simplified external structure of the autonomous mobile body 11.

<Example of Functional Configuration of Autonomous Moving Body 11 and Charging Stand 101>

Next, an example of the functional configuration of the autonomous mobile body 11 and the charging stand 101 will be described with reference to FIG. 7. Note that FIG. 7 shows only the configuration necessary for the processing described later, and the description of configuration unnecessary for that processing is omitted as appropriate.

The autonomous mobile body 11 includes the display 51, an input unit 121, a main control unit 122, a memory 123, a pupil display control unit 124, a mouth drive control unit 125, a mouth drive unit 126, a neck drive control unit 127, a neck drive unit 128, a leg drive control unit 129, a leg drive unit 130, a tail drive control unit 131, a tail drive unit 132, a voice control unit 133, a speaker 134, a wireless communication module 135, a power supply control unit 136, and a rechargeable battery 137.

The charging stand 101 includes a charging control unit 201, a charging circuit 202, and a display unit 203.

The autonomous mobile body 11 and the charging stand 101 are connected via a connection connector 102.

The input unit 121 includes the above-described microphones 52, cameras 53, ToF sensor 54, human presence sensor 55, distance measurement sensor 56, touch sensors 57, illuminance sensor 58, sole buttons 59, and inertial sensors 60, and has the function of collecting various sensor data related to the user and the surrounding conditions. The input unit 121 also includes input devices such as switches and buttons. The input unit 121 supplies the collected sensor data and the input data input via the input devices to the main control unit 122.

The main control unit 122 includes a processor such as a CPU (Central Processing Unit), performs various types of information processing, and controls each part of the autonomous mobile body 11.

 例えば、メインコントロール部122は、自律移動体11の各部から供給されるデータに基づいて、自律移動体11が置かれている状況の認識を行う。 For example, the main control unit 122 recognizes the situation in which the autonomous mobile body 11 is located based on data supplied from each part of the autonomous mobile body 11.

 例えば、メインコントロール部122は、自律移動体11が置かれている状況の認識結果等に基づいて、瞳ディスプレイ制御部124、口駆動制御部125、首駆動制御部127、脚駆動制御部129、尻尾駆動制御部131、及び、音声制御部133を制御することにより、自律移動体11の動作を制御する。 For example, the main control unit 122 controls the operation of the autonomous mobile body 11 by controlling the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the voice control unit 133 based on the recognition results of the situation in which the autonomous mobile body 11 is placed.

 例えば、メインコントロール部122は、自律移動体11の各部から供給されるデータに基づいて、自律移動体11の行動や周囲の状況の認識等に関する学習処理を実行する。 For example, the main control unit 122 executes learning processes related to the behavior of the autonomous mobile body 11 and recognition of the surrounding conditions, etc., based on data supplied from each part of the autonomous mobile body 11.

 例えば、メインコントロール部122は、電源制御部136を制御することにより、自律移動体11の電源を制御する。 For example, the main control unit 122 controls the power supply of the autonomous moving body 11 by controlling the power supply control unit 136.

 メモリ123は、例えば、不揮発性メモリ及び揮発性メモリを備え、各種のプログラム及びデータを記憶する。 Memory 123 includes, for example, non-volatile memory and volatile memory, and stores various programs and data.

 瞳ディスプレイ制御部124は、ディスプレイ51を制御して、ディスプレイ51に表示される左眼及び右眼の動きを制御する。 The pupil display control unit 124 controls the display 51 to control the movement of the left and right eyes displayed on the display 51.

 口駆動制御部125は、口駆動部126を制御することにより、自律移動体11の口の動きを制御する。口駆動制御部125は、口駆動部126が備えるアクチュエータ71の動作角や動作速度等を示す駆動データ(以下、口駆動データと称する)をメインコントロール部122に供給する。 The mouth drive control unit 125 controls the mouth movement of the autonomous mobile body 11 by controlling the mouth drive unit 126. The mouth drive control unit 125 supplies drive data (hereinafter referred to as mouth drive data) indicating the movement angle, movement speed, etc. of the actuator 71 provided in the mouth drive unit 126 to the main control unit 122.

 口駆動部126は、自律移動体11の口を駆動するアクチュエータ71を備える。 The mouth drive unit 126 includes an actuator 71 that drives the mouth of the autonomous mobile body 11.

 なお、瞳ディスプレイ制御部124及び口駆動制御部125により、自律移動体11の眼及び口の動きが制御されることにより、自律移動体11の表情が変化する。 The pupil display control unit 124 and the mouth drive control unit 125 control the movement of the eyes and mouth of the autonomous mobile body 11, thereby changing the facial expression of the autonomous mobile body 11.

 首駆動制御部127は、首駆動部128を制御することにより、自律移動体11の首の動きを制御する。首駆動制御部127は、首駆動部128が備えるアクチュエータ71の動作角や動作速度等を示す駆動データ(以下、首駆動データと称する)をメインコントロール部122に供給する。 The neck drive control unit 127 controls the neck movement of the autonomous mobile body 11 by controlling the neck drive unit 128. The neck drive control unit 127 supplies drive data (hereinafter referred to as neck drive data) indicating the movement angle, movement speed, etc. of the actuator 71 provided in the neck drive unit 128 to the main control unit 122.

 首駆動部128は、自律移動体11の首の関節部を駆動するアクチュエータ71を備える。 The neck drive unit 128 includes an actuator 71 that drives the neck joint of the autonomous mobile body 11.

 脚駆動制御部129は、脚駆動部130を制御することにより、自律移動体11の各脚の動きを制御する。脚駆動制御部129は、脚駆動部130が備えるアクチュエータ71の動作角や動作速度等を示す駆動データ(以下、脚駆動データと称する)をメインコントロール部122に供給する。 The leg drive control unit 129 controls the movement of each leg of the autonomous mobile body 11 by controlling the leg drive unit 130. The leg drive control unit 129 supplies drive data (hereinafter referred to as leg drive data) indicating the movement angle, movement speed, etc. of the actuator 71 provided in the leg drive unit 130 to the main control unit 122.

 脚駆動部130は、自律移動体11の各脚の関節部を駆動するアクチュエータ71を備える。 The leg drive unit 130 includes actuators 71 that drive the joints of each leg of the autonomous mobile body 11.

 尻尾駆動制御部131は、尻尾駆動部132を制御することにより、自律移動体11の尻尾の動きを制御する。尻尾駆動制御部131は、尻尾駆動部132が備えるアクチュエータ71の動作角や動作速度等を示す駆動データ(以下、尻尾駆動データと称する)をメインコントロール部122に供給する。 The tail drive control unit 131 controls the tail movement of the autonomous mobile body 11 by controlling the tail drive unit 132. The tail drive control unit 131 supplies drive data (hereinafter referred to as tail drive data) indicating the operating angle, operating speed, etc. of the actuator 71 equipped in the tail drive unit 132 to the main control unit 122.

 尻尾駆動部132は、自律移動体11の尻尾を駆動するアクチュエータ71を備える。 The tail drive unit 132 includes an actuator 71 that drives the tail of the autonomous mobile body 11.

 音声制御部133は、自律移動体11が出力する音声に対応する音声データの生成及び加工、並びに、音声の特性及び出力タイミングの制御を実行する。 The audio control unit 133 generates and processes audio data corresponding to the audio output by the autonomous mobile body 11, and controls the characteristics and output timing of the audio.

 自律移動体11が出力する音声は、例えば、自律移動体11がユーザとコミュニケーションをとったり、状態や感情を表したりするための音声、自律移動体11の動作に伴う動作音、及び、自律移動体11の演出を高めるための演出音を含む。自律移動体11がユーザとコミュニケーションをとったり、状態や感情を表したりするための音声は、例えば、鳴き声、会話音、寝言等を含む。動作音は、例えば、鳴き声、足音等を含む。演出音は、例えば、効果音、音楽等を含む。 The sounds output by the autonomous mobile body 11 include, for example, sounds by which the autonomous mobile body 11 communicates with the user and expresses its state or emotion, operation sounds accompanying the operation of the autonomous mobile body 11, and performance sounds for enhancing the performance of the autonomous mobile body 11. The sounds by which the autonomous mobile body 11 communicates with the user and expresses its state or emotion include, for example, cries, conversation sounds, talking in its sleep, etc. Operation sounds include, for example, cries, footsteps, etc. Performance sounds include, for example, sound effects, music, etc.

 また、自律移動体11が出力する音声は、例えば、外部からの刺激に反応して出力又は変化する音声(以下、刺激反応音と称する)、及び、自律移動体11の動作に合わせて(連動して)出力又は変化する音声を含む。刺激反応音は、例えば、鳴き声、会話音、寝言等を含む。自律移動体11の動作に合わせて出力又は変化する音声は、例えば、動作音及び演出音を含む。 The sounds output by the autonomous mobile body 11 include, for example, sounds that are output or change in response to external stimuli (hereinafter referred to as stimulus-responsive sounds), and sounds that are output or change in accordance with (linked to) the operation of the autonomous mobile body 11. Stimulus-responsive sounds include, for example, cries, conversation sounds, talking in one's sleep, etc. Sounds that are output or change in accordance with the operation of the autonomous mobile body 11 include, for example, operation sounds and performance sounds.

 制御対象となる音声の特性は、例えば、音声の種類(例えば、鳴き声、会話音等)、内容、特徴(例えば、音程、大きさ、音色等)、音質を含む。音声の内容は、例えば、会話音の場合、会話の内容を含む。 The characteristics of the sound to be controlled include, for example, the type of sound (e.g., bird cry, conversation, etc.), content, characteristics (e.g., pitch, volume, timbre, etc.), and sound quality. For example, in the case of conversation, the content of the sound includes the content of the conversation.

 スピーカ134は、音声制御部133から供給される音声データに基づいて、上述した各種の音声を出力する。 The speaker 134 outputs the various sounds described above based on the audio data supplied from the audio control unit 133.

 無線通信モジュール135は、ネットワーク21を介して、又は、ネットワーク21を介さずに、他の自律移動体11、情報処理端末12、及び、情報処理サーバ13と通信を行い、各種のデータの送受信を行う。無線通信モジュール135は、受信したデータをメインコントロール部122に供給し、送信するデータをメインコントロール部122から取得する。 The wireless communication module 135 communicates with other autonomous mobile bodies 11, the information processing terminal 12, and the information processing server 13, either via the network 21 or without the network 21, and transmits and receives various types of data. The wireless communication module 135 supplies the received data to the main control unit 122, and obtains data to be transmitted from the main control unit 122.

 なお、無線通信モジュール135の通信方式は、特に限定されず、仕様や運用に応じて柔軟に変更することが可能である。 The communication method of the wireless communication module 135 is not particularly limited and can be flexibly changed according to the specifications and operation.

 電源制御部136は、充電池137に蓄積されている電力の自律移動体11の各部への供給を制御する。電源制御部136は、充電池の充電残量を検出し、検出結果を示す充電残量データをメインコントロール部122に供給する。 The power supply control unit 136 controls the supply of power stored in the rechargeable battery 137 to each unit of the autonomous mobile body 11. The power supply control unit 136 detects the remaining charge of the rechargeable battery and supplies remaining charge data indicating the detection result to the main control unit 122.

 充電制御部201は、充電回路202による自律移動体11の充電池137の充電を制御する。 The charging control unit 201 controls the charging of the rechargeable battery 137 of the autonomous moving body 11 by the charging circuit 202.

 充電回路202は、充電制御部201の制御の下に、自律移動体11の充電池137を充電する。 The charging circuit 202 charges the rechargeable battery 137 of the autonomous mobile unit 11 under the control of the charging control unit 201.

 表示部203は、例えば、LED(Light Emitting Diode)等を備え、自律移動体11の充電池137の充電の状態等を表示する。 The display unit 203 includes, for example, an LED (Light Emitting Diode) and displays the charging status of the rechargeable battery 137 of the autonomous mobile unit 11.

 <Example of Configuration of Main Control Unit 122>
 FIG. 8 shows an example of the functional configuration of the main control unit 122 in FIG. 7. The main control unit 122 includes a recognition unit 151, a learning unit 152, and an operation control unit 153.

 The recognition unit 151 recognizes the situation in which the autonomous mobile body 11 is placed based on the sensor data and input data supplied from the input unit 121, the received data supplied from the wireless communication module 135, the mouth drive data supplied from the mouth drive unit 126, the neck drive data supplied from the neck drive control unit 127, the leg drive data supplied from the leg drive control unit 129, and the tail drive data supplied from the tail drive control unit 131.

 The situation in which the autonomous mobile body 11 is placed includes, for example, its own situation and the surrounding situation. Its own situation includes, for example, the state and movement of the autonomous mobile body 11. The surrounding situation includes, for example, the states, movements, and instructions of surrounding people such as the user, the states and movements of surrounding creatures such as pets, the states and movements of surrounding objects, the time, the place, and the surrounding environment. The surrounding objects include, for example, other autonomous mobile bodies. In order to recognize the situation, the recognition unit 151 performs, for example, person identification, recognition of facial expressions and lines of sight, emotion recognition, object recognition, action recognition, spatial region recognition, color recognition, shape recognition, marker recognition, obstacle recognition, step recognition, brightness recognition, temperature recognition, voice recognition, word understanding, position estimation, and posture estimation.

 The recognition unit 151 also has a function of estimating and understanding the situation based on the various kinds of recognized information. For example, the recognition unit 151 recognizes a stimulus given to the autonomous mobile body 11 from the outside and the party that gave the stimulus. The stimuli to be recognized include, for example, visual stimuli, auditory stimuli, and tactile stimuli. At this time, the recognition unit 151 may comprehensively estimate the situation using knowledge stored in advance.

 The recognition unit 151 supplies data indicating the recognition result or estimation result of the situation (hereinafter referred to as situation data) to the learning unit 152 and the operation control unit 153. The recognition unit 151 also registers the situation data in the behavior history data stored in the memory 123.

 The behavior history data is data indicating the history of the behavior of the autonomous mobile body 11. The behavior history data includes, for example, the following items: the date and time the behavior started, the date and time the behavior ended, the trigger for executing the behavior, the place where the behavior was instructed (if a place was instructed), the situation at the time of the behavior, and whether the behavior was completed (executed to the end).

 As the trigger for executing the behavior, for example, when the behavior was executed in response to a user instruction, the content of that instruction is registered. When the behavior was executed in response to a predetermined situation occurring, the content of that situation is registered. Furthermore, when the behavior was executed in response to an object indicated by the user or a recognized object, the type of that object is registered.
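 For illustration only, the items listed above could be held in a record such as the following minimal Python sketch; the class and field names are hypothetical and are not part of the disclosed configuration.

```python
# Minimal sketch of one behavior history entry; field names are assumptions.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class BehaviorRecord:
    started_at: datetime             # date and time the behavior started
    ended_at: Optional[datetime]     # date and time the behavior ended
    trigger: str                     # user instruction, situation, or object type
    instructed_place: Optional[str]  # only set when a place was instructed
    situation: str                   # situation data at the time of the behavior
    completed: bool                  # whether the behavior was executed to the end

# Example: a behavior triggered by a user instruction.
record = BehaviorRecord(
    started_at=datetime(2024, 7, 1, 10, 0),
    ended_at=datetime(2024, 7, 1, 10, 5),
    trigger="user instruction: 'shake hands'",
    instructed_place=None,
    situation="owner nearby, living room",
    completed=True,
)
```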

 The learning unit 152 learns situations and behaviors, and the effects of those behaviors on the environment, based on one or more of the sensor data and input data supplied from the input unit 121, the received data supplied from the wireless communication module 135, the mouth drive data supplied from the mouth drive unit 126, the neck drive data supplied from the neck drive control unit 127, the leg drive data supplied from the leg drive control unit 129, the tail drive data supplied from the tail drive control unit 131, the situation data supplied from the recognition unit 151, and the data concerning the behavior of the autonomous mobile body 11 supplied from the operation control unit 153. For example, the learning unit 152 performs the pattern recognition learning described above and learns behavior patterns corresponding to the user's discipline. For example, the learning unit 152 changes the personality of the autonomous mobile body 11, particularly its acquired personality, by performing learning processes based on its experience and discipline after the start of operation.

 For example, the learning unit 152 realizes the above learning using a machine learning algorithm such as deep learning. Note that the learning algorithm adopted by the learning unit 152 is not limited to this example and can be designed as appropriate.

 The learning unit 152 supplies data indicating the learning results (hereinafter referred to as learning result data) to the operation control unit 153 and stores the data in the memory 123.

 The operation control unit 153 controls the operation of the autonomous mobile body 11 based on the recognized or estimated situation and the learning result data. The operation control unit 153 supplies data concerning the behavior of the autonomous mobile body 11 to the learning unit 152 and registers the data in the behavior history data stored in the memory 123.

 The operation control unit 153 also controls the internal state of the autonomous mobile body 11 based on the recognized or estimated situation and the learning result data. For example, the operation control unit 153 controls the state transitions of the internal state of the autonomous mobile body 11.

 The internal state of the autonomous mobile body 11 is an internal state that does not appear outside the autonomous mobile body 11, and is set based on, for example, at least one of the behavior, physical condition, emotions, age, and remaining charge of the autonomous mobile body 11. The physical condition of the autonomous mobile body 11 includes, for example, a hunger level. The hunger level is set based on, for example, the time elapsed since the autonomous mobile body 11 last performed the action of eating food. The age of the autonomous mobile body 11 is set based on, for example, the time elapsed since the purchase date of the autonomous mobile body 11 or since the power was first turned on, or on the total operating time of the autonomous mobile body 11.
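 As a rough illustration of how such internal state values could be derived from elapsed times, the following Python sketch computes a hunger level and an age; the linear hunger model, the six-hour saturation period, and the function names are illustrative assumptions, not the disclosed implementation.

```python
# Illustrative sketch only: derives hunger level and age from elapsed times.
from datetime import datetime, timedelta

def hunger_level(last_fed: datetime, now: datetime,
                 full_after: timedelta = timedelta(hours=6)) -> float:
    """Hunger grows from 0.0 to 1.0 as time since the 'eating' action elapses."""
    elapsed = now - last_fed
    return max(0.0, min(1.0, elapsed / full_after))

def age_days(first_power_on: datetime, now: datetime) -> int:
    """Age based on elapsed time since the power was first turned on."""
    return (now - first_power_on).days
```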

 The operation control unit 153 controls the operation of the autonomous mobile body 11 by controlling the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the voice control unit 133 based on at least one of the recognized or estimated situation, the learning result data, and the internal state of the autonomous mobile body 11. For example, the operation control unit 153 executes rotation control of each actuator 71, display control of the displays 51, control of audio output from the speaker 134, and the like.

 Note that the operations of the autonomous mobile body 11 include, for example, operations necessary for the running of the autonomous mobile body 11, as well as operations expressing will or emotion, and performances. Hereinafter, the latter operations are referred to as motions.

 An operation that the autonomous mobile body 11 executes for some purpose is referred to as a behavior. A behavior may include only the operations necessary for the running of the autonomous mobile body 11, or may include one or more types of motion. The purpose of a behavior is not particularly limited. For example, it includes not only concrete purposes such as moving to a target point or carrying a predetermined object, but also abstract purposes such as expressing will or emotion.

 Data for realizing a motion (hereinafter referred to as motion data) is, for example, created in advance using an authoring tool and stored in the memory 123 when the autonomous mobile body 11 is manufactured. Alternatively, for example, the motion data is downloaded to the autonomous mobile body 11 from the information processing terminal 12 or the information processing server 13.

 The motion data includes, for example, data continuously describing, in time series, control values such as the eye movements on the displays 51, the target joint angle and joint movement speed of each joint of each drive unit (each actuator 71) of the autonomous mobile body 11, and the type and volume of the sound to be output.

 The operation control unit 153 causes the autonomous mobile body 11 to execute a motion by controlling the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the voice control unit 133 based on the motion data. Between successive motions, the operation control unit 153 can also cause the autonomous mobile body 11 to take an arbitrary posture by instructing the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, and the tail drive control unit 131 on the target joint angles of the respective drive units.
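 To make the time-series structure of motion data concrete, the following is a minimal Python sketch in which a motion is a sequence of keyframes of control values and playback simply steps through them in time order; the keyframe layout, the callback interface, and the example values are assumptions for illustration only.

```python
# Minimal sketch of time-series motion data and its playback.
# The keyframe fields and the controller callback are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Keyframe:
    t: float                        # time offset within the motion, seconds
    joint_angles: Dict[str, float]  # target joint angle per joint, degrees
    joint_speeds: Dict[str, float]  # joint movement speed per joint, deg/s
    sound: str = ""                 # type of sound to output, if any
    volume: float = 0.0             # sound volume

@dataclass
class Motion:
    name: str
    keyframes: List[Keyframe] = field(default_factory=list)

def play(motion: Motion, send_to_drivers) -> None:
    """Step through the keyframes in time order, handing each set of
    control values to the drive/voice control units via a callback."""
    for kf in sorted(motion.keyframes, key=lambda k: k.t):
        send_to_drivers(kf)

# Example: a two-keyframe "tilting the head" motion.
tilt = Motion("tilting the head", [
    Keyframe(0.0, {"neck_roll": 0.0}, {"neck_roll": 30.0}),
    Keyframe(0.5, {"neck_roll": 15.0}, {"neck_roll": 30.0}),
])
play(tilt, print)
```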

 <Processing of the Autonomous Mobile Body 11>
 Next, the processing of the autonomous mobile body 11 will be described with reference to FIGS. 9 to 23.

  <Charging Handling Processing>
 First, the charging handling processing executed by the autonomous mobile body 11 will be described with reference to the flowchart of FIG. 9.

 In step S1, the autonomous mobile body 11 executes autonomous operations under the control of the operation control unit 153. That is, the autonomous mobile body 11 autonomously executes various operations. The content of the operations is not particularly limited.

 In step S2, the operation control unit 153 determines whether charging is necessary. Specifically, the operation control unit 153 detects the remaining charge of the autonomous mobile body 11 based on the remaining charge data from the power supply control unit 136. If the remaining charge is equal to or greater than the required charging level, the operation control unit 153 determines that charging is not necessary, and the processing returns to step S1.

 Note that the required charging level can be set to an arbitrary value, and is set to, for example, 26%.

 Thereafter, the processing of steps S1 and S2 is repeatedly executed until it is determined in step S2 that charging is necessary.

 On the other hand, in step S2, if the remaining charge is less than the required charging level, the operation control unit 153 determines that charging is necessary, and the processing proceeds to step S3.

 In step S3, the autonomous mobile body 11 returns to the charging stand 101 under the control of the operation control unit 153. That is, the autonomous mobile body 11 moves to the charging stand 101.

 In step S4, the autonomous mobile body 11 starts charging on the charging stand 101 under the control of the operation control unit 153.

 In step S5, the operation control unit 153 determines whether to stop operating on the charging stand 101. Specifically, the operation control unit 153 detects the remaining charge of the autonomous mobile body 11 based on the remaining charge data from the power supply control unit 136. If the remaining charge is less than the operation stop level, the operation control unit 153 determines not to stop operating on the charging stand 101, and the processing proceeds to step S6.

 Note that the operation stop level can be set to an arbitrary value within a range greater than the required charging level, and is set to, for example, 40%.

 In step S6, the autonomous mobile body 11 wakes up on the charging stand 101 under the control of the operation control unit 153 and performs operations according to its remaining charge and personality.

 Thereafter, the processing of steps S5 and S6 is repeatedly executed until it is determined in step S5 that the operation on the charging stand 101 is to be stopped.

 As a result, the autonomous mobile body 11 performs operations (behaviors) according to its remaining charge and personality during charging, until the remaining charge reaches or exceeds the operation stop level.

 Specific examples of the operation of the autonomous mobile body 11 on the charging stand 101 will be described later.

 On the other hand, in step S5, if the remaining charge is equal to or greater than the operation stop level, the operation control unit 153 determines to stop operating on the charging stand 101, and the processing proceeds to step S7.

 In step S7, the autonomous mobile body 11 sleeps on the charging stand 101 under the control of the operation control unit 153.

 Thereafter, the charging handling processing ends.
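 The two thresholds give the flow of FIG. 9 a simple hysteresis: activity on the stand continues until the charge climbs back above the operation stop level. For illustration only, a minimal Python sketch of that loop follows, using the example thresholds of 26% and 40% from the text; the callback names are hypothetical.

```python
# Sketch of the FIG. 9 charging loop with hypothetical callbacks.
# 26% and 40% are the example threshold values given in the text.
REQUIRED_CHARGING_LEVEL = 26.0   # step S2 threshold
OPERATION_STOP_LEVEL = 40.0      # step S5 threshold

def charging_handling(get_charge, act_autonomously, return_to_stand,
                      start_charging, act_on_stand, sleep_on_stand):
    while get_charge() >= REQUIRED_CHARGING_LEVEL:   # steps S1-S2
        act_autonomously()
    return_to_stand()                                # step S3
    start_charging()                                 # step S4
    while get_charge() < OPERATION_STOP_LEVEL:       # steps S5-S6
        act_on_stand()   # behavior depends on personality and remaining charge
    sleep_on_stand()                                 # step S7
```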

 Here, examples of the operation of the autonomous mobile body 11 on the charging stand 101 will be described with reference to FIGS. 10 to 18.

 FIG. 10 shows examples of motions that the autonomous mobile body 11 executes on the charging stand 101.

 The motions that the autonomous mobile body 11 executes on the charging stand 101 include, for example, "leg flapping," "looking around," "yawning," "sniffing," "tilting the head," "burping," "sneezing," and "shaking hands."

 "Leg flapping" is, for example, a motion including an action of throwing out the front legs and moving them up and down.

 "Looking around" is, for example, a motion including an action of turning the head to survey the surroundings.

 "Yawning" is, for example, a motion including an action of turning the head while opening the mouth.

 "Sniffing" is, for example, a motion including an action of raising the face and moving the tip of the nose.

 "Tilting the head" is, for example, a motion including an action of moving the head from side to side.

 "Burping" is, for example, a motion including an action of opening the mouth to burp and then shaking the head from side to side.

 "Sneezing" is, for example, a motion including an action of moving the head up and down while drawing the neck back, and then moving the neck forcefully forward while opening the mouth.

 "Shaking hands" is, for example, a motion including an action of raising one front leg high and then lowering it diagonally forward.

 For example, while the autonomous mobile body 11 is charging and its remaining charge is less than the operation stop level, the operation control unit 153 appropriately selects motions to be executed from among the above motions and causes the autonomous mobile body 11 to execute them in order. For example, the probability that each motion is selected is set in advance. Note that, as described later, the probability that each motion is selected during charging is changed based on the personality and remaining charge of the autonomous mobile body 11.
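 A probability-weighted selection of this kind can be sketched in Python as follows; the base weights and the boost factor are illustrative assumptions, since the text only states that the selection probabilities are preset and then adjusted by personality and remaining charge.

```python
# Sketch of probability-weighted motion selection during charging.
# Base weights and the example boost are assumptions for illustration.
import random

base_weights = {
    "leg flapping": 1.0, "looking around": 1.0, "yawning": 1.0,
    "sniffing": 1.0, "tilting the head": 1.0, "burping": 0.5,
    "sneezing": 0.5, "shaking hands": 1.0,
}

def pick_motion(weights, boost=None):
    """boost: optional (motion_name, factor) pair, e.g. from personality."""
    w = dict(weights)
    if boost:
        name, factor = boost
        w[name] *= factor
    motions, vals = zip(*w.items())
    return random.choices(motions, weights=vals, k=1)[0]

# Example: boosting "looking around" by 1.2x, as in the corrections below.
print(pick_motion(base_weights, boost=("looking around", 1.2)))
```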

 The characteristics of the operation of the autonomous mobile body 11 on the charging stand 101 also change depending on, for example, the personality and remaining charge of the autonomous mobile body 11.

 FIG. 11 shows examples of the characteristics of the operation on the charging stand 101 with respect to the remaining charge and the innate personality of the autonomous mobile body 11. In this example, the innate personality is classified into four types: "insensitive," "loyal and sensitive," "intelligent and cautious," and "active and sociable."

 For example, an insensitive autonomous mobile body 11 is not significantly affected by the remaining charge. That is, the operation of an insensitive autonomous mobile body 11 on the charging stand 101 does not change much between when the remaining charge is high and when it is low.

 Here, whether the remaining charge is high or low is determined based on a predetermined threshold. This threshold may be made variable, and in that case, it may be made settable by the user.

 For example, when the remaining charge is high, a loyal and sensitive autonomous mobile body 11 more frequently moves its head to look for its owner. On the other hand, when the remaining charge is low, a loyal and sensitive autonomous mobile body 11 moves slowly and its movements become small.

 Here, the owner is, for example, the user who owns the autonomous mobile body 11.

 For example, when the remaining charge is high, an intelligent and cautious autonomous mobile body 11 behaves carefully and does not move too much. On the other hand, when the remaining charge is low, an intelligent and cautious autonomous mobile body 11 moves slowly and its movements become small.

 For example, when the remaining charge is high, an active and sociable autonomous mobile body 11 moves quickly and its movements become large. On the other hand, when the remaining charge is low, an active and sociable autonomous mobile body 11 keeps playing if its owner or another individual (another autonomous mobile body 11) is present.

 In order to realize the operational characteristics of FIG. 11, each motion of the autonomous mobile body 11 is corrected based on, for example, the remaining charge and the innate personality. FIGS. 12 and 13 show examples of methods of correcting each motion of the autonomous mobile body 11 in order to realize the operational characteristics of FIG. 11.

 FIG. 12 shows an example of a motion correction method performed by the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the operation control unit 153 of the autonomous mobile body 11 when the remaining charge is less than 10%.

 For an insensitive autonomous mobile body 11, the control values of the motion data are not corrected. Therefore, each motion is executed as-is according to the motion data, resulting in standard movements.

 For a loyal and sensitive autonomous mobile body 11, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, and the tail drive control unit 131 correct the joint movement speed (speed) to 0.9 times that of the motion data. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the mouth, neck, legs, and tail become slower.

 For an intelligent and cautious autonomous mobile body 11, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, and the tail drive control unit 131 correct the joint movement speed to 0.9 times that of the motion data. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the mouth, neck, legs, and tail become slower.

 For an active and sociable autonomous mobile body 11, when the owner or another individual is nearby, the mouth drive control unit 125, the neck drive control unit 127, and the leg drive control unit 129 correct the joint movement speed and the target joint angle (amplitude) to 1.1 times those of the motion data. The other control values of the motions are not corrected. Therefore, when the owner or another individual is nearby, the movements of the mouth, neck, and legs become faster and larger in each motion. On the other hand, when neither the owner nor another individual is nearby, the control values of the motion data are not corrected, so each motion is executed as-is according to the motion data, resulting in standard movements.

 FIG. 13 shows an example of a motion correction method performed by the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the operation control unit 153 of the autonomous mobile body 11 when the remaining charge is 26% or more.

 For an insensitive autonomous mobile body 11, the control values of the motion data are not corrected. Therefore, each motion is executed as-is according to the motion data, resulting in standard movements.

 For a loyal and sensitive autonomous mobile body 11, the operation control unit 153 corrects the probability of selecting the motion of shaking the head to look for the owner to 1.2 times. The other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 more frequently shakes its head to look for its owner.

 For an intelligent and cautious autonomous mobile body 11, the operation control unit 153 corrects the probability that the autonomous mobile body 11 does nothing to 1.2 times. The other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 spends more time doing nothing.

 For an active and sociable autonomous mobile body 11, the mouth drive control unit 125, the neck drive control unit 127, and the leg drive control unit 129 correct the joint movement speed and the target joint angle to 1.1 times those of the motion data. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the mouth, neck, and legs become faster and larger.
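 One way to realize tables such as those of FIGS. 12 and 13 is a lookup from (personality, charge band) to per-part speed and amplitude multipliers, applied before a motion is played back. The Python sketch below encodes the 0.9x and 1.1x factors for the below-10% band described above; the table structure and function names themselves are assumptions.

```python
# Sketch of applying the FIG. 12 correction factors (charge < 10% band).
# The lookup structure is hypothetical; 0.9 and 1.1 come from the text.
LOW_CHARGE_CORRECTIONS = {
    "insensitive": {},  # no correction: standard movements
    "loyal and sensitive": {"mouth": (0.9, 1.0), "neck": (0.9, 1.0),
                            "legs": (0.9, 1.0), "tail": (0.9, 1.0)},
    "intelligent and cautious": {"mouth": (0.9, 1.0), "neck": (0.9, 1.0),
                                 "legs": (0.9, 1.0), "tail": (0.9, 1.0)},
    # per the text, applied only when the owner or another individual is nearby:
    "active and sociable": {"mouth": (1.1, 1.1), "neck": (1.1, 1.1),
                            "legs": (1.1, 1.1)},
}

def correct(part: str, speed: float, angle: float, personality: str):
    """Scale one body part's (speed, angle) by (speed_factor, angle_factor)."""
    f_speed, f_angle = LOW_CHARGE_CORRECTIONS[personality].get(part, (1.0, 1.0))
    return speed * f_speed, angle * f_angle

# Example: a neck keyframe for a loyal and sensitive individual.
print(correct("neck", speed=30.0, angle=15.0, personality="loyal and sensitive"))
```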

 Further, for example, the autonomous mobile body 11 executes motions specific to each innate personality on the charging stand 101.

 FIG. 14 shows examples of motions of the autonomous mobile body 11 on the charging stand 101 that are specific to each innate personality.

 For example, an insensitive autonomous mobile body 11 executes, on the charging stand 101, a motion including an action of slowly moving its head up and down as if dozing off.

 For example, when a loyal and sensitive autonomous mobile body 11 finds its owner while on the charging stand 101, it executes a motion including an action of raising both hands (front legs) and barking.

 For example, an intelligent and cautious autonomous mobile body 11 executes, on the charging stand 101, a motion including an action of turning toward the direction of a sound and taking a prone posture with its head lowered.

 For example, an active and sociable autonomous mobile body 11 executes, on the charging stand 101, a motion including an action of swinging its head widely from side to side and flapping its front and hind legs.

 FIG. 15 shows examples of the characteristics of the operation on the charging stand 101 with respect to the remaining charge and the acquired personality of the autonomous mobile body 11. In this example, the acquired personality is classified into four types: "cute," "dependent," "shy," and "wild."

 A cute autonomous mobile body 11 has, for example, a strong desire to express its emotions and a whimsical character. For example, when the remaining charge is high, a cute autonomous mobile body 11 wags its head and tail to show that it wants to play. On the other hand, when the remaining charge is low, a cute autonomous mobile body 11 moves its body slowly but moves its tail a lot. When the remaining charge is low, a cute autonomous mobile body 11 also lies down with a dissatisfied look in its eyes.

 A dependent autonomous mobile body 11 has, for example, a character with a strong desire for communication. For example, when the remaining charge is high, a dependent autonomous mobile body 11 more frequently moves its head to look for its owner. On the other hand, when the remaining charge is low, a dependent autonomous mobile body 11 quietly waits to be charged, and only its movement when it finds its owner becomes large.

 A shy autonomous mobile body 11 has, for example, a character with a strong desire for exploration and a weak desire for exercise. For example, when the remaining charge is high, a shy autonomous mobile body 11 looks around the room. On the other hand, when the remaining charge is low, a shy autonomous mobile body 11 quietly waits to be charged. When the remaining charge is low, a shy autonomous mobile body 11 also moves slowly and tends to lie down.

 A wild autonomous mobile body 11 has, for example, a character with a strong desire for exercise. For example, when the remaining charge is high, a wild autonomous mobile body 11 moves its legs and wants to get off the charging stand 101. On the other hand, when the remaining charge is low, a wild autonomous mobile body 11 wants to play but runs out of energy. When the remaining charge is low, a wild autonomous mobile body 11 also takes a somewhat tired posture, but still moves its legs actively.

 In order to realize the operational characteristics of FIG. 15, each motion of the autonomous mobile body 11 is corrected based on, for example, the remaining charge and the acquired personality. FIGS. 16 and 17 show examples of methods of correcting each motion of the autonomous mobile body 11 in order to realize the operational characteristics of FIG. 15.

 FIG. 16 shows an example of a motion correction method performed by the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the operation control unit 153 of the autonomous mobile body 11 when the remaining charge is less than 10%.

 For a cute autonomous mobile body 11, the pupil display control unit 124 controls the displays 51 so that the eyes look dissatisfied. The neck drive control unit 127 and the leg drive control unit 129 correct the joint movement speed to 0.9 times that of the motion data, and the tail drive control unit 131 corrects the joint movement speed to 1.1 times that of the motion data. The other control values of the motion data are not corrected. Therefore, in each motion, the autonomous mobile body 11 has dissatisfied eyes, the movements of its neck and legs become slower, and the movement of its tail becomes faster.

 For a dependent autonomous mobile body 11, when the owner is nearby, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, and the tail drive control unit 131 correct the joint movement speed to 1.1 times that of the motion data. The other control values of the motion data are not corrected. Therefore, when the owner is nearby, the movements of the mouth, neck, legs, and tail become faster in each motion. On the other hand, when the owner is not nearby, the control values of the motion data are not corrected, so each motion is executed as-is according to the motion data, resulting in standard movements.

 For a shy autonomous mobile body 11, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, and the tail drive control unit 131 correct the joint movement speed to 0.9 times that of the motion data. The neck drive control unit 127 also controls the neck drive unit 128 so that the autonomous mobile body 11 takes a head-lowered posture between motions. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the mouth, neck, legs, and tail become slower, and a head-lowered posture is taken between motions.

 For a wild autonomous mobile body 11, the pupil display control unit 124 controls the displays 51 so that the eyes look dissatisfied. The neck drive control unit 127 controls the neck drive unit 128 so that the autonomous mobile body 11 takes a head-lowered posture between motions. The leg drive control unit 129 corrects the joint movement speed and the target joint angle to 1.1 times those of the motion data, and controls the leg drive unit 130 so that the angle in the leg-spreading direction between motions becomes 1.1 times. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the legs become faster and larger, and between motions, the autonomous mobile body 11 takes a posture with its head lowered and its legs spread wide.

 FIG. 17 shows an example of a motion correction method performed by the pupil display control unit 124, the mouth drive control unit 125, the neck drive control unit 127, the leg drive control unit 129, the tail drive control unit 131, and the operation control unit 153 of the autonomous mobile body 11 when the remaining charge is 26% or more.

 For a cute autonomous mobile body 11, the neck drive control unit 127 and the tail drive control unit 131 correct the joint movement speed and the target joint angle to 1.1 times those of the motion data. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the neck and tail become faster and larger.

 For a dependent autonomous mobile body 11, the operation control unit 153 corrects the probability of selecting the motion of shaking the head to look for the owner to 1.2 times. The other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 more frequently shakes its head to look for its owner.

 For a shy autonomous mobile body 11, the operation control unit 153 corrects the probability of selecting the motion of moving the head to glance around to 1.2 times. The other control values of the motion data are not corrected. Therefore, the autonomous mobile body 11 more frequently moves its head to glance around.

 For a wild autonomous mobile body 11, the mouth drive control unit 125, the neck drive control unit 127, and the leg drive control unit 129 correct the joint movement speed and the target joint angle to 1.1 times those of the motion data. The other control values of the motion data are not corrected. Therefore, in each motion, the movements of the mouth, neck, and legs become faster and larger.

 Further, for example, the autonomous mobile body 11 executes motions specific to each acquired personality on the charging stand 101.

 FIG. 18 shows examples of motions of the autonomous mobile body 11 on the charging stand 101 that are specific to each acquired personality.

 For example, a cute autonomous mobile body 11 executes, on the charging stand 101, a motion including an action of swinging its head from side to side in rhythm while moving its ears and tail widely.

 For example, when a dependent autonomous mobile body 11 finds its owner while on the charging stand 101, it executes a motion including a beckoning action of moving one front leg up and down.

 For example, a shy autonomous mobile body 11 executes, on the charging stand 101, a motion including an action of spreading its front and hind legs wide, lowering its face, and going limp.

 For example, a wild autonomous mobile body 11 executes, on the charging stand 101, a motion including an action of quickly moving its front legs and tail up and down to show that it wants to play.

 Note that, for example, the operation control unit 153 controls the ratio between operations based on the innate personality and operations based on the acquired personality during the charging of the autonomous mobile body 11. For example, the operation control unit 153 applies both a correction based on the innate personality and a correction based on the acquired personality, each weighted, to the motions and postures in the operation of the autonomous mobile body 11 on the charging stand 101, and controls the weight values.

 For example, the corrected target joint angle and joint movement speed Mt of each motion of the autonomous mobile body 11 are calculated by the following equation (1).

 Mt = Mn × (1 + λ × ΔMc) × {1 + (1 − λ) × ΔMa}   ... (1)

 Mn indicates the target joint angle and joint movement speed of a motion of the autonomous mobile body 11 before correction (normal). ΔMc indicates the correction amount of the target joint angle and joint movement speed of the motion of the autonomous mobile body 11 based on the innate personality. ΔMa indicates the correction amount of the target joint angle and joint movement speed of the motion of the autonomous mobile body 11 based on the acquired personality. λ is a coefficient (weight) in the range from 0 to 1.

 Also, for example, the corrected target joint angle Pt of the posture of the autonomous mobile body 11 between motions is calculated by the following equation (2).

 Pt = Pn × (1 + λ × ΔPc) × {1 + (1 − λ) × ΔPa}   ... (2)

 Pn indicates the target joint angle of the posture of the autonomous mobile body 11 before correction (normal). ΔPc indicates the correction amount of the target joint angle of the posture of the autonomous mobile body 11 based on the innate personality. ΔPa indicates the correction amount of the target joint angle of the posture of the autonomous mobile body 11 based on the acquired personality.

 For example, the coefficient λ is set to 1 at the time of shipment and is brought closer to 0 as time elapses from the start of operation of the autonomous mobile body 11. This makes it possible to change the personality of the autonomous mobile body 11 such that the influence of the innate personality is strong at first, and as the autonomous mobile body 11 lives with its owner longer, the influence of the acquired personality based on its experience and its interactions with users such as the owner becomes stronger.
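 Equations (1) and (2) can be read directly as a blend of the two correction amounts weighted by λ. A minimal Python sketch follows, assuming hypothetical function names and a simple linear decay of λ over the first year of operation; the one-year decay period is an assumption, since the text only says λ starts at 1 at shipment and approaches 0 over time.

```python
# Sketch of equations (1) and (2), plus an assumed linear decay of λ.
def blended(base: float, d_innate: float, d_acquired: float, lam: float) -> float:
    """Mt = Mn × (1 + λ·ΔMc) × {1 + (1 − λ)·ΔMa}; equation (2) has the same form."""
    return base * (1.0 + lam * d_innate) * (1.0 + (1.0 - lam) * d_acquired)

def decay_lambda(days_in_operation: float, period_days: float = 365.0) -> float:
    """λ = 1 at shipment, approaching 0 as operating time elapses (assumed linear)."""
    return max(0.0, 1.0 - days_in_operation / period_days)

# Example: joint speed 30 deg/s, innate correction -10%, acquired +10%.
lam = decay_lambda(days_in_operation=90)   # 90 days after the start of operation
print(blended(30.0, d_innate=-0.1, d_acquired=0.1, lam=lam))
```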

 Further, in order to make the differences in personality between autonomous mobile bodies 11 more distinctive, the required charging level used to determine whether to return to the charging stand 101 may be changed based on, for example, the acquired personality.

 For example, since a cute autonomous mobile body 11 has a whimsical character, its required charging level changes randomly within a predetermined range (for example, within ±5%) each time.

 For example, since a dependent autonomous mobile body 11 has a dependent character, its required charging level drops to 20% when it is playing with its owner, and it keeps playing to the limit.

 For example, since a shy autonomous mobile body 11 has a reserved character, its required charging level rises to 28%, and it returns to the charging stand 101 early.

 For example, since a wild autonomous mobile body 11 has an active character, its required charging level drops to 20%, and it keeps playing to the limit.
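 As a sketch, this personality-dependent threshold could be computed as follows; the function itself is hypothetical, while the 26% default, the 20% and 28% levels, and the ±5% jitter are the figures given in the text.

```python
# Sketch of the acquired-personality-dependent required charging level.
import random

def required_charging_level(personality: str, playing_with_owner: bool,
                            default: float = 26.0) -> float:
    if personality == "cute":        # whimsical: random within ±5% each time
        return default + random.uniform(-5.0, 5.0)
    if personality == "dependent":   # keeps playing with the owner to the limit
        return 20.0 if playing_with_owner else default
    if personality == "shy":         # reserved: returns to the stand early
        return 28.0
    if personality == "wild":        # active: keeps playing to the limit
        return 20.0
    return default
```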

 As described above, by reflecting the personality of each autonomous mobile body 11, the operations related to charging the autonomous mobile body 11 are diversified, and the personality of the autonomous mobile body 11 is expressed. This allows each owner to get a real sense that their autonomous mobile body 11 is different from other individuals and is their own. In addition, since the operations of the autonomous mobile body 11 during charging vary in many ways, users such as the owner can be kept from getting bored.

  <Method of cooperation with other autonomous moving bodies 11>
For example, when returning (moving) to the charging base 101, the autonomous moving body 11 can operate in cooperation with other autonomous moving bodies 11 by sharing charging-related information with them. An example of processing in which the autonomous moving body 11 operates in cooperation with other autonomous moving bodies 11, such as when returning to the charging base 101, will be described below with reference to Figs. 19 to 24.

   <First embodiment of the charging base return process>
First, a first embodiment of the charging base return process executed by the autonomous moving body 11 will be described with reference to the flowchart of Fig. 19.

In the following, the autonomous moving body 11 that executes the charging base return process is referred to as the autonomous moving body 11A.

In step S101, autonomous operation is performed in the same manner as in step S1 of Fig. 9.

In step S102, as in step S2 of Fig. 9, it is determined whether charging is necessary. If it is determined that charging is not necessary, the process returns to step S101.

Thereafter, the processes of steps S101 and S102 are repeated until it is determined in step S102 that charging is necessary.

On the other hand, if it is determined in step S102 that charging is necessary, the process proceeds to step S103.

In step S103, the autonomous moving body 11A starts returning to the charging base 101 under the control of the operation control unit 153. That is, the autonomous moving body 11A starts moving toward the charging base 101.

In step S104, the recognition unit 151 determines whether the autonomous moving body has returned to the charging base 101, based on sensor data from the input unit 121 and the like. If it is determined that it has not returned to the charging base 101, the process proceeds to step S105.

In step S105, the operation control unit 153 determines whether enough charge remains for the autonomous moving body to keep moving. Specifically, the operation control unit 153 detects the remaining charge of the autonomous moving body 11A based on the remaining charge data from the power supply control unit 136. If the remaining charge is equal to or greater than the operable level, the operation control unit 153 determines that enough charge remains to keep moving, and the process returns to step S104.

The operable level can be set to any value. Also, for example, the operable level may vary based on the distance to the charging base 101 or the like.

Thereafter, the processes of steps S104 and S105 are repeated until it is determined in step S104 that the autonomous moving body has returned to the charging base 101, or it is determined in step S105 that not enough charge remains to keep moving.

On the other hand, in step S105, if the remaining charge is below the operable level, the operation control unit 153 determines that not enough charge remains to keep moving, that is, that returning (moving) to the charging base 101 is difficult, and the process proceeds to step S106.

In step S106, the operation control unit 153 notifies the surrounding autonomous moving bodies 11 that its remaining charge is insufficient. Specifically, the operation control unit 153 generates information for reporting the insufficient remaining charge (hereinafter referred to as insufficient-charge information) and transmits it to the surrounding autonomous moving bodies 11 via the wireless communication module 135.

In step S107, the autonomous moving body 11A lies down and waits under the control of the operation control unit 153.

The charging base return process then ends.

On the other hand, if it is determined in step S104 that the autonomous moving body has returned to the charging base 101, steps S105 to S107 are skipped and the charging base return process ends.
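The control flow of Fig. 19 can be summarized as the following sketch; the robot object and all of its helper methods are hypothetical stand-ins for the recognition unit 151, the power supply control unit 136, and the wireless communication module 135.

    # Illustrative sketch of the charging base return process (Fig. 19).
    # Every helper method on `robot` is a hypothetical stand-in.

    def charging_base_return(robot) -> None:
        while not robot.charging_needed():            # steps S101-S102
            robot.act_autonomously()
        robot.start_moving_to_charging_base()         # step S103
        while not robot.has_returned():               # step S104
            # The operable level may vary with the distance to the base.
            if robot.remaining_charge() < robot.operable_level():  # step S105
                robot.broadcast_insufficient_charge()              # step S106
                robot.lie_down_and_wait()                          # step S107
                return
        # Returned to the charging base; the process ends.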

   <First embodiment of the process for supporting another autonomous moving body>
Next, a first embodiment of the process for supporting another autonomous moving body, executed by a surrounding autonomous moving body 11 (hereinafter referred to as the autonomous moving body 11B) in response to the charging base return process of the autonomous moving body 11A in Fig. 19, will be described with reference to the flowchart of Fig. 20.

Note that this process is executed when at least one of the autonomous moving bodies 11A and 11B does not have the function of sharing power.

In step S131, autonomous operation is performed in the same manner as in step S1 of Fig. 9.

In step S132, the recognition unit 151 determines whether there is an autonomous moving body 11 with insufficient remaining charge in the vicinity. If the recognition unit 151 has not received the insufficient-charge information transmitted in step S106 of Fig. 19, it determines that there is no such autonomous moving body 11 in the vicinity, and the process returns to step S131.

Thereafter, the processes of steps S131 and S132 are repeated until it is determined in step S132 that there is an autonomous moving body 11 with insufficient remaining charge in the vicinity.

On the other hand, in step S132, if the recognition unit 151 receives the insufficient-charge information from the autonomous moving body 11A via the wireless communication module 135, it determines that there is an autonomous moving body 11 (the autonomous moving body 11A) with insufficient remaining charge in the vicinity, that is, an autonomous moving body 11 that has difficulty returning (moving) to the charging base 101 due to insufficient remaining charge, and the process proceeds to step S133.

In step S133, the recognition unit 151 determines whether the owner is nearby, based on sensing data from the input unit 121 and the like. If it is determined that the owner is nearby, the process proceeds to step S134.

In step S134, the autonomous moving body 11B moves to the owner's side under the control of the operation control unit 153.

The process then proceeds to step S135.

On the other hand, if it is determined in step S133 that the owner is not nearby, step S134 is skipped and the process proceeds to step S135.

In step S135, the autonomous moving body 11B turns, on the spot, toward the autonomous moving body 11 with insufficient remaining charge (the autonomous moving body 11A) and barks. This notifies the owner, or the people around the autonomous moving body 11B, of the presence of the autonomous moving body 11A with insufficient remaining charge.

The process for supporting another autonomous moving body then ends.
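The branch structure of Fig. 20 can be summarized as the following sketch; the helper methods are hypothetical stand-ins for the units described above.

    # Illustrative sketch of the support process in Fig. 20 (no power
    # sharing available). Helper method names are hypothetical.

    def support_other_robot(robot) -> None:
        while not robot.received_insufficient_charge_info():  # steps S131-S132
            robot.act_autonomously()
        if robot.owner_is_nearby():        # step S133
            robot.move_to_owner()          # step S134
        robot.face_low_charge_robot()      # step S135
        robot.bark()                       # notify the owner or bystanders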

   <Second embodiment of the process for supporting another autonomous moving body>
Next, a second embodiment of the process for supporting another autonomous moving body, executed by the surrounding autonomous moving body 11B in response to the charging base return process of the autonomous moving body 11A in Fig. 19, will be described with reference to the flowchart of Fig. 21.

This process is executed when both the autonomous moving body 11A and the autonomous moving body 11B have the function of sharing power. For example, as shown in Fig. 22, this process is executed when both the autonomous moving body 11A and the autonomous moving body 11B have a wireless power supply unit 302 at the tip of the front leg 301.

In steps S161 and S162, the same processes as steps S131 and S132 of Fig. 20 are executed.

In step S163, the operation control unit 153 determines whether it is possible to share power, based on information from the power supply control unit 136. For example, if the remaining charge is less than a predetermined threshold, the operation control unit 153 determines that power cannot be shared, and the process proceeds to step S164.

In steps S164 to S166, the same processes as steps S133 to S135 of Fig. 20 are executed.

On the other hand, in step S163, if the remaining charge is equal to or greater than the predetermined threshold, the operation control unit 153 determines that power can be shared, and the process proceeds to step S167.

In step S167, the autonomous moving body 11B goes to the autonomous moving body 11 with insufficient remaining charge and shares power with it.

Specifically, the autonomous moving body 11B moves to the autonomous moving body 11A under the control of the operation control unit 153.

Next, as shown in Fig. 23, under the control of the operation control unit 153, the autonomous moving body 11B places the wireless power supply unit 302B at the tip of its front leg 301B over the wireless power supply unit 302A on the front leg 301A of the autonomous moving body 11A. The autonomous moving body 11B then wirelessly supplies power from the wireless power supply unit 302B to the wireless power supply unit 302A of the autonomous moving body 11A. This allows the autonomous moving body 11A to obtain the power necessary to return to the charging base 101, so that it can return to the charging base 101 on its own.

The process for supporting another autonomous moving body then ends.
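A sketch of the decision in Fig. 21 follows; the 40% sharing threshold for step S163 and the helper methods are illustrative assumptions.

    # Illustrative sketch of the support process in Fig. 21, for units
    # equipped with the wireless power supply unit 302. The threshold
    # value and helper method names are assumptions.

    SHARE_THRESHOLD = 40.0  # assumed threshold (percent) for step S163

    def support_with_power_sharing(robot) -> None:
        while not robot.received_insufficient_charge_info():  # steps S161-S162
            robot.act_autonomously()
        if robot.remaining_charge() >= SHARE_THRESHOLD:        # step S163
            robot.move_to_low_charge_robot()                   # step S167
            robot.align_wireless_power_units()                 # Fig. 23: 302B over 302A
            robot.transfer_power()
        else:
            # Cannot share power; fall back to the Fig. 20 behavior.
            if robot.owner_is_nearby():                        # step S164
                robot.move_to_owner()                          # step S165
            robot.face_low_charge_robot()                      # step S166
            robot.bark()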

   <Second embodiment of the charging base return process>
Next, a second embodiment of the charging base return process executed by the autonomous moving body 11A will be described with reference to the flowchart of Fig. 24.

This process is executed when the autonomous moving body 11A and the other autonomous moving bodies 11 have the function of yielding the charging base 101 to one another according to their remaining charge.

In steps S201 to S203, the same processes as steps S101 to S103 of Fig. 19 are executed.

In step S204, the recognition unit 151 determines whether there is another autonomous moving body 11 returning to the same charging base 101, based on sensor data from the input unit 121 and information received from other autonomous moving bodies 11 via the wireless communication module 135. If it is determined that there is no other autonomous moving body 11 returning to the same charging base 101, the process proceeds to step S205.

In step S205, as in step S104 of Fig. 19, it is determined whether the autonomous moving body has returned to the charging base 101. If it is determined that it has not returned to the charging base 101, the process returns to step S204.

Thereafter, the processes of steps S204 and S205 are repeated until it is determined in step S204 that there is another autonomous moving body 11 returning to the same charging base 101, or it is determined in step S205 that the autonomous moving body has returned to the charging base 101.

On the other hand, if it is determined in step S204 that there is another autonomous moving body 11 returning to the same charging base 101, the process proceeds to step S206.

In step S206, the autonomous moving body 11A communicates with the other autonomous moving body 11 to share remaining-charge information. Specifically, the recognition unit 151 communicates, via the wireless communication module 135, with the autonomous moving body 11 returning to the same charging base 101 (hereinafter referred to as the autonomous moving body 11C), and the two share information on each other's remaining charge.

In step S207, the recognition unit 151 determines whether the other autonomous moving body 11 (the autonomous moving body 11C) has less remaining charge than the autonomous moving body 11A itself. That is, the recognition unit 151 compares the remaining charges of the autonomous moving body 11A and the other autonomous moving body 11 (the autonomous moving body 11C), and if it determines that the other autonomous moving body 11 has less remaining charge, the process proceeds to step S208.

In step S208, the autonomous moving body 11A, under the control of the operation control unit 153, makes a motion of yielding the charging base 101.

In step S209, the recognition unit 151 determines whether there is another available charging base 101, based on sensor data from the input unit 121 and information received from other autonomous moving bodies 11 via the wireless communication module 135. If it is determined that there is no other available charging base 101, the process proceeds to step S210.

In step S210, the autonomous moving body 11A lies down and waits under the control of the operation control unit 153.

Thereafter, the processes of steps S209 and S210 are repeated until it is determined in step S209 that there is another available charging base 101.

On the other hand, if it is determined in step S209 that there is another available charging base 101, the process returns to step S203. This includes, for example, the case where the charging base 101 yielded to the autonomous moving body 11C becomes available when the autonomous moving body 11C finishes charging.

The processes from step S203 onward are then executed.

On the other hand, if it is determined in step S207 that the other autonomous moving body 11 (the autonomous moving body 11C) has more remaining charge than the autonomous moving body 11A itself, the process proceeds to step S211.

In step S211, the autonomous moving body 11A makes a motion of thanking the other for yielding the charging base 101.

For example, in this case, the autonomous moving body 11C makes a motion of yielding the charging base 101 through the same process as step S208 described above.

In response, the autonomous moving body 11A, under the control of the operation control unit 153, makes a motion of thanking the autonomous moving body 11C for yielding the charging base 101.

Note that, for example, when the remaining charges of the autonomous moving body 11A and the autonomous moving body 11C are approximately equal, which autonomous moving body 11 yields the charging base 101 is determined based on other criteria.

For example, following a predetermined priority order, an autonomous moving body 11 with a lower priority may yield the charging base 101 to an autonomous moving body 11 with a higher priority.

For example, the autonomous moving body 11 farther from the charging base 101 may yield the charging base 101 to the autonomous moving body 11 closer to it.

The process then returns to step S205.

On the other hand, if it is determined in step S205 that the autonomous moving body has returned to the charging base 101, the charging base return process ends.
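The yield decision of steps S206 to S208 and S211, together with the tie-breaking criteria just described, can be condensed into the following self-contained sketch; the 1% tie band and the convention that a larger integer means a higher priority are illustrative assumptions.

    # Illustrative sketch of the yielding decision in Fig. 24 (steps
    # S206-S208 and S211) with the tie-breakers described above. The
    # tie band and priority convention are assumptions.

    def should_yield(my_charge: float, other_charge: float,
                     my_priority: int, other_priority: int,
                     my_distance: float, other_distance: float) -> bool:
        """Return True if this unit should yield the charging base."""
        if abs(my_charge - other_charge) > 1.0:   # clear difference in charge
            return other_charge < my_charge       # the lower charge goes first
        if my_priority != other_priority:         # tie-breaker 1: priority
            return my_priority < other_priority   # lower priority yields
        return my_distance > other_distance       # tie-breaker 2: the farther unit yields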

As described above, when the autonomous moving body 11 returns to the charging base 101, its operation changes in response to its remaining charge and the states of other autonomous moving bodies 11, which makes the autonomous moving body 11 feel more lifelike.

In addition, when the autonomous moving body 11 cooperates with other autonomous moving bodies 11 or supports them, the autonomous moving body 11 comes across as intelligent.

<<2. Modifications>>
Modifications of the above-described embodiment of the present technology will be described below.

The classification of the personality of the autonomous moving body 11 can be changed as appropriate. For example, the number of personality types of the autonomous moving body 11 can be increased or decreased. For example, a level (for example, high, medium, or low) may be set for each personality of the autonomous moving body 11, and the correction amount of each motion may be varied according to the level. For example, the innate personality of the autonomous moving body 11 may be made common to all units, and only the acquired personality may be varied for each unit.

Also, for example, the external personality of the autonomous moving body 11 may be varied in the same manner. For example, the appearance of the autonomous moving body 11 may change as its internal personality changes.

The operation of the autonomous moving body 11 during charging can be changed as appropriate. For example, the types of motions that the autonomous moving body 11 performs during charging can be increased or decreased. Also, the method of correcting each motion and the correction amount can be changed as appropriate based on the personality and remaining charge of the autonomous moving body 11.

The form of the charging device that charges the autonomous moving body 11 and the charging method can be changed as appropriate. For example, the charging method of the autonomous moving body 11 may be either wireless or wired.

For example, part of the processing of the autonomous moving body 11 described above may be executed by the information processing terminal 12 or the information processing server 13. For example, the information processing terminal 12 or the information processing server 13 may execute all or part of the processing of the main control unit 122 of the autonomous moving body 11 and remotely control the autonomous moving body 11. Specifically, for example, the information processing terminal 12 or the information processing server 13 may control the operation of the autonomous moving body 11 during charging based on the personality and remaining charge of the autonomous moving body 11. For example, the information processing terminal 12 or the information processing server 13 may learn the acquired personality of the autonomous moving body 11.

The present technology can be applied not only to the dog-type quadruped robot described above but also to entertainment robots such as pet-type robots capable of expressing the personality of each unit.

<<3. Others>>
 <Example computer configuration>
The series of processes described above can be executed by hardware or by software. When the series of processes is executed by software, the programs constituting the software are installed on a computer. Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions when various programs are installed.

Fig. 25 is a block diagram showing an example hardware configuration of a computer that executes the series of processes described above using a program.

In the computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.

An input/output interface 1005 is further connected to the bus 1004. An input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010 are connected to the input/output interface 1005.

The input unit 1006 includes input switches, buttons, a microphone, an image sensor, and the like. The output unit 1007 includes a display, a speaker, and the like. The storage unit 1008 includes a hard disk, a nonvolatile memory, and the like. The communication unit 1009 includes a network interface and the like. The drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.

In the computer 1000 configured as described above, the CPU 1001 performs the series of processes described above by, for example, loading a program recorded in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executing it.

The program executed by the computer 1000 (the CPU 1001) can be provided by being recorded on the removable medium 1011 as a package medium or the like, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.

In the computer 1000, the program can be installed in the storage unit 1008 via the input/output interface 1005 by mounting the removable medium 1011 on the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. Alternatively, the program can be installed in advance in the ROM 1002 or the storage unit 1008.

Note that the program executed by the computer may be a program in which the processes are performed in chronological order as described in this specification, or a program in which the processes are performed in parallel or at necessary timing, such as when a call is made.

In this specification, a system means a collection of multiple components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in one housing, are both systems.

Furthermore, the embodiments of the present technology are not limited to the embodiments described above, and various modifications are possible without departing from the gist of the present technology.

For example, the present technology can take a cloud computing configuration in which one function is shared and processed jointly by multiple devices via a network.

Each step described in the flowcharts above can be executed by one device or shared among multiple devices.

Furthermore, when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or shared among multiple devices.

 <Examples of configuration combinations>
The present technology can also be configured as follows.
(1)
 A control device comprising: an operation control unit that controls an operation of an autonomous moving body during charging based on a personality of the autonomous moving body and a remaining charge of the autonomous moving body.
(2)
 The control device according to (1), further comprising a learning unit that changes the personality of the autonomous moving body based on an experience of the autonomous moving body and an interaction with a user.
(3)
 The control device according to (2), wherein the personality of the autonomous moving body includes an innate personality and an acquired personality, and the learning unit changes the acquired personality based on the experience of the autonomous moving body and the interaction with the user.
(4)
 The control device according to (3), wherein the operation control unit controls a ratio of an operation based on the innate personality and an operation based on the acquired personality during charging of the autonomous moving body.
(5)
 The control device according to (4), wherein the operation control unit increases the ratio of the operation based on the acquired personality as time passes from the start of operation of the autonomous moving body.
(6)
 The control device according to any one of (1) to (5), wherein the operation control unit changes at least one of a motion and a posture performed by the autonomous moving body during charging, based on the personality and the remaining charge of the autonomous moving body.
(7)
 The control device according to (6), wherein the operation control unit changes at least one of a speed and a magnitude of the motion performed by the autonomous moving body during charging, based on the personality and the remaining charge of the autonomous moving body.
(8)
 The control device according to (6) or (7), wherein the operation control unit changes a probability that the autonomous moving body selects the motion to be performed during charging from among a plurality of the motions, based on the personality and the remaining charge of the autonomous moving body.
(9)
 The control device according to any one of (6) to (8), wherein the operation control unit changes a posture between the motions based on the personality and the remaining charge of the autonomous moving body.
(10)
 The control device according to any one of (6) to (9), wherein the operation control unit changes a facial expression of the autonomous moving body in the motion based on the personality and the remaining charge of the autonomous moving body.
(11)
 The control device according to any one of (6) to (10), wherein the operation control unit causes the autonomous moving body to perform the motion specific to the personality of the autonomous moving body during charging.
(12)
 The control device according to any one of (1) to (11), wherein the operation control unit controls the operation of the autonomous moving body during charging further based on whether at least one of an owner and another autonomous moving body is present in the vicinity.
(13)
 The control device according to any one of (1) to (12), wherein the operation control unit further controls an operation of the autonomous moving body when the autonomous moving body moves to a charging device for charging.
(14)
 The control device according to (13), wherein, when it is difficult for the autonomous moving body to move to the charging device due to insufficient remaining charge, the operation control unit notifies another autonomous moving body of the insufficient remaining charge and controls the operation of the autonomous moving body so that it waits.
(15)
 The control device according to (13) or (14), wherein, when another autonomous moving body is moving toward the same charging device, the operation control unit controls the operation of the autonomous moving body based on a result of comparing the remaining charges of the autonomous moving body and the other autonomous moving body.
(16)
 The control device according to any one of (1) to (15), wherein, when there is another autonomous moving body that has difficulty moving to a charging device due to insufficient remaining charge, the operation control unit controls the operation of the autonomous moving body so that it supports the other autonomous moving body.
(17)
 The control device according to (16), wherein the operation control unit controls the operation of the autonomous moving body so that it shares power with the other autonomous moving body.
(18)
 The control device according to (16) or (17), wherein the operation control unit controls the operation of the autonomous moving body so that it notifies the surroundings of the presence of the other autonomous moving body.
(19)
 The control device according to any one of (1) to (18), wherein the autonomous moving body is a pet-type robot.
(20)
 A control method in which a control device controls an operation of an autonomous moving body during charging based on a personality of the autonomous moving body and a remaining charge of the autonomous moving body.

Note that the effects described in this specification are merely examples and are not limiting; other effects may also be obtained.

1 Information processing system, 11-1 to 11-n Autonomous moving body, 12-1 to 12-n Information processing terminal, 13 Information processing server, 51L, 51R Display, 71 Actuator, 101 Charging base, 121 Input unit, 122 Main control unit, 124 Eye display control unit, 125 Mouth drive control unit, 126 Mouth drive unit, 127 Neck drive control unit, 128 Neck drive unit, 129 Leg drive control unit, 130 Leg drive unit, 131 Tail drive control unit, 132 Tail drive unit, 133 Voice control unit, 134 Speaker, 135 Wireless communication module, 151 Recognition unit, 152 Learning unit, 153 Operation control unit

Claims (20)

1. A control device comprising: an operation control unit that controls an operation of an autonomous moving body during charging based on a personality of the autonomous moving body and a remaining charge of the autonomous moving body.

2. The control device according to claim 1, further comprising a learning unit that changes the personality of the autonomous moving body based on an experience of the autonomous moving body and an interaction with a user.

3. The control device according to claim 2, wherein the personality of the autonomous moving body includes an innate personality and an acquired personality, and the learning unit changes the acquired personality based on the experience of the autonomous moving body and the interaction with the user.

4. The control device according to claim 3, wherein the operation control unit controls a ratio of an operation based on the innate personality and an operation based on the acquired personality during charging of the autonomous moving body.

5. The control device according to claim 4, wherein the operation control unit increases the ratio of the operation based on the acquired personality as time passes from the start of operation of the autonomous moving body.

6. The control device according to claim 1, wherein the operation control unit changes at least one of a motion and a posture performed by the autonomous moving body during charging, based on the personality and the remaining charge of the autonomous moving body.

7. The control device according to claim 6, wherein the operation control unit changes at least one of a speed and a magnitude of the motion performed by the autonomous moving body during charging, based on the personality and the remaining charge of the autonomous moving body.

8. The control device according to claim 6, wherein the operation control unit changes a probability that the autonomous moving body selects the motion to be performed during charging from among a plurality of the motions, based on the personality and the remaining charge of the autonomous moving body.

9. The control device according to claim 6, wherein the operation control unit changes a posture between the motions based on the personality and the remaining charge of the autonomous moving body.

10. The control device according to claim 6, wherein the operation control unit changes a facial expression of the autonomous moving body in the motion based on the personality and the remaining charge of the autonomous moving body.

11. The control device according to claim 6, wherein the operation control unit causes the autonomous moving body to perform the motion specific to the personality of the autonomous moving body during charging.

12. The control device according to claim 1, wherein the operation control unit controls the operation of the autonomous moving body during charging further based on whether at least one of an owner and another autonomous moving body is present in the vicinity.

13. The control device according to claim 1, wherein the operation control unit further controls an operation of the autonomous moving body when the autonomous moving body moves to a charging device for charging.

14. The control device according to claim 13, wherein, when it is difficult for the autonomous moving body to move to the charging device due to insufficient remaining charge, the operation control unit notifies another autonomous moving body of the insufficient remaining charge and controls the operation of the autonomous moving body so that it waits.

15. The control device according to claim 13, wherein, when another autonomous moving body is moving toward the same charging device, the operation control unit controls the operation of the autonomous moving body based on a result of comparing the remaining charges of the autonomous moving body and the other autonomous moving body.

16. The control device according to claim 1, wherein, when there is another autonomous moving body that has difficulty moving to a charging device due to insufficient remaining charge, the operation control unit controls the operation of the autonomous moving body so that it supports the other autonomous moving body.

17. The control device according to claim 16, wherein the operation control unit controls the operation of the autonomous moving body so that it shares power with the other autonomous moving body.

18. The control device according to claim 16, wherein the operation control unit controls the operation of the autonomous moving body so that it notifies the surroundings of the presence of the other autonomous moving body.

19. The control device according to claim 1, wherein the autonomous moving body is a pet-type robot.

20. A control method in which a control device controls an operation of an autonomous moving body during charging based on a personality of the autonomous moving body and a remaining charge of the autonomous moving body.
PCT/JP2024/026975 2023-08-16 2024-07-29 Control device and control method Pending WO2025037529A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-132503 2023-08-16
JP2023132503 2023-08-16

Publications (1)

Publication Number Publication Date
WO2025037529A1 2025-02-20

Family

ID=94632326

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/026975 Pending WO2025037529A1 (en) 2023-08-16 2024-07-29 Control device and control method

Country Status (1)

Country Link
WO (1) WO2025037529A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002059389A (en) * 2000-08-16 2002-02-26 Sony Corp Autonomous walking robot device
JP2010218530A (en) * 2009-02-19 2010-09-30 Yaskawa Electric Corp Motion control system
JP2021181141A (en) * 2020-05-20 2021-11-25 セイコーエプソン株式会社 Charging method and charging system
JP2022142112A (en) * 2021-03-16 2022-09-30 カシオ計算機株式会社 Equipment control device, equipment control method and program

Similar Documents

Publication Publication Date Title
JP7400923B2 (en) Information processing device and information processing method
JP7238796B2 (en) ANIMAL-TYPE AUTONOMOUS MOBILE BODY, OPERATION METHOD OF ANIMAL-TYPE AUTONOMOUS MOBILE BODY, AND PROGRAM
EP3456487A2 (en) Robot, method of controlling the same, and program
US12204338B2 (en) Information processing apparatus, information processing method, and program
JP7375770B2 (en) Information processing device, information processing method, and program
JP7363809B2 (en) Information processing device, information processing method, and program
US11938625B2 (en) Information processing apparatus, information processing method, and program
US20250282050A1 (en) Autonomous mobile body, information processing method, and information processing device
US20250026020A1 (en) Information processing device and information processing method
CN113305884A (en) Action autonomous robot with emergency shutdown function
US11986959B2 (en) Information processing device, action decision method and program
WO2020166373A1 (en) Information processing device and information processing method
US20240066420A1 (en) Autonomous moving object and information processing method
JP2004130427A (en) Robot apparatus and operation control method for robot apparatus
US20240367065A1 (en) Autonomous mobile body, information processing method, and program
WO2025037529A1 (en) Control device and control method
JP2007125629A (en) Robot apparatus and behavior control method thereof
US20240367066A1 (en) Autonomous mobile body, information processing method, and program
US12487594B2 (en) Information processing apparatus, and information processing method for interaction with an autonomous mobile body
US20240019868A1 (en) Autonomous mobile body, information processing apparatus, information processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24854120

Country of ref document: EP

Kind code of ref document: A1