
WO2024166661A1 - Moving body - Google Patents

Moving body

Info

Publication number
WO2024166661A1
WO2024166661A1 (PCT/JP2024/001640)
Authority
WO
WIPO (PCT)
Prior art keywords
tail
bending
straightening
moving body
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2024/001640
Other languages
French (fr)
Japanese (ja)
Inventor
文哉 中野
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Group Corp
Original Assignee
Sony Group Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corp filed Critical Sony Group Corp
Priority to JP2024576213A (JPWO2024166661A1)
Publication of WO2024166661A1


Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63H: TOYS, e.g. TOPS, DOLLS, HOOPS OR BUILDING BLOCKS
    • A63H 11/00: Self-movable toy figures
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B25: HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J: MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J 5/00: Manipulators mounted on wheels or on carriages

Definitions

  • This technology relates to moving bodies, and in particular to moving bodies with parts such as tails that can bend and stretch.
  • In a tail formed of a core member and multiple support members, the core member and the support members may give the user a rough impression that differs from the tail of a real pet. As a result, the expressiveness of the pet robot may be reduced.
  • This technology was developed in light of these circumstances, and aims to improve the expressiveness of moving bodies that have parts capable of bending and stretching, such as the tail of a pet-type robot.
  • According to one aspect of this technology, a moving body includes: a first portion; a second portion that is connected to the first portion and has a bendable elastic body; a bending and straightening mechanism that bends and straightens the second portion by bending and straightening the elastic body using a wire inserted into the second portion in the direction in which the second portion extends from the first portion; and a rotation mechanism that rotates the second portion around a rotation axis parallel to the direction in which the second portion is connected to the first portion.
  • According to one aspect of this technology, the elastic body is bent and stretched using a wire inserted into the second portion in the direction in which the second portion extends from the first portion, thereby bending and stretching the second portion, and the second portion is rotated around a rotation axis parallel to the direction in which the second portion is connected to the first portion.
  • FIG. 1 is a left side view showing an example of the external configuration of an autonomous moving body to which the present technology is applied.
  • FIG. 2 is a top view showing an example of the external configuration of an autonomous moving body to which the present technology is applied.
  • FIG. 3 is a perspective view showing an example of the external configuration of an autonomous moving body to which the present technology is applied.
  • FIG. 4 is a diagram showing an example of the configuration of the display and sensors provided in an autonomous moving body to which the present technology is applied.
  • FIG. 5 is a diagram showing an example of the configuration of an actuator provided on an autonomous moving body to which the present technology is applied.
  • FIG. 6 is a cross-sectional view showing a schematic configuration example of a flexible active mechanism of an autonomous moving body.
  • FIG. 7 is a perspective view showing a schematic configuration example of a flexible active mechanism of an autonomous moving body.
  • FIGS. 8 to 11 are diagrams illustrating examples of a method for expressing emotions of an autonomous moving body by the tail.
  • FIG. 12 is a block diagram showing an example of a functional configuration of an autonomous moving body.
  • FIG. 13 is a block diagram showing a configuration example of a tail drive unit.
  • FIG. 14 is a diagram for explaining an example of a method for estimating an external force.
  • FIG. 15 is a diagram for explaining an example of a method for detecting an obstacle.
  • FIG. 16 is a diagram illustrating an example of a method for grasping an object by the tail.
  • FIG. 17 is a flowchart for explaining a favorable interaction operation of an autonomous moving body.
  • FIG. 18 is a diagram for explaining a favorable interaction operation of an autonomous moving body.
  • FIG. 19 is a flowchart for explaining an adversarial interaction operation of an autonomous moving body.
  • FIG. 20 is a diagram for explaining an adversarial interaction operation of an autonomous moving body.
  • FIG. 21 is a flowchart for explaining a backward movement operation of the autonomous moving body.
  • FIG. 22 is a flowchart for explaining an object grasping operation of an autonomous moving body.
  • FIG. 23 is a diagram for explaining an object grasping operation of the autonomous moving body.
  • FIG. 24 is a cross-sectional view showing a schematic diagram of a modified example of the flexible active mechanism of the autonomous moving body.
  • FIG. 25 is a perspective view showing a schematic diagram of a modified example of the flexible active mechanism of the autonomous moving body.
  • FIG. 26 is a block diagram illustrating an example of the configuration of a computer.
  • FIG. 1 is a left side view of the autonomous mobile body 11.
  • FIG. 2 is a top view of the autonomous mobile body 11.
  • FIG. 3 is a perspective view of the autonomous mobile body 11.
  • The autonomous mobile body 11 is a dog-type quadruped robot that has a head 21, a torso 22, four legs 23FL to 23HR, and a tail 24.
  • Hereinafter, when there is no need to distinguish between the legs 23FL to 23HR, they will simply be referred to as the legs 23.
  • FIG. 4 shows an example of the configuration of the display and sensors equipped on the autonomous moving body 11.
  • For example, the autonomous moving body 11 is equipped with two displays, a display 51L and a display 51R, on the head 21.
  • Hereinafter, when there is no need to distinguish between the display 51L and the display 51R, they will simply be referred to as the display 51.
  • Each display 51 has the function of visually expressing the eye movements and emotions of the autonomous mobile body 11.
  • For example, each display 51 can express the movements of the eyeballs, pupils, and eyelids according to emotions and actions, producing natural movements close to those of real animals such as dogs, and can express the line of sight and emotions of the autonomous mobile body 11 with high precision and flexibility.
  • The user can intuitively grasp the state of the autonomous mobile body 11 from the eye movements displayed on the display 51.
  • The autonomous moving body 11 also includes various sensors.
  • For example, the autonomous moving body 11 includes a microphone 52, a camera 53, a ToF (Time Of Flight) sensor 54, a human presence sensor 55, a distance measurement sensor 56, a touch sensor 57, an illuminance sensor 58, a sole button 59, and an inertial sensor 60.
  • The autonomous mobile body 11 has, for example, four microphones 52 on the head 21.
  • Each microphone 52 collects surrounding sounds including, for example, the user's speech and surrounding environmental sounds. Furthermore, by having multiple microphones 52, it becomes possible to collect surrounding sounds with high sensitivity and to localize the sound source.
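The patent text does not specify how sound-source localization is performed; a common approach with a multi-microphone array is to estimate the time difference of arrival (TDOA) between microphone pairs via cross-correlation. A minimal sketch under that assumption, for a two-microphone pair of known spacing (all names and parameter values are illustrative, not from the patent):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def estimate_delay_samples(sig_a, sig_b):
    """Estimate the delay (in samples) of sig_b relative to sig_a
    via the peak of their full cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    return int(np.argmax(corr)) - (len(sig_a) - 1)

def direction_from_delay(delay_samples, sample_rate, mic_spacing):
    """Convert an inter-microphone delay into an angle of arrival
    in radians (0 = broadside) for a two-microphone pair."""
    tdoa = delay_samples / sample_rate
    # Clamp to the physically possible range before arcsin.
    s = np.clip(tdoa * SPEED_OF_SOUND / mic_spacing, -1.0, 1.0)
    return float(np.arcsin(s))

# Synthetic check: the same click arrives 5 samples later at mic B.
rate = 16000
click = np.zeros(256)
click[100] = 1.0
delayed = np.zeros(256)
delayed[105] = 1.0
d = estimate_delay_samples(click, delayed)
angle = direction_from_delay(d, rate, mic_spacing=0.15)
```

In practice a robot would use all four microphones and more robust correlation (e.g. GCC-PHAT), but the pairwise TDOA idea is the same.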
  • The autonomous mobile body 11 is equipped with two wide-angle cameras 53, for example, at the nose and waist, which capture images of the autonomous mobile body 11's surroundings.
  • The camera 53 located at the nose captures images within the autonomous mobile body 11's forward field of view (i.e., the dog's field of view).
  • The camera 53 located at the waist captures images of the surroundings centered around the upper part of the autonomous mobile body 11.
  • The autonomous mobile body 11 can extract feature points of the ceiling, for example, based on images captured by the camera 53 located at the waist, and achieve SLAM (Simultaneous Localization and Mapping).
  • The ToF sensor 54 is provided, for example, at the tip of the nose, and detects the distance to an object present in front of the head 21.
  • The ToF sensor 54 enables the autonomous mobile body 11 to detect the distance to various objects with high accuracy, and can realize operations according to the relative position to targets including the user, obstacles, etc.
  • The human presence sensor 55 is placed, for example, on the chest and detects the location of the user or a pet kept by the user. By detecting an animal object in front of the autonomous mobile body 11 using the human presence sensor 55, the autonomous mobile body 11 can perform various actions toward the animal object, such as actions corresponding to emotions such as interest, fear, or surprise.
  • The distance measurement sensor 56 is placed, for example, on the chest and detects the condition of the floor surface in front of the autonomous mobile body 11.
  • The distance measurement sensor 56 allows the autonomous mobile body 11 to accurately detect the distance to an object present on the floor surface in front of it, and to realize operations according to the relative position of the object.
  • The touch sensor 57 is arranged in areas where the user is likely to touch the autonomous moving body 11, such as the top of the head, under the chin, and on the back, and detects contact (touch) by the user.
  • The touch sensor 57 is composed of, for example, a capacitive or pressure-sensitive touch sensor.
  • The autonomous moving body 11 can detect contact actions by the user, such as touching, stroking, tapping, and pushing, using the touch sensor 57, and can perform an action according to the contact action.
  • The touch sensor 57 is arranged in a line or a plane on each part, making it possible to detect the position touched within each part.
  • The illuminance sensor 58 is disposed, for example, at the base of the tail 24 on the back of the head 21, and detects the illuminance of the space in which the autonomous mobile body 11 is located.
  • The autonomous mobile body 11 can detect the surrounding brightness using the illuminance sensor 58 and perform an operation according to the brightness.
  • The sole buttons 59 are, for example, arranged on the areas of the four legs 23 that correspond to the paw pads, and detect whether the bottom surfaces of the legs 23 of the autonomous mobile body 11 are in contact with the floor.
  • The sole buttons 59 enable the autonomous mobile body 11 to detect contact or non-contact with the floor surface, and can, for example, know when it has been picked up by a user.
  • The inertial sensors 60 are disposed, for example, in the head 21 and the torso 22, respectively, and detect physical quantities such as the speed, acceleration, and rotation of the head 21 and torso 22.
  • Each inertial sensor 60 is configured as a six-axis sensor that detects acceleration and angular velocity along the X-, Y-, and Z-axes.
  • The autonomous mobile body 11 can detect the movement of the head 21 and torso 22 with high accuracy using the inertial sensors 60, and realize operation control according to the situation.
  • The configuration of the sensors equipped in the autonomous mobile body 11 can be flexibly changed depending on the specifications, operation, etc.
  • The autonomous mobile body 11 may further include various other sensors and devices, such as a temperature sensor, a geomagnetic sensor, and a GNSS (Global Navigation Satellite System) signal receiver.
  • FIG. 5 shows an example of the configuration of an actuator 71 of the autonomous mobile body 11.
  • The autonomous mobile body 11 has a total of 22 degrees of rotational freedom, including two each in the ears and the tail 24 and one in the mouth.
  • The autonomous mobile body 11 has three degrees of freedom in the head 21, which allows it to perform both nodding and head-tilting motions.
  • The autonomous mobile body 11 can reproduce the swinging motion of its hips using the actuator 71 in its hips, allowing it to achieve natural and flexible movements that are closer to those of a real dog.
  • The autonomous mobile body 11 may achieve the above 22 degrees of rotational freedom by combining, for example, single-axis actuators and two-axis actuators.
  • For example, single-axis actuators may be used in the elbows and knees of the legs 23, and two-axis actuators may be used in the shoulders and the bases of the thighs.
  • Fig. 6 is a cross-sectional view that shows a flexible active mechanism.
  • Fig. 7 is a perspective view that shows a flexible active mechanism.
  • In FIG. 6, the diagonal line (hatching) pattern is omitted in part of the cross section to make the drawing easier to understand.
  • The flexible active mechanism of the autonomous mobile body 11 includes the tail section 24 and a drive mechanism 101.
  • The tail section 24 includes an elastic body 111, a cap 112, and a connection member 113.
  • The drive mechanism 101 includes the connection member 113, a wire 121, a bearing 122, a pitch axis actuator 123a, a roll axis actuator 123b, a winding section 124, and a gear 125.
  • The connection member 113 is a component of both the tail section 24 and the drive mechanism 101.
  • The rotating shaft 113B of the connection member 113, the bearing 122, the pitch axis actuator 123a, the roll axis actuator 123b, the winding section 124, and the gear 125 are disposed within the body section 22.
  • The elastic body 111 has a long, thin shape that resembles a dog's tail.
  • The elastic body 111 is a single flexible elastic body that can be bent (curved), and is made of a resin such as silicone or an elastomer that has a texture similar to that of a dog's tail.
  • A spherical crown-shaped cap 112 is placed over the tip of the elastic body 111.
  • The base of the elastic body 111 is attached to a spherical crown-shaped attachment portion 113A of the connection member 113.
  • A cylindrical rotating shaft 113B protrudes from the tip of the attachment portion 113A of the connection member 113.
  • A gear is formed at the tip of the rotating shaft 113B.
  • The rotating shaft 113B is inserted into the body section 22 from the rear end of the body section 22, and is inserted into the bearing 122 inside the body section 22.
  • This bearing 122 supports the tail section 24 so that it can rotate around the rotating shaft 113B (hereinafter referred to as the roll axis).
  • The roll axis is a rotation axis that is parallel to the connection direction of the tail section 24 to the body section 22 (the direction in which the connection member 113 is inserted into the body section 22).
  • A single wire 121 is inserted into the tail 24 in the direction in which the tail 24 extends from the body 22.
  • The wire 121 is made of, for example, a metal such as a shape memory alloy, or a resin.
  • The tip of the wire 121 is connected to and fixed in the cap 112.
  • The wire 121 extends in the longitudinal direction of the tail 24 (elastic body 111), passes through the connection member 113, and is connected to the winding section 124 inside the body 22.
  • The winding section 124 is connected to the tip of the pitch axis actuator 123a.
  • The pitch axis actuator 123a rotates the winding section 124 to wind and unwind the wire 121.
  • As the wire 121 is wound and unwound, the wire 121 is pulled in and pushed out, and the elastic body 111 (tail section 24) is bent in the direction of arrow A1 (the pitch axis direction). That is, when the wire 121 is wound and pulled, the elastic body 111 (tail section 24) curves upward.
  • When the wire 121 is unwound and pushed out, the elastic body 111 (tail section 24) stretches. That is, when the tail section 24 is not rotated around the roll axis, the tail section 24 can be bent and stretched in the vertical direction.
  • The heightwise position of the tail 24 is controlled by controlling the amount of the wire 121 that is wound or unwound.
  • The tail 24 swings up and down by repeatedly winding and unwinding the wire 121.
  • The speed at which the tail 24 swings up and down is controlled by controlling the speed at which the wire 121 is wound and unwound.
  • The width at which the tail 24 swings up and down is controlled by controlling the amount by which the wire 121 is wound and unwound.
  • The wire 121, the pitch axis actuator 123a, and the winding section 124 constitute a bending and straightening mechanism that bends and straightens the tail section 24.
  • The bending and straightening mechanism also includes a winding mechanism that winds and unwinds the wire 121.
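The text does not give the relation between the wound wire length and the resulting bend; a constant-curvature model, commonly used for wire-driven continuum mechanisms, illustrates the idea. The wire's offset from the neutral axis and the tail length below are illustrative assumptions, not values from the patent:

```python
import math

def bend_angle_from_winding(wound_length_m, wire_offset_m):
    """Constant-curvature model of a wire-driven bender: winding the
    wire by delta_l shortens the wire-side of the elastic body,
    giving a bend angle theta = delta_l / r, where r is the wire's
    offset from the neutral axis (an assumed geometry here)."""
    return wound_length_m / wire_offset_m

def tip_position(bend_angle_rad, tail_length_m):
    """Tip position (x backward along the tail, z upward) of a
    constant-curvature arc of length L bent by theta."""
    if abs(bend_angle_rad) < 1e-9:
        return (tail_length_m, 0.0)  # straight tail
    radius = tail_length_m / bend_angle_rad
    return (radius * math.sin(bend_angle_rad),
            radius * (1.0 - math.cos(bend_angle_rad)))

# Winding 15.7 mm of wire with a 10 mm offset bends a 150 mm tail
# by about 1.57 rad (roughly 90 degrees): the tip curls upward.
theta = bend_angle_from_winding(0.0157, 0.010)
x, z = tip_position(theta, 0.150)
```

Under this model the wound length maps linearly to the bend angle, which matches the text's statement that the tail's heightwise position is controlled by the winding amount.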
  • The gear 125 is connected to the tip of the roll axis actuator 123b.
  • The gear 125 is engaged with the gear at the tip of the rotating shaft 113B of the connection member 113.
  • When the roll axis actuator 123b rotates the gear 125, the connection member 113 supported by the bearing 122 rotates around the roll axis. This causes the tail section 24 to rotate around the roll axis as shown by the arrow A2.
  • The tail section 24 can swing left and right by rotating about the roll axis.
  • The speed at which the tail section 24 swings left and right is controlled by controlling the rotation speed of the tail section 24 about the roll axis.
  • The width at which the tail section 24 swings left and right is controlled by controlling the amount of rotation of the tail section 24 about the roll axis.
  • The rotating shaft 113B, the bearing 122, the roll axis actuator 123b, and the gear 125 form a rotation mechanism that rotates the tail section 24 around the roll axis.
  • Hereinafter, when there is no need to distinguish between the pitch axis actuator 123a and the roll axis actuator 123b, they will simply be referred to as the actuator 123.
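The left-right wag described above (width set by the rotation amount, speed by the rotation speed) can be sketched as a sinusoidal roll-axis target passed through the gear pair between the motor gear 125 and the gear on the rotating shaft 113B. The gear ratio and all numeric values are illustrative assumptions:

```python
import math

def roll_command(t, amplitude_rad, frequency_hz):
    """Sinusoidal roll-axis target: the 'width' of the left-right wag
    is set by the amplitude, its 'speed' by the frequency."""
    return amplitude_rad * math.sin(2.0 * math.pi * frequency_hz * t)

def motor_angle_for_roll(roll_rad, gear_ratio):
    """Angle the roll axis actuator must turn to reach a given tail
    roll angle. gear_ratio is assumed to be
    (teeth on the shaft gear) / (teeth on the motor gear)."""
    return roll_rad * gear_ratio

# One second of a gentle wag: about +/-0.5 rad at 2 Hz, sampled at
# 100 Hz, with an assumed 3:1 reduction between motor and shaft.
samples = [roll_command(t / 100.0, 0.5, 2.0) for t in range(100)]
peak = max(samples)
motor = motor_angle_for_roll(peak, gear_ratio=3.0)
```

Raising the amplitude widens the wag and raising the frequency speeds it up, mirroring the two control knobs the text describes.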
  • The above configuration of the autonomous mobile body 11 allows precise and flexible control of the movements of the joints, eyeballs, and tail 24, making it possible to reproduce movements and emotional expressions that are closer to those of real living creatures.
  • Dogs express various emotions through tail movements, although there are some differences depending on the breed, size, etc.
  • For example, a dog's emotions are expressed by the vertical position of the tail and by the speed and strength with which the tail wags up and down or from side to side.
  • For example, when a dog is alert or confident, the tail is raised higher than normal; when it is anxious or submissive, the tail is held lower than normal.
  • The speed and strength of tail wagging also change depending on the strength of the emotion.
  • As described above, the autonomous mobile body 11 can bend, straighten, and rotate the tail 24.
  • Therefore, the autonomous mobile body 11 can express emotions in the same way as a real dog.
  • For example, the autonomous mobile body 11 expresses a relaxed feeling by letting the tail 24 hang down as shown in A of FIG. 8, and then swinging the tail 24 from side to side as shown in B and C of FIG. 8.
  • For example, the autonomous mobile body 11 expresses excitement by swinging the tail 24 up and down as shown in A and B of FIG. 9, in other words, by repeatedly hanging the tail 24 down and rolling it up.
  • For example, the autonomous mobile body 11 expresses high tension by curving the tail 24 upward as shown in A of FIG. 10, and then swinging the tail 24 from side to side as shown in B and C of FIG. 10.
  • For example, the autonomous mobile body 11 expresses a frightened emotion by rotating the tail 24 180 degrees as shown in A and B of FIG. 11, and then curling the tail 24 downward as shown in C of FIG. 11.
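The emotion expressions above amount to a mapping from an emotion label to a small set of tail motion parameters. A hypothetical lookup table along those lines (the pose names and numeric values are illustrative, not taken from the patent):

```python
# Hypothetical parameter table relating the emotions described above
# to tail motion primitives: a pitch-axis pose plus a roll-axis wag
# amplitude and frequency.
TAIL_EXPRESSIONS = {
    "relaxed":    {"pitch_pose": "hang_down", "roll_amp_rad": 0.4, "freq_hz": 1.0},
    "excited":    {"pitch_pose": "up_down",   "roll_amp_rad": 0.0, "freq_hz": 3.0},
    "tense":      {"pitch_pose": "curve_up",  "roll_amp_rad": 0.6, "freq_hz": 2.0},
    "frightened": {"pitch_pose": "curl_down", "roll_amp_rad": 0.0, "freq_hz": 0.0},
}

def tail_command(emotion):
    """Look up the motion primitive for an emotion, defaulting to a
    neutral hanging tail for unknown emotions."""
    return TAIL_EXPRESSIONS.get(
        emotion,
        {"pitch_pose": "hang_down", "roll_amp_rad": 0.0, "freq_hz": 0.0})
```

Keeping the mapping in data rather than code makes it easy to tune expressions per breed or personality without touching the drive logic.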
  • The autonomous mobile body 11 includes an input unit 201, a communication unit 202, an information processing unit 203, a drive unit 204, an output unit 205, and a storage unit 206.
  • The input unit 201 includes various sensors as shown in FIG. 4 and has a function of collecting various sensor data related to the user and the surrounding conditions.
  • The input unit 201 also includes input devices such as switches and buttons.
  • The input unit 201 supplies the collected sensor data and the input data input via the input devices to the information processing unit 203.
  • The communication unit 202 communicates with other autonomous mobile bodies 11, information processing terminals such as smartphones (not shown), and information processing servers (not shown), and transmits and receives various types of data.
  • The communication unit 202 supplies the received data to the information processing unit 203, and obtains data to be transmitted from the information processing unit 203.
  • The communication method of the communication unit 202 is not particularly limited and can be flexibly changed according to the specifications and operation.
  • The information processing unit 203 includes, for example, a processor such as a CPU (Central Processing Unit), and performs various types of information processing and controls each part of the autonomous mobile body 11.
  • The information processing unit 203 includes a recognition unit 221, a learning unit 222, an action planning unit 223, and an operation control unit 224.
  • The recognition unit 221 recognizes the situation in which the autonomous mobile body 11 is placed, based on the sensor data and input data supplied from the input unit 201, the received data supplied from the communication unit 202, and data (hereinafter referred to as external force estimation data) supplied from the drive unit 204 indicating the estimated results of the external force applied to the tail section 24 from the outside.
  • The situation in which the autonomous mobile body 11 is placed includes, for example, its own situation and the surrounding situation.
  • Its own situation includes, for example, the state and movement of the autonomous mobile body 11.
  • The surrounding situation includes, for example, the state, movement, and instructions of nearby people such as the user, the state and movement of surrounding living things such as pets, the state and movement of surrounding objects, the time, the place, and the surrounding environment.
  • The surrounding objects include, for example, other autonomous mobile bodies.
  • The recognition unit 221 performs, for example, person identification, facial expression and gaze recognition, emotion recognition, object recognition, action recognition, spatial area recognition, color recognition, shape recognition, marker recognition, obstacle recognition, step recognition, brightness recognition, temperature recognition, voice recognition, word understanding, position estimation, posture estimation, etc.
  • The recognition unit 221 has a function of estimating and understanding the situation based on the various recognized information. At this time, the recognition unit 221 may make a comprehensive estimation of the situation using knowledge stored in advance.
  • The recognition unit 221 supplies data indicating the result of the recognition or estimation of the situation (hereinafter referred to as situation data) to the learning unit 222 and the action planning unit 223. In addition, the recognition unit 221 registers the situation data in the behavior history data stored in the storage unit 206.
  • The behavior history data is data that indicates the history of the behavior of the autonomous mobile body 11.
  • The behavior history data includes, for example, items such as the date and time when the behavior started, the date and time when the behavior ended, the trigger for performing the behavior, the location where the behavior was instructed (if a location was instructed), the situation when the behavior was performed, and whether the behavior was completed (whether the behavior was performed to the end).
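The items listed above suggest a simple record structure for each behavior history entry. A sketch, with illustrative field names (the patent does not specify a concrete schema):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class BehaviorRecord:
    """One entry of the behavior history data, covering the items
    listed above. Field names are illustrative."""
    started_at: datetime
    ended_at: Optional[datetime]          # None while still running
    trigger: str                          # what caused the behavior
    instructed_location: Optional[str]    # only if a location was given
    situation: dict = field(default_factory=dict)
    completed: bool = False               # performed through to the end?

history: list[BehaviorRecord] = []
history.append(BehaviorRecord(
    started_at=datetime(2024, 1, 17, 9, 30),
    ended_at=datetime(2024, 1, 17, 9, 31),
    trigger="user_voice_command",
    instructed_location=None,
    situation={"user_present": True},
    completed=True,
))
```

A list of such records is enough for the learning unit to query, e.g., how often a behavior triggered by a voice command ran to completion.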
  • The learning unit 222 learns the situation, behavior, and the effect of the behavior on the environment based on the sensor data and input data supplied from the input unit 201, the received data supplied from the communication unit 202, the situation data supplied from the recognition unit 221, the data related to the behavior of the autonomous mobile body 11 supplied from the action planning unit 223, and the behavior history data stored in the storage unit 206. For example, the learning unit 222 performs pattern recognition learning and learns behavior patterns corresponding to the user's discipline.
  • The learning unit 222 realizes the above learning by using a machine learning algorithm such as deep learning.
  • The learning algorithm adopted by the learning unit 222 is not limited to the above example, and can be designed as appropriate.
  • The learning unit 222 supplies data indicating the learning results (hereinafter referred to as learning result data) to the action planning unit 223 and stores the data in the storage unit 206.
  • The action planning unit 223 plans an action (e.g., a behavior) to be performed by the autonomous mobile body 11 based on the recognized or estimated situation, the learning result data, and feedback data indicating the state of the autonomous mobile body 11 supplied from the operation control unit 224.
  • The action planning unit 223 supplies data indicating the planned action (hereinafter referred to as action plan data) to the operation control unit 224.
  • The action planning unit 223 also supplies data regarding the action of the autonomous mobile body 11 to the learning unit 222, and registers the data in the behavior history data stored in the storage unit 206.
  • The operation control unit 224 controls the operation of the autonomous mobile body 11 to execute the planned action by controlling the drive unit 204 and the output unit 205 based on the action plan data. For example, the operation control unit 224 controls the rotation of the actuators 71 and 123, the display of the display 51, and the audio output from the speaker based on the action plan.
  • The operation control unit 224 supplies feedback data indicating the state of the autonomous mobile body 11 after the control of its operation to the action planning unit 223.
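The data flow among the recognition unit 221, action planning unit 223, and operation control unit 224 described above can be sketched as a simple sense-plan-act loop with feedback. The behaviors and conditions here are placeholders, not the patent's actual logic:

```python
# Minimal sketch of the recognition -> planning -> control loop;
# each function stands in for the corresponding unit in the text.
def recognize(sensor_data):
    """Recognition unit 221: turn raw sensor data into situation data."""
    return {"user_nearby": sensor_data.get("human_sensor", False)}

def plan_action(situation, feedback):
    """Action planning unit 223: choose a behavior from the situation
    and the feedback from the previous control cycle."""
    if situation["user_nearby"] and not feedback.get("busy", False):
        return "wag_tail"
    return "idle"

def control(action):
    """Operation control unit 224: drive the actuators for the action
    and return feedback describing the resulting state."""
    return {"busy": action != "idle", "last_action": action}

feedback = {}
# Two cycles: no user present, then a user appears.
for sensors in ({"human_sensor": False}, {"human_sensor": True}):
    situation = recognize(sensors)
    action = plan_action(situation, feedback)
    feedback = control(action)
```

The feedback dictionary returned by control() is what the text calls feedback data: it closes the loop back into the next planning cycle.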
  • The drive unit 204 includes a joint drive unit 231 and a tail drive unit 232.
  • The joint drive unit 231 drives the actuators 71 provided in each joint based on the control of the operation control unit 224, thereby bending and extending the multiple joints of the autonomous mobile body 11.
  • The tail drive unit 232 drives the tail 24 of the autonomous mobile body 11 based on the control of the operation control unit 224.
  • The tail drive unit 232 includes a bending and straightening drive unit 232a and a rotation drive unit 232b.
  • The bending and straightening drive unit 232a bends and straightens the tail section 24 by driving the bending and straightening mechanism including the pitch axis actuator 123a based on the control of the operation control unit 224.
  • The bending and straightening drive unit 232a estimates the external force applied to the tail section 24 in the bending and straightening direction, and supplies external force estimation data indicating the estimated external force to the information processing unit 203.
  • The rotation drive unit 232b drives the rotation mechanism including the roll axis actuator 123b under the control of the operation control unit 224 to rotate the tail section 24 around the roll axis.
  • The rotation drive unit 232b estimates the external force in the rotational direction applied to the tail section 24, and supplies external force estimation data indicating the estimated external force to the information processing unit 203.
  • Hereinafter, when there is no need to distinguish between the bending and straightening drive unit 232a and the rotation drive unit 232b, they will be collectively referred to as the tail drive unit 232.
  • The output unit 205 includes, for example, the display 51, a speaker, a haptic device, etc., and outputs visual information, auditory information, tactile information, etc. based on the control of the operation control unit 224.
  • The storage unit 206 includes, for example, non-volatile memory and volatile memory, and stores various programs and data.
  • FIG. 13 shows an example of the configuration of the tail drive unit 232 (the bending and straightening drive unit 232a and the rotation drive unit 232b).
  • The tail drive unit 232 includes a force controller 261, a position controller 262, an instruction value combiner 263, a driver 264, and a force estimator 265.
  • The force controller 261 obtains data indicating a target value of the force to be applied to the tail section 24 by the actuator 123 (hereinafter referred to as the driving force) from the operation control unit 224.
  • The force controller 261 obtains data indicating an estimated value of the external force applied to the tail section 24 from the force estimator 265.
  • The force controller 261 calculates an instruction value of the driving force to be applied to the tail section 24 based on the target value of the driving force and the estimated value of the external force.
  • The force controller 261 supplies data indicating the instruction value of the driving force to be applied to the tail section 24 to the instruction value combiner 263.
  • The force controller 261 of the bending and straightening drive unit 232a calculates an instruction value of the driving force in the bending and straightening direction for the tail section 24 based on a target value of the driving force in the bending and straightening direction for the tail section 24 and an estimated value of the external force in the bending and straightening direction.
  • The force controller 261 of the bending and straightening drive unit 232a supplies data indicating the instruction value of the driving force in the bending and straightening direction for the tail section 24 to the instruction value combiner 263.
  • The force controller 261 of the rotation drive unit 232b calculates an instruction value of the driving force in the rotational direction for the tail section 24 based on a target value of the driving force in the rotational direction for the tail section 24 and an estimated value of the external force in the rotational direction.
  • The force controller 261 of the rotation drive unit 232b supplies data indicating the instruction value of the driving force in the rotational direction for the tail section 24 to the instruction value combiner 263.
  • the position controller 262 obtains data indicating a target value of the position (or speed) at which the actuator 123 moves the tail section 24 from the motion control section 224.
  • the position controller 262 obtains data indicating a detected value of the position (or speed) of the tail section 24 from the position sensor 251.
  • the position controller 262 calculates an instruction value for the position (or speed) of the tail section 24 based on the target value and the detected value of the position (or speed) of the tail section 24.
  • the position controller 262 supplies data indicating the instruction value for the position (or speed) of the tail section 24 to the instruction value combiner 263.
  • the position controller 262 of the bending and straightening drive unit 232a calculates an instruction value for the position (or speed) in the bending and straightening direction of the tail section 24 based on a target value and a detected value of the position (or speed) in the bending and straightening direction of the tail section 24.
  • the position controller 262 of the bending and straightening drive unit 232a supplies data indicating the instruction value for the position (or speed) in the bending and straightening direction of the tail section 24 to the instruction value combiner 263.
  • the position controller 262 of the rotation drive unit 232b calculates an instruction value for the position (or speed) in the rotation direction of the tail section 24 based on a target value and a detected value of the position (or speed) in the rotation direction of the tail section 24.
  • the position controller 262 of the rotation drive unit 232b supplies data indicating the instruction value for the position (or speed) in the rotation direction of the tail section 24 to the instruction value combiner 263.
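Likewise, the position controller's calculation can be sketched as a simple proportional law on the error between the target and detected position (or speed). The patent does not specify the control law; the P gain here is an assumed value for illustration.

```python
# Hypothetical sketch of the position controller 262: a proportional law on
# the position (or speed) error. The gain kp is an assumed value.
def position_command(target_pos: float, detected_pos: float,
                     kp: float = 2.0) -> float:
    return kp * (target_pos - detected_pos)

# Tail detected at 0.75 rad with a 1.0 rad target yields a corrective command.
print(position_command(1.0, 0.75))  # → 0.5
```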
  • the instruction value combiner 263 calculates target values for the amount of rotation and the rotation speed of the actuator 123 based on an instruction value for the driving force to be applied to the tail section 24 and an instruction value for the position (or speed) of the tail section 24.
  • the instruction value combiner 263 calculates a current value for achieving the target values for the amount of rotation and the rotation speed of the actuator 123.
  • the instruction value combiner 263 supplies data indicating a current instruction value indicating the calculated current value to the driver 264 and the force estimator 265.
  • the instruction value combiner 263 of the bending and straightening drive unit 232a calculates target values for the rotation amount and rotation speed of the pitch axis actuator 123a based on an instruction value for the driving force in the bending and straightening direction of the tail section 24 and an instruction value for the position (or speed) in the bending and straightening direction of the tail section 24.
  • the instruction value combiner 263 of the bending and straightening drive unit 232a calculates a current value for achieving the target values for the rotation amount and rotation speed of the pitch axis actuator 123a.
  • the instruction value combiner 263 of the bending and straightening drive unit 232a supplies data indicating a current instruction value indicating the calculated current value to the driver 264 and the force estimator 265.
  • the instruction value combiner 263 of the rotation drive unit 232b calculates target values for the rotation amount and rotation speed of the roll axis actuator 123b based on an instruction value for the driving force in the rotation direction of the tail section 24 and an instruction value for the position (or speed) in the rotation direction of the tail section 24.
  • the instruction value combiner 263 of the rotation drive unit 232b calculates a current value for achieving the target values for the rotation amount and rotation speed of the roll axis actuator 123b.
  • the instruction value combiner 263 of the rotation drive unit 232b supplies data indicating a current instruction value indicating the calculated current value to the driver 264 and the force estimator 265.
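The combiner's role of merging the two commands and deriving a current value might be sketched as below. The blending rule (a plain sum) and the torque constant are illustrative assumptions; the patent does not state how the two commands are combined.

```python
# Hypothetical sketch of the instruction value combiner 263: merge the force
# and position commands into a target torque for the actuator 123, then
# convert that torque into a drive current. The simple sum and the torque
# constant value are assumptions.
def current_value(force_cmd: float, position_cmd: float,
                  torque_constant: float = 0.25) -> float:
    target_torque = force_cmd + position_cmd  # assumed blending rule
    return target_torque / torque_constant    # i = tau / Kt

print(current_value(0.25, 0.5))  # → 3.0
```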
  • the driver 264 applies a current based on the current command value from the command value combiner 263 to the actuator 123.
  • the driver 264 supplies data indicating the current response value of the actuator 123 to the force estimator 265.
  • the driver 264 of the bending and straightening drive unit 232a applies a current based on the current command value from the command value combiner 263 to the pitch axis actuator 123a.
  • the driver 264 of the bending and straightening drive unit 232a supplies data indicating the current response value of the pitch axis actuator 123a to the force estimator 265.
  • the driver 264 of the rotation drive unit 232b applies a current based on the current command value from the command value combiner 263 to the roll axis actuator 123b.
  • the driver 264 of the rotation drive unit 232b supplies data indicating the current response value of the roll axis actuator 123b to the force estimator 265.
  • the force estimator 265 acquires data indicating the detection result of the position (or speed) of the actuator 123 in the rotational direction from the position sensor 251.
  • the force estimator 265 estimates the external force on the tail section 24 based on the current response value of the driver 264 (current value to the actuator 123) and the detection value of the position (or speed) of the actuator 123 in the rotational direction.
  • the force estimator 265 estimates the magnitude and direction of the external force on the tail section 24 based on the difference between the position (or speed) of the actuator 123 in the rotational direction assumed from the current response value of the driver 264 and the actually detected position (or speed) of the actuator 123 in the rotational direction.
  • the force estimator 265 supplies external force estimation data indicating an estimate of the external force to the force controller 261 and the information processing section 203.
  • the force estimator 265 of the bending and straightening drive unit 232a estimates the external force in the bending and straightening direction on the tail section 24 based on the current response value of the driver 264 (current value to the pitch axis actuator 123a) and the detected value of the position (or speed) in the rotational direction of the pitch axis actuator 123a.
  • the force estimator 265 supplies external force estimation data indicating an estimated value of the external force in the bending and straightening direction to the force controller 261 and the information processing unit 203.
  • the force estimator 265 of the rotational drive unit 232b estimates the external force in the rotational direction on the tail section 24 based on the current response value of the driver 264 (current value to the roll axis actuator 123b) and the detected value of the position (or speed) in the rotational direction of the roll axis actuator 123b.
  • the force estimator 265 supplies external force estimation data indicating an estimated value of the external force in the rotational direction to the force controller 261 and the information processing unit 203.
  • the force estimator 265 can also estimate the external force on the tail 24 using the current command value to the driver 264 instead of the current response value of the driver 264.
  • the position sensor 251, for example, constitutes part of the input unit 201 in FIG. 12, and is provided for each of the pitch axis actuator 123a and the roll axis actuator 123b.
  • the position sensor 251 detects the rotational position (or speed) of the pitch axis actuator 123a or the roll axis actuator 123b by detecting the physical amount of rotation of the pitch axis actuator 123a or the roll axis actuator 123b.
  • the position sensor 251 supplies data indicating the detected value of the rotational position (or speed) of the pitch axis actuator 123a or the roll axis actuator 123b to the force estimator 265 and the information processing unit 203.
  • the current consumption of the pitch axis actuator 123a increases or decreases when an external force is applied to the tail section 24.
  • for example, when an external force is applied in the direction opposing the driving force of the pitch axis actuator 123a, the current consumption of the pitch axis actuator 123a required to hold the position of the tail section 24 increases.
  • conversely, when an external force is applied in the direction assisting the driving force, the current consumption of the pitch axis actuator 123a required to hold the position of the tail section 24 decreases.
  • the force estimator 265 of the bending and straightening drive unit 232a estimates the magnitude and direction of the external force in the bending and straightening direction on the tail section 24 based on the difference between the rotational position (or speed) of the pitch axis actuator 123a estimated from the current response value of the driver 264 and the actually detected rotational position (or speed) of the pitch axis actuator 123a.
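The estimation from the position discrepancy can be sketched as follows. This is a minimal illustration under an assumed linear model: the force is taken as proportional to the gap between the rotational position expected from the drive current and the position the position sensor 251 actually detected; the stiffness value is an assumption.

```python
# Hypothetical sketch of the external-force estimation in the force
# estimator 265: infer the force from the gap between the rotational
# position expected from the current response value and the detected
# position. The linear model and stiffness value are assumptions.
def estimate_external_force(expected_pos: float, detected_pos: float,
                            stiffness: float = 10.0) -> float:
    # Positive result: the external force opposes the drive direction
    # (the actuator fell short of where the current should have taken it).
    return stiffness * (expected_pos - detected_pos)

# The pitch axis actuator reached 0.28 rad instead of the expected 0.30 rad,
# so an opposing external force on the tail is estimated.
print(estimate_external_force(0.30, 0.28))
```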
  • the recognition unit 221 can also recognize obstacles behind the autonomous mobile body 11 based on the estimated magnitude and direction of the external force on the tail 24.
  • the force estimator 265 of the bending and straightening drive unit 232a or the rotation drive unit 232b estimates the magnitude and direction of the external force on the tail 24 by the wall 301 using the method described above.
  • the recognition unit 221 recognizes that there is an obstacle behind the autonomous moving body 11 based on the estimation result of the magnitude and direction of the external force on the tail 24.
  • the recognition unit 221 can search for obstacles behind the autonomous mobile body 11 based on the estimated magnitude and direction of the external force acting on the tail 24.
  • the autonomous mobile body 11 rotates the tail section 24 alternately in both directions around the roll axis with the tail section 24 bent slightly upward, thereby backing up while swinging the tail section 24 from side to side.
  • an external force is applied to the tail section 24 by the obstacle.
  • the force estimator 265 of the bending and straightening drive unit 232a or the rotation drive unit 232b estimates the magnitude and direction of the external force on the tail section 24 caused by the obstacle using the method described above.
  • the recognition unit 221 recognizes that some kind of obstacle is present behind the autonomous moving body 11 based on the estimation result of the magnitude and direction of the external force on the tail section 24.
  • the recognition unit 221 can determine whether or not the tail portion 24 has been able to grasp an object based on the estimated magnitude and direction of the external force acting on the tail portion 24.
  • the force estimator 265 of the bending and straightening drive unit 232a estimates the magnitude and direction of the external force applied to the tail section 24 by the toy 311 using the method described above. Furthermore, the recognition unit 221 determines whether or not the autonomous mobile body 11 has grasped the toy 311 with the tail section 24 based on the estimation result of the magnitude and direction of the external force applied to the tail section 24.
  • the behavior planning unit 223 sets the operation mode of the autonomous mobile body 11 to the friendly mode.
  • the friendly mode is, for example, a mode that corresponds to a case where the autonomous mobile body 11 wants attention from a user, etc.
  • in step S1, the autonomous mobile body 11 executes a play bow.
  • a play bow is, for example, a posture in which the autonomous mobile body 11 lowers its head 21 and raises its buttocks high, as shown in A of FIG. 18.
  • a play bow is performed when the autonomous mobile body 11 wants to play with the user.
  • the operation control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform a play bow.
  • in step S2, the recognition unit 221 determines whether or not the user has touched the tail 24. Specifically, the recognition unit 221 determines whether or not the user has touched the tail 24 by the method described above, based on the external force estimation data from the force estimator 265 of the tail drive unit 232. If it is determined that the user has not touched the tail 24, the process proceeds to step S3.
  • in step S3, the recognition unit 221 determines whether or not the user has come into contact with the body based on the sensor data and input data supplied from the input unit 201.
  • the body refers to any part of the autonomous mobile body 11 other than the tail 24. If it is determined that the user has not come into contact with the body, the process proceeds to step S4.
  • in step S4, the action planning unit 223 determines whether or not to continue the friendly mode based on the situation recognized or estimated by the recognition unit 221. If it is determined that the friendly mode should be continued, the process returns to step S1.
  • the processes of steps S1 to S4 are repeatedly executed until it is determined in step S2 that the user has come into contact with the tail 24, in step S3 that the user has come into contact with the body, or in step S4 that the friendly mode is to be ended.
  • during this time, the autonomous mobile body 11 continues the play bow.
  • if it is determined in step S2 that the user has touched the tail 24, the process proceeds to step S5.
  • in step S5, the autonomous mobile body 11 performs a playful action.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform a play action.
  • the autonomous mobile body 11 expresses a playful motion by wrapping the tail 24 around the user's hand or the like with a slow, constant force.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail 24, thereby wrapping the tail 24 around the user's hand or the like with a slow, constant force.
  • in step S6, the recognition unit 221 determines whether or not the user is touching the tail 24 by the same process as in step S2. If it is determined that the user is touching the tail 24, the process returns to step S5.
  • the processes of steps S5 and S6 are repeatedly executed until it is determined in step S6 that the user's contact with the tail 24 has ended. This allows the autonomous moving body 11 to continue its playful action.
  • if it is determined in step S6 that the user's contact with the tail 24 has ended, the process proceeds to step S7.
  • also, if it is determined in step S3 that the user has come into contact with the body, the process proceeds to step S7.
  • in step S7, the autonomous mobile body 11 expresses joy.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform an action that expresses joy.
  • the autonomous mobile body 11 expresses joy by raising the tail 24 high and vigorously swinging it from side to side, as shown diagrammatically in FIG. 18B.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to lift the tail 24 by bending it significantly upward.
  • the rotation drive unit 232b drives the roll axis actuator 123b to rotate the tail 24 alternately in both directions around the roll axis, causing the tail 24 to swing vigorously from side to side in a lifted state.
  • in step S4, if the behavior planning unit 223 recognizes or estimates, for example, a situation that causes a transition to a mode other than the friendly mode, it determines to end the friendly mode, and the friendly-mode interaction operation ends.
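The branching of steps S1 to S7 can be sketched as a simple loop. This is an illustrative reduction only: the zero-argument callables below are hypothetical stand-ins for the recognition unit 221 and action planning unit 223, and only the branching structure follows the text.

```python
# Illustrative sketch of the friendly-mode interaction (steps S1 to S7).
# Each argument is a zero-argument callable returning bool (hypothetical
# stand-ins for the recognition/planning results).
def friendly_mode(tail_touched, body_touched, keep_mode):
    actions = []
    while True:
        actions.append("play bow")                    # S1
        if tail_touched():                            # S2
            while tail_touched():                     # S5/S6: playful action
                actions.append("wrap tail around hand")
            actions.append("express joy")             # S7
            return actions
        if body_touched():                            # S3
            actions.append("express joy")             # S7
            return actions
        if not keep_mode():                           # S4: end friendly mode
            return actions

# Simulated run: no touch on the first pass, then a tail touch lasting one
# cycle of the playful action.
touch_seq = iter([False, True, True, False])
result = friendly_mode(lambda: next(touch_seq), lambda: False, lambda: True)
print(result)  # → ['play bow', 'play bow', 'wrap tail around hand', 'express joy']
```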
  • the behavior planning unit 223 sets the operation mode of the autonomous mobile body 11 to hostile mode.
  • the hostile mode is, for example, a mode that corresponds to a case where the autonomous mobile body 11 does not want the user or the like to pay attention to it.
  • in step S51, the autonomous mobile body 11 takes an alert posture.
  • the operation control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to take an alert posture.
  • the autonomous mobile body 11 lowers the tail 24 and swings it slowly from side to side.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to extend and lower the tail 24.
  • the rotation drive unit 232b drives the roll axis actuator 123b to slowly rotate the tail 24 alternately in both directions around the roll axis, causing the tail 24 to swing slowly from side to side with the tail 24 lowered.
  • in step S52, similar to the process in step S2 of FIG. 17, it is determined whether or not the user has touched the tail 24. If it is determined that the user has not touched the tail 24, the process proceeds to step S53.
  • in step S53, similar to the process in step S3 of FIG. 17, it is determined whether or not the user has touched the body. If it is determined that the user has not touched the body, the process proceeds to step S54.
  • in step S54, the action planning unit 223 determines whether or not to continue the hostile mode based on the situation recognized or estimated by the recognition unit 221. If it is determined that the hostile mode should be continued, the process returns to step S51.
  • the processes of steps S51 to S54 are repeatedly executed until it is determined in step S52 that the user has come into contact with the tail 24, in step S53 that the user has come into contact with the body, or in step S54 that the hostile mode is to be ended.
  • the autonomous mobile body 11 continues its alert posture.
  • if it is determined in step S52 that the user has touched the tail 24, the process proceeds to step S55.
  • in step S55, the autonomous mobile body 11 hides the tail 24.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to execute the action of hiding the tail 24, as shown diagrammatically in B of FIG. 20.
  • the rotation drive unit 232b drives the roll axis actuator 123b to rotate the tail section 24 180 degrees around the roll axis.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24 and hide it by inserting the tip of the tail section 24 between the rear legs (legs 23HL and 23HR).
  • on the other hand, if it is determined in step S53 that the user has come into contact with the body, the process proceeds to step S56.
  • in step S56, the autonomous mobile body 11 runs away from the user and expresses anger with the tail 24.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action planning unit 223, thereby causing the autonomous mobile body 11 to execute the action of running away from the user and expressing anger with the tail 24.
  • the autonomous mobile body 11 raises the tail 24 high and assumes a forward-leaning posture.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24, thereby lifting the tail section 24 high.
  • in step S54, if the behavior planning unit 223 recognizes or estimates a situation in which the mode will transition to a mode other than the hostile mode, it determines to end the hostile mode, and the hostile interaction operation ends.
  • in step S101, the autonomous mobile body 11 moves backward while swinging the tail 24 horizontally.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform the action of moving backward while swinging the tail 24 horizontally.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24, thereby lifting the tail section 24 in a substantially horizontal direction.
  • the rotation drive unit 232b drives the roll axis actuator 123b to rotate the tail section 24 alternately in both directions around the roll axis, thereby swinging the tail section 24 from side to side while lifting it in a substantially horizontal direction.
  • in step S102, a determination is made as to whether or not the tail 24 has come into contact with anything, using a process similar to that of step S2 in FIG. 17. If it is determined that the tail 24 has not come into contact with anything, the process proceeds to step S103.
  • in step S103, the recognition unit 221 determines whether or not the destination has been reached based on the sensor data and input data supplied from the input unit 201. If it is determined that the destination has not been reached, the process returns to step S101.
  • the processes of steps S101 to S103 are repeatedly executed until it is determined in step S102 that the tail section 24 has come into contact with something, or until it is determined in step S103 that the destination has been reached.
  • the autonomous mobile body 11 continues to move backward while swinging the tail section 24 horizontally.
  • if it is determined in step S102 that the tail 24 has come into contact with something, the process proceeds to step S104.
  • in step S104, the autonomous mobile body 11 swings its tail 24 to search for the location of an obstacle.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform an action of stopping and swinging its tail 24 to search for the location of an obstacle.
  • the rotation drive unit 232b drives the roll axis actuator 123b to slowly rotate the tail section 24 alternately in both directions around the roll axis, so that the tail section 24 is lifted in a substantially horizontal direction and slowly swings from side to side as if searching for the location of an obstacle.
  • the recognition unit 221 detects the position of an obstacle based on the magnitude and direction of the external force on the tail 24, based on the external force estimation data from the tail drive unit 232.
  • the recognition unit 221 supplies data indicating the detection result of the obstacle position to the action planning unit 223.
  • in step S105, the action planning unit 223 determines whether the obstacle can be avoided based on the detection result of the obstacle's position. If it is determined that the obstacle can be avoided, the process proceeds to step S106.
  • in step S106, the autonomous mobile body 11 changes its traveling direction to avoid the obstacle.
  • the operation control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action planning unit 223, thereby changing the traveling direction of the autonomous mobile body 11 to a direction in which the autonomous mobile body 11 can avoid the obstacle when the autonomous mobile body 11 retreats.
  • then, the process returns to step S101, and the processes from step S101 onward are executed.
  • on the other hand, if it is determined in step S105 that the obstacle cannot be avoided, the process proceeds to step S107.
  • in step S107, the autonomous mobile body 11 stops retreating and transitions to a mode for viewing the obstacle.
  • the operation control unit 224 stops retreating by controlling the drive unit 204 and the output unit 205 based on the action plan data supplied from the action planning unit 223.
  • the action planning unit 223 changes the mode of the autonomous mobile body 11 to a mode for viewing obstacles.
  • also, if it is determined in step S103 that the destination has been reached, the reverse operation ends.
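The backward-movement flow of steps S101 to S107 can be sketched as a loop. This is an illustrative reduction: the contact set and avoidability flag below are hypothetical stand-ins for the tail-contact detection and the obstacle-position search described above.

```python
# Illustrative sketch of the backward-movement flow (steps S101 to S107).
# contact_steps and obstacle_avoidable are hypothetical stand-ins for the
# recognition results based on the tail's external-force estimate.
def move_backward(contact_steps, obstacle_avoidable, steps_to_goal):
    step = 0
    while step < steps_to_goal:            # S103: destination reached?
        step += 1                          # S101: back up, tail swinging
        if step in contact_steps:          # S102: tail touched something?
            # S104: stop and swing the tail to locate the obstacle
            if obstacle_avoidable:         # S105
                continue                   # S106: change direction, resume
            return "check obstacle"        # S107: stop, switch modes
    return "arrived"

print(move_backward({3}, True, 5))   # → arrived
print(move_backward({2}, False, 5))  # → check obstacle
```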
  • in step S151, the autonomous mobile body 11 faces its rear end toward the object 351.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, and changes the orientation of the autonomous mobile body 11 so that its rear end faces the object 351.
  • in step S152, the autonomous mobile body 11 moves backward with the tail 24 held straight down.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, and moves the autonomous mobile body 11 backward with the tail 24 held straight down, as shown diagrammatically in FIG. 23A.
  • in step S153, the recognition unit 221 determines whether or not an object 351 has come into contact with the back of the tail 24, based on the magnitude and direction of the external force on the tail 24 indicated by the external force estimation data supplied from the tail drive unit 232. As shown diagrammatically in FIG. 23B, if it is determined that the object 351 has come into contact with the back of the tail 24, the process proceeds to step S154.
  • in step S154, the autonomous mobile body 11 bends the tail 24 in a direction to scoop up the object 351.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, and bends the tail 24 in a direction to scoop up the object 351, as shown diagrammatically in C of FIG. 23.
  • the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24 upward, thereby bending the tail section 24 in a direction to scoop up the object 351.
  • in step S155, the recognition unit 221 determines whether or not the weight of the object 351 is acting on the tail 24, based on the magnitude and direction of the external force on the tail 24 indicated by the external force estimation data supplied from the tail drive unit 232. If it is determined that the weight of the object 351 is not acting on the tail 24, the process proceeds to step S156.
  • also, if it is determined in step S153 that the object 351 is not in contact with the back of the tail 24, steps S154 and S155 are skipped and the process proceeds to step S156.
  • in step S156, the recognition unit 221 determines whether or not the number of trials has been completed. If the number of trials of the action of scooping up an object with the tail 24 has not reached a predetermined number, the recognition unit 221 determines that the number of trials has not been completed, and the process proceeds to step S157.
  • in step S157, the autonomous mobile body 11 moves away from the object 351 and redirects its buttocks.
  • the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223 to move the autonomous mobile body 11 away from the object 351 and redirect its buttocks toward the object 351 again.
  • then, the process returns to step S152, and steps S152 to S157 are repeatedly executed until it is determined in step S155 that the weight of the object 351 is acting on the tail 24, or it is determined in step S156 that the number of trials has been completed.
  • if it is determined in step S155 that the weight of the object 351 is acting on the tail 24, the process proceeds to step S158.
  • in step S158, the recognition unit 221 determines that the object 351 has been successfully grasped.
  • the object grasping operation then ends.
  • on the other hand, if the number of attempts to scoop up an object with the tail 24 reaches the predetermined number in step S156, the recognition unit 221 determines that the number of attempts has been completed, and the process proceeds to step S159.
  • in step S159, the recognition unit 221 determines that grasping of the object 351 has failed.
  • the object grasping operation then ends.
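The grasping flow of steps S151 to S159 can be sketched as a bounded retry loop. This is an illustrative reduction: the per-trial boolean results are hypothetical sensor outcomes standing in for the external-force-based decisions in steps S153 and S155.

```python
# Illustrative sketch of the object-grasping flow (steps S151 to S159):
# back up to the object, scoop with the tail, and retry up to a fixed
# number of trials. contact_results/weight_results are hypothetical
# per-trial sensing outcomes.
def grasp_object(contact_results, weight_results, max_trials=3):
    for trial in range(max_trials):        # S156: trial count check
        # S152: move backward with the tail held straight down
        if contact_results[trial]:         # S153: object on the tail's back?
            # S154: bend the tail to scoop the object up
            if weight_results[trial]:      # S155: object's weight felt?
                return "grasp succeeded"   # S158
        # S157: move away and redirect the rear end toward the object
    return "grasp failed"                  # S159

print(grasp_object([True, True], [False, True], max_trials=2))  # → grasp succeeded
print(grasp_object([False], [False], max_trials=1))             # → grasp failed
```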
  • the expressiveness of the autonomous moving body 11 can be improved.
  • the shape and movement of the tail 24 become more natural and dynamic, improving the expressiveness of the autonomous moving body 11 by the tail 24.
  • the stability of the tail section 24 is improved. This increases the degree of freedom of movement of the tail section 24, enabling smooth, dynamic movement.
  • the tail 24 is made by simply inserting a single wire 121 into the elastic body 111, so the soft feel of the elastic body 111 is maintained. In other words, a flexible, firm and supple tail 24 is realized.
  • the base of the tail 24 can also be made thin. This allows the tail 24 to resemble a real dog's tail. As a result, for example, the user is given the urge to touch the tail 24, and a pleasant feel is provided when the user touches it.
  • if the wire were instead always kept under strong tension, the tail would always be moved with strong tension, which would be expected to accelerate wear on the tail.
  • the tension on the wire 121 can be loosened as needed, improving the durability of the tail 24.
  • Fig. 24 and Fig. 25 show schematic diagrams of a modified flexible active mechanism.
  • Fig. 24 shows a schematic cross-sectional view of the flexible active mechanism.
  • Fig. 25A shows a schematic perspective view of the flexible active mechanism.
  • Fig. 25B shows a schematic top view of a winding mechanism for a wire 121 of the flexible active mechanism. Note that in Fig. 24, the diagonal line pattern is omitted in part of the cross section to make the diagram easier to understand. Also, in Fig. 25A, the bearing 122 is omitted.
  • the flexible active mechanism of Figs. 24 and 25 is the same as the flexible active mechanism of Figs. 6 and 7 in that it includes a tail section 24, and differs in that it includes a drive mechanism 401 instead of the drive mechanism 101.
  • the drive mechanism 401 is the same as the drive mechanism 101 in that it includes a wire 121, a bearing 122, a pitch axis actuator 123a, a roll axis actuator 123b, and a gear 125, but differs in that it includes a pinion 411 instead of the winding section 124, and a winding section 412 has been added.
  • the drive mechanism 401 is different from the drive mechanism 101 in the configuration of the winding mechanism for the wire 121.
  • a pinion 411 is connected to the tip of the pitch axis actuator 123a.
  • the pinion 411 is engaged with a winding section 412 consisting of a curved rack.
  • One end of the wire 121 is connected to the end of the winding section 412 that is farther from the tail section 24.
  • the pitch axis actuator 123a can slide the winding section 412 in the direction indicated by the arrows in Figures 24 and 25 and in the opposite direction by rotating the pinion 411. As the winding section 412 slides, the wire 121 connected to the winding section 412 is wound or unwound. As a result, the wire 121 is pulled or pushed out, and the elastic body 111 (tail section 24) is deformed in the pitch axis direction, similar to the flexible active mechanism in Figures 6 and 7.
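The relation between pinion rotation and wire payout in this winding mechanism follows basic rack-and-pinion kinematics: the rack travel (and hence the wire payout) is the pinion's pitch radius times its rotation angle, s = r·θ. A small sketch, where the 5 mm radius is an assumed example value not given in the text:

```python
# Rack-and-pinion kinematics for the winding of Figs. 24 and 25: the wire
# payout equals the rack travel, s = r * theta. The 5 mm pitch radius is an
# assumed example value.
import math

def wire_travel_mm(pinion_radius_mm: float, rotation_deg: float) -> float:
    return pinion_radius_mm * math.radians(rotation_deg)

# A 90-degree turn of a 5 mm pinion pulls (or pays out) about 7.85 mm of wire.
print(round(wire_travel_mm(5.0, 90.0), 2))  # → 7.85
```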
  • in the examples described above, the flexible active mechanism has only one wire, but it may have two or more wires.
  • a set of wire and wire winding mechanism may be added so that the tail 24 can be bent and extended in a direction perpendicular to the arrow A1 (yaw direction).
  • a sensor for detecting force may be provided in the tail portion 24, and the recognition unit 221 may detect an external force acting on the tail portion 24 based on sensor data from the sensor.
  • a sensor that detects the shape of the tail 24 may be provided, and the recognition unit 221 may detect the shape and deformation of the tail 24 based on the sensor data of the sensor.
  • a touch sensor may be provided on the tail 24, and the recognition unit 221 may detect contact with the tail 24 based on sensor data from the touch sensor. This improves the detection accuracy of contact with the tail 24 that applies almost no external force, such as a gentle stroke, to the tail 24.
  • the present technology can be applied to moving bodies such as pet-type robots to which a flexible active mechanism can be applied, in addition to the dog-type quadruped robot described above.
  • the present technology can be applied to animal-type robots other than dogs that have tails.
  • the present technology can be applied to animal-type robots that have parts other than tails that can be flexibly bent and stretched. Examples of such parts include ears and jaws.
  • the moving body to which this technology can be applied may be, for example, a moving body in which only a part of the moving body moves, and not the entire moving body.
  • this technology can be applied to a moving body in which only the tail moves, with a flexible active mechanism provided as the tail on a cushion or the like.
  • FIG. 26 is a block diagram showing an example of the hardware configuration of a computer that executes the above-mentioned series of processes using a program.
  • in the computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.
  • an input/output interface 1005 is also connected to the bus 1004. Connected to the input/output interface 1005 are an input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010.
  • the input unit 1006 includes an input switch, a button, a microphone, an image sensor, etc.
  • the output unit 1007 includes a display, a speaker, etc.
  • the storage unit 1008 includes a hard disk, a non-volatile memory, etc.
  • the communication unit 1009 includes a network interface, etc.
  • the drive 1010 drives removable media 1011 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • the CPU 1001 loads a program recorded in the storage unit 1008, for example, into the RAM 1003 via the input/output interface 1005 and the bus 1004, and executes the program, thereby performing the above-mentioned series of processes.
  • the program executed by the computer 1000 can be provided by being recorded on a removable medium 1011 such as a package medium, for example.
  • the program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 1008 via the input/output interface 1005 by inserting the removable medium 1011 into the drive 1010.
  • the program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008.
  • the program can be pre-installed in the ROM 1002 or storage unit 1008.
  • the program executed by the computer may be a program in which processing is performed chronologically in the order described in this specification, or a program in which processing is performed in parallel or at the required timing, such as when called.
  • a system refers to a collection of multiple components (devices, modules (parts), etc.), regardless of whether all the components are in the same housing. Therefore, multiple devices housed in separate housings and connected via a network, and a single device in which multiple modules are housed in a single housing, are both systems.
  • this technology can be configured as cloud computing, in which a single function is shared and processed collaboratively by multiple devices over a network.
  • each step described in the above flowchart can be executed by a single device, or can be shared and executed by multiple devices.
  • when one step includes multiple processes, the multiple processes included in that one step can be executed by a single device, or can be shared and executed by multiple devices.
  • a bending and straightening drive unit that drives the bending and straightening mechanism;
  • the moving body described in (4) further comprises a recognition unit that recognizes contact of an object with the second portion based on at least one of the estimation result of the external force in the bending and straightening direction and the estimation result of the external force in the rotational direction.
  • the moving body according to (5) above, wherein the moving body is an animal-type robot, the first portion is a body portion, the second portion is a tail, and the moving body swings the tail and retreats while searching for objects behind it.
  • the second portion is capable of bending to grasp an object;
  • the rotation drive unit estimates an external force in a rotational direction applied to the second part based on a current value to an actuator provided in the rotation mechanism and a position of the actuator in the rotational direction.
  • the first portion is a body portion
  • the second portion is a tail.
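
As a concrete illustration of the current-and-position-based estimation mentioned in the items above, the external torque can be obtained as the difference between the torque inferred from the actuator current and the torque expected in the absence of an external force. The sketch below is a simplified example under stated assumptions, not the implementation of this disclosure; the torque constant, gear ratio, and friction value are made-up values.

```python
# Sketch of the external torque estimation described above: compare the
# torque inferred from the actuator current with the torque expected when
# no external force acts. All constants are assumed example values.

KT = 0.05        # assumed motor torque constant [N*m/A]
GEAR = 100.0     # assumed gear reduction ratio
FRICTION = 0.02  # assumed constant friction torque at the output [N*m]

def estimate_external_torque(current_a, velocity_rad_s):
    """Estimate the external torque acting on the tail around the roll axis."""
    motor_torque = KT * current_a * GEAR  # torque delivered at the output shaft
    # Internal losses oppose the motion; at rest they are taken as zero.
    if velocity_rad_s > 0:
        expected = FRICTION
    elif velocity_rad_s < 0:
        expected = -FRICTION
    else:
        expected = 0.0
    # Whatever torque is not explained by internal losses is attributed
    # to an external force applied to the tail.
    return motor_torque - expected

# With the tail at rest, all measured motor torque is read as external:
print(estimate_external_torque(0.4, 0.0))  # prints 2.0
```

In practice the expected-torque model would also include the commanded motion dynamics, but the subtraction structure is the same.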

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Toys (AREA)

Abstract

The present technology pertains to a moving body capable of improving the expressive power of a moving body equipped with a part capable of bending and stretching. This autonomous moving body is equipped with: a first part; a second part which is connected to the first part and provided with an elastic body capable of bending and stretching; a bending/stretching mechanism which bends and stretches the second part by bending and stretching the elastic body by using a wire inserted into the second part in the direction in which the second part extends from the first part; and a rotary mechanism for rotating the second part around a rotational axis which is parallel to the direction in which the second part is connected to the first part. The present technology can be applied to a robotic pet, for example.

Description

Moving body

This technology relates to moving bodies, and in particular to moving bodies with parts such as tails that can bend and stretch.

Conventionally, bending mechanisms suitable for the tails of pet-type robots have been proposed (see, for example, Patent Document 1).

JP 2003-117859 A

However, in the bending mechanism described in Patent Document 1, the core member and the multiple support members may give the user a rough impression that differs from the tail of a real pet. As a result, the expressiveness of the pet-type robot may be reduced.

This technology was developed in light of these circumstances, and aims to improve the expressiveness of a moving body that has a part that can be bent and straightened, such as the tail of a pet-type robot.

A moving body according to one aspect of the present technology includes: a first portion; a second portion that is connected to the first portion and includes an elastic body that can be bent and straightened; a bending and straightening mechanism that bends and straightens the second portion by bending and straightening the elastic body using a wire inserted into the second portion in the direction in which the second portion extends from the first portion; and a rotation mechanism that rotates the second portion around a rotation axis parallel to the direction in which the second portion is connected to the first portion.

In one aspect of the present technology, the elastic body is bent and straightened using a wire inserted into the second portion in the direction in which the second portion extends from the first portion, whereby the second portion is bent and straightened, and the second portion is rotated around a rotation axis parallel to the direction in which the second portion is connected to the first portion.

FIG. 1 is a left side view showing an example of the external configuration of an autonomous moving body to which the present technology is applied.
FIG. 2 is a top view showing an example of the external configuration of an autonomous moving body to which the present technology is applied.
FIG. 3 is a perspective view showing an example of the external configuration of an autonomous moving body to which the present technology is applied.
FIG. 4 is a diagram showing an example of the configuration of displays and sensors provided in an autonomous moving body to which the present technology is applied.
FIG. 5 is a diagram showing an example of the configuration of actuators provided in an autonomous moving body to which the present technology is applied.
FIG. 6 is a cross-sectional view schematically showing a configuration example of the flexible active mechanism of the autonomous moving body.
FIG. 7 is a perspective view schematically showing a configuration example of the flexible active mechanism of the autonomous moving body.
FIG. 8 is a diagram showing an example of a method for expressing the emotions of the autonomous moving body with the tail.
FIG. 9 is a diagram showing an example of a method for expressing the emotions of the autonomous moving body with the tail.
FIG. 10 is a diagram showing an example of a method for expressing the emotions of the autonomous moving body with the tail.
FIG. 11 is a diagram showing an example of a method for expressing the emotions of the autonomous moving body with the tail.
FIG. 12 is a block diagram showing an example of the functional configuration of the autonomous moving body.
FIG. 13 is a block diagram showing a configuration example of a tail drive unit.
FIG. 14 is a diagram for explaining an example of a method for estimating an external force.
FIG. 15 is a diagram for explaining an example of a method for detecting an obstacle.
FIG. 16 is a diagram for explaining an example of a method for grasping an object with the tail.
FIG. 17 is a flowchart for explaining a favorable interaction operation of the autonomous moving body.
FIG. 18 is a diagram for explaining a favorable interaction operation of the autonomous moving body.
FIG. 19 is a flowchart for explaining an adversarial interaction operation of the autonomous moving body.
FIG. 20 is a diagram for explaining an adversarial interaction operation of the autonomous moving body.
FIG. 21 is a flowchart for explaining a backward movement operation of the autonomous moving body.
FIG. 22 is a flowchart for explaining an object grasping operation of the autonomous moving body.
FIG. 23 is a diagram for explaining an object grasping operation of the autonomous moving body.
FIG. 24 is a cross-sectional view schematically showing a modified example of the flexible active mechanism of the autonomous moving body.
FIG. 25 is a perspective view schematically showing a modified example of the flexible active mechanism of the autonomous moving body.
FIG. 26 is a block diagram showing an example of the configuration of a computer.

Hereinafter, embodiments for implementing the present technology will be described in the following order.
1. Embodiment
2. Modification
3. Others

<<1. Embodiment>>
An embodiment of the present technology will be described with reference to FIGS. 1 to 23.

<Hardware Configuration Example of the Autonomous Moving Body 11>
First, an example of the external configuration of the autonomous moving body 11 will be described with reference to FIGS. 1 to 3.

FIG. 1 is a left side view of the autonomous moving body 11. FIG. 2 is a top view of the autonomous moving body 11. FIG. 3 is a perspective view of the autonomous moving body 11.

The autonomous moving body 11 is a dog-type quadruped robot that has a head 21, a body 22, four legs 23FL to 23HR, and a tail 24.

Hereinafter, when there is no need to distinguish the legs 23FL to 23HR from one another, they are simply referred to as the legs 23.

FIG. 4 shows an example of the configuration of the displays and sensors provided in the autonomous moving body 11.

The autonomous moving body 11 includes two displays 51L and 51R on the head 21. Hereinafter, when there is no need to distinguish between the display 51L and the display 51R, they are simply referred to as the displays 51.

Each display 51 has a function of visually expressing the eye movements and emotions of the autonomous moving body 11. For example, each display 51 expresses the movements of the eyeballs, pupils, and eyelids according to emotions and actions, thereby producing natural movements close to those of a real animal such as a dog, and can express the line of sight and emotions of the autonomous moving body 11 with high accuracy and flexibility. In addition, the user can intuitively grasp the state of the autonomous moving body 11 from the eyeball movements displayed on the displays 51.

The autonomous moving body 11 also includes various sensors. For example, the autonomous moving body 11 includes microphones 52, cameras 53, a ToF (Time of Flight) sensor 54, a human presence sensor 55, a distance measurement sensor 56, touch sensors 57, an illuminance sensor 58, sole buttons 59, and inertial sensors 60.

The autonomous moving body 11 includes, for example, four microphones 52 on the head 21. Each microphone 52 collects surrounding sounds including, for example, the user's speech and ambient environmental sounds. Providing multiple microphones 52 makes it possible to collect sounds generated in the surroundings with high sensitivity and to localize the sound source.
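
One common way a multi-microphone arrangement enables sound source localization is to estimate the direction of arrival from the difference in arrival time between microphones. The sketch below illustrates this far-field time-difference-of-arrival idea under assumed values; it is not necessarily the method used by the autonomous moving body 11.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature

def direction_from_delay(delay_s, mic_spacing_m):
    """Estimate the direction of a far-field sound source from the
    arrival-time difference between two microphones.

    Returns the angle in radians from the broadside direction, using
    sin(theta) = c * delay / d, clamped to the valid range of asin.
    """
    s = SPEED_OF_SOUND * delay_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))
    return math.asin(s)

# A 0.1 ms arrival-time difference across microphones 5 cm apart
# corresponds to a source roughly 43 degrees off broadside:
angle_deg = math.degrees(direction_from_delay(1e-4, 0.05))
```

With four microphones, pairwise estimates of this kind can be combined to resolve the direction in two dimensions.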

The autonomous moving body 11 includes, for example, two wide-angle cameras 53, at the tip of the nose and at the waist, which capture images of the surroundings of the autonomous moving body 11. For example, the camera 53 placed at the tip of the nose captures images within the forward field of view of the autonomous moving body 11 (that is, the dog's field of view). The camera 53 placed at the waist captures images of the surroundings centered above the autonomous moving body 11. The autonomous moving body 11 can, for example, extract feature points of the ceiling based on images captured by the camera 53 placed at the waist, and realize SLAM (Simultaneous Localization and Mapping).

The ToF sensor 54 is provided, for example, at the tip of the nose, and detects the distance to an object present in front of the head 21. With the ToF sensor 54, the autonomous moving body 11 can detect the distances to various objects with high accuracy, and can realize operations according to its position relative to target objects, including the user, and obstacles.

The human presence sensor 55 is placed, for example, on the chest, and detects the location of the user, a pet kept by the user, or the like. By detecting a moving object in front of it with the human presence sensor 55, the autonomous moving body 11 can perform various actions toward that object, for example, actions corresponding to emotions such as interest, fear, and surprise.

The distance measurement sensor 56 is placed, for example, on the chest, and detects the condition of the floor surface in front of the autonomous moving body 11. With the distance measurement sensor 56, the autonomous moving body 11 can detect the distance to an object present on the floor surface in front of it with high accuracy, and can realize operations according to its position relative to that object.

The touch sensors 57 are placed in areas where the user is likely to touch the autonomous moving body 11, such as the top of the head, under the chin, and on the back, and detect contact (touch) by the user. Each touch sensor 57 is, for example, a capacitive or pressure-sensitive touch sensor. With the touch sensors 57, the autonomous moving body 11 can detect contact actions by the user, such as touching, stroking, tapping, and pressing, and can perform an action according to the contact action. In addition, for example, by arranging the touch sensors 57 in a line or a plane on each part, it becomes possible to detect the touched position within each part.

The illuminance sensor 58 is placed, for example, at the base of the tail 24 on the rear of the head 21, and detects the illuminance of the space in which the autonomous moving body 11 is located. The autonomous moving body 11 can detect the surrounding brightness with the illuminance sensor 58 and perform an operation according to the brightness.

The sole buttons 59 are placed, for example, at the areas of the four legs 23 that correspond to the paw pads, and detect whether the bottom surfaces of the legs 23 of the autonomous moving body 11 are in contact with the floor. With the sole buttons 59, the autonomous moving body 11 can detect contact or non-contact with the floor surface and can, for example, recognize that it has been picked up by the user.

The inertial sensors 60 are placed, for example, in the head 21 and the body 22, and detect physical quantities such as the speed, acceleration, and rotation of the head 21 and the body 22. For example, each inertial sensor 60 is a six-axis sensor that detects acceleration and angular velocity along the X, Y, and Z axes. With the inertial sensors 60, the autonomous moving body 11 can detect the motion of the head 21 and the body 22 with high accuracy and realize operation control according to the situation.

The configuration of the sensors provided in the autonomous moving body 11 can be flexibly changed according to the specifications, operation, and so on. For example, in addition to the above configuration, the autonomous moving body 11 may further include a temperature sensor, a geomagnetic sensor, various communication devices including a GNSS (Global Navigation Satellite System) signal receiver, and the like.

Next, an example of the configuration of the joints provided in the autonomous moving body 11 will be described with reference to FIG. 5. FIG. 5 shows an example of the configuration of the actuators 71 provided in the autonomous moving body 11. In addition to the rotation points shown in FIG. 5, the autonomous moving body 11 has a total of 22 degrees of rotational freedom, including two each in the ears and the tail 24 and one in the mouth.

For example, having three degrees of freedom in the head 21 allows the autonomous moving body 11 to perform both nodding and head-tilting motions. In addition, by reproducing the swinging motion of the hips with the actuator 71 provided in the hips, the autonomous moving body 11 can realize natural and flexible motions closer to those of a real dog.

The autonomous moving body 11 may realize the above 22 degrees of rotational freedom by, for example, combining single-axis actuators and two-axis actuators. For example, single-axis actuators may be used at the elbows and knees of the legs 23, and two-axis actuators may be used at the shoulders and the bases of the thighs.

<Configuration Example of the Flexible Active Mechanism>
Next, a configuration example of the flexible active mechanism of the autonomous moving body 11, a mechanism that imitates a dog's tail and deforms flexibly and actively, will be described with reference to FIGS. 6 and 7. FIG. 6 is a cross-sectional view schematically showing the flexible active mechanism. FIG. 7 is a perspective view schematically showing the flexible active mechanism. In FIG. 6, the hatching pattern is omitted in part of the cross section to make the drawing easier to understand.

The flexible active mechanism of the autonomous moving body 11 includes the tail 24 and a drive mechanism 101. The tail 24 includes an elastic body 111, a cap 112, and a connection member 113. The drive mechanism 101 includes the connection member 113, a wire 121, a bearing 122, a pitch axis actuator 123a, a roll axis actuator 123b, a winding section 124, and a gear 125. The connection member 113 is a component of both the tail 24 and the drive mechanism 101. The rotating shaft 113B of the connection member 113, the bearing 122, the pitch axis actuator 123a, the roll axis actuator 123b, the winding section 124, and the gear 125 are disposed within the body 22.

Note that the bearing 122 is not shown in FIG. 7.

The elastic body 111 has an elongated shape that imitates a dog's tail. The elastic body 111 is made of a single elastic body that is close to the feel of a dog's tail, is flexible, and can be bent and straightened (curved), for example, a resin such as silicone or an elastomer. The tip of the elastic body 111 is covered with a spherical-crown-shaped cap 112. The base of the elastic body 111 is attached to a spherical-crown-shaped attachment portion 113A of the connection member 113.

A cylindrical rotating shaft 113B protrudes from the tip of the attachment portion 113A of the connection member 113. A gear is formed at the tip of the rotating shaft 113B. The rotating shaft 113B is inserted into the body 22 from the rear of the body 22, and is inserted into the bearing 122 inside the body 22. The bearing 122 supports the tail 24 so that the tail 24 can rotate around the rotating shaft 113B (hereinafter referred to as the roll axis). The roll axis is a rotation axis parallel to the direction in which the tail 24 is connected to the body 22 (the direction in which the connection member 113 is inserted into the body 22).

A single wire 121 is inserted into the tail 24 in the direction in which the tail 24 extends from the body 22. The wire 121 is made of, for example, a metal such as a shape-memory alloy, or a resin. The tip of the wire 121 is connected and fixed to the cap 112. The wire 121 extends in the longitudinal direction of the tail 24 (the elastic body 111), passes through the connection member 113, and is connected to the winding section 124 inside the body 22.

The winding section 124 is connected to the tip of the pitch axis actuator 123a. The pitch axis actuator 123a winds and unwinds the wire 121 by rotating the winding section 124. As a result, the wire 121 is pulled in or pushed out, and the elastic body 111 (the tail 24) deforms into a curve in the direction of the arrow A1 (the pitch axis direction). That is, when the wire 121 is wound and pulled, the elastic body 111 (the tail 24) curves upward, and when the wire 121 is unwound, the elastic body 111 (the tail 24) straightens. In other words, when the tail 24 is not rotated around the roll axis, the wire 121 can bend and straighten the tail 24 in the vertical direction.

In addition, for example, the vertical position of the tail 24 is controlled by controlling the winding amount of the wire 121. The tail 24 is wagged up and down by repeatedly winding and unwinding the wire 121. The speed at which the tail 24 is wagged up and down is controlled by controlling the speed at which the wire 121 is wound and unwound, and the width of the up-and-down wag is controlled by controlling the amount by which the wire 121 is wound and unwound.
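
The control relationships described above, with the winding amount setting the tail height and the winding speed and stroke setting the speed and width of the wag, can be sketched as a simple command generator. The function name and parameters below are assumptions introduced for illustration only, not part of this disclosure.

```python
import math

def wag_commands(amplitude, frequency_hz, center, duration_s, rate_hz=50):
    """Generate a series of wire-winding targets that wag the tail up and down.

    center       : assumed base winding amount, i.e. the resting tail height
    amplitude    : winding stroke, i.e. the width of the wag
    frequency_hz : wags per second, i.e. the speed of the wag
    """
    n = int(duration_s * rate_hz)
    # Each target oscillates around the base winding amount; feeding these
    # to the pitch axis actuator would wind and unwind the wire cyclically.
    return [center + amplitude * math.sin(2 * math.pi * frequency_hz * i / rate_hz)
            for i in range(n)]

# One second of a slow, small wag around a low resting position:
targets = wag_commands(amplitude=0.2, frequency_hz=1.0, center=0.1, duration_s=1.0)
```

Raising `center` lifts the tail, raising `frequency_hz` speeds up the wag, and raising `amplitude` widens it, mirroring the three controls described above.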

In this way, the wire 121, the pitch axis actuator 123a, and the winding section 124 constitute a bending and straightening mechanism that bends and straightens the tail 24. The bending and straightening mechanism also includes a winding mechanism that winds and unwinds the wire 121.

The gear 125 is connected to the tip of the roll axis actuator 123b. The gear 125 meshes with the gear at the tip of the rotating shaft 113B of the connection member 113. When the roll axis actuator 123b rotates the gear 125, the connection member 113 supported by the bearing 122 rotates around the roll axis. As a result, the tail 24 rotates around the roll axis as indicated by the arrow A2.

For example, the tail 24 can be wagged from side to side by rotating the tail 24 around the roll axis. The speed at which the tail 24 is wagged from side to side is controlled by controlling the rotation speed of the tail 24 around the roll axis, and the width of the side-to-side wag is controlled by controlling the amount of rotation of the tail 24 around the roll axis.

In addition, by controlling the rotational position of the tail 24, the direction in which the tail 24 is bent and straightened by the wire 121 changes.

In this way, the rotating shaft 113B, the bearing 122, the roll axis actuator 123b, and the gear 125 constitute a rotation mechanism that rotates the tail 24 around the roll axis.

Hereinafter, when there is no need to distinguish between the pitch axis actuator 123a and the roll axis actuator 123b, they are simply referred to as the actuators 123.

With the above configuration, the autonomous moving body 11 controls the motions of its joints, eyeballs, and tail 24 with high accuracy and flexibility, and can thereby reproduce motions and emotional expressions closer to those of a real living creature.

<Examples of How Emotions Are Expressed with the Tail 24>
Dogs express various emotions through the movement of their tails, although there are some differences depending on breed, size, and so on. In other words, a dog's emotions appear in the movement of its tail. For example, a dog's emotions are expressed by the vertical position of the tail and by the speed and amplitude of the up-and-down or side-to-side wagging of the tail.

For example, when a dog is happy, having fun, or wary of another, the tail is raised higher than usual. When a dog is anxious, frightened, or wary of another, the tail is lowered below its usual position. The speed and amplitude of the wagging change, for example, with the strength of the emotion.

 これに対して、自律移動体11は、上述したように、尾部24を曲げ伸ばしたり、回転させたりすることができる。そして、自律移動体11は、尾部24を曲げ伸ばしたり、回転させたりする動作を制御することにより、現実のイヌと同様に感情を表現することができる。 In contrast, the autonomous mobile body 11 can bend, straighten, and rotate the tail 24, as described above. By controlling the movements of bending, straightening, and rotating the tail 24, the autonomous mobile body 11 can express emotions in the same way as a real dog.

 ここで、図8乃至図11を参照して、尾部24による自律移動体11の感情の表現方法の例について説明する。 Now, with reference to Figures 8 to 11, we will explain an example of how the tail 24 expresses the emotions of the autonomous mobile body 11.

 例えば、自律移動体11は、図8のAに示されるように尾部24を下に垂らし、図8のB及びCに示されるように、そのまま尾部24を左右に振ることにより、リラックスした感情を表現する。 For example, the autonomous mobile body 11 expresses a relaxed feeling by letting the tail 24 hang down as shown in FIG. 8A, and then swinging the tail 24 from side to side as shown in FIG. 8B and C.

 例えば、自律移動体11は、図9のA及びBに示されるように尾部24を上下に振ることにより、換言すれば、尾部24を下に垂らしたり上方向に巻き上げたりする動作を繰り返すことにより、ワクワクした感情を表現する。 For example, the autonomous mobile body 11 expresses excitement by swinging the tail 24 up and down as shown in A and B of FIG. 9, in other words, by repeatedly letting the tail 24 hang down and curling it upward.

 例えば、図10のA乃至Cに示されるように、自律移動体11は、図10のAに示されるように尾部24を上方向に湾曲させ、図10のB及びCに示されるように、そのまま尾部24を左右に振ることにより、ハイテンションであることを表現する。 For example, as shown in A to C of FIG. 10, the autonomous mobile body 11 expresses being in high spirits by curving the tail 24 upward as shown in A of FIG. 10, and then swinging the tail 24 from side to side as shown in B and C of FIG. 10.

 例えば、自律移動体11は、図11のA及びBに示されるように尾部24を180度回転させた後、図11のCに示されるように尾部24を下方向に巻き込むことにより、怯えた感情を表現する。 For example, the autonomous mobile body 11 expresses a frightened emotion by rotating the tail 24 180 degrees as shown in A and B of FIG. 11, and then curling the tail 24 downward as shown in C of FIG. 11.
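The correspondence between emotions and tail movements described with reference to Figs. 8 to 11 can be sketched as a simple lookup table. The following Python sketch is illustrative only; the parameter names and numeric values are assumptions for explanation and do not appear in the embodiment.

```python
# Hypothetical mapping from an emotion label to tail-motion parameters.
# All names and values below are illustrative assumptions, not taken from
# the specification.
TAIL_MOTIONS = {
    # emotion: (pitch bend [deg], swing axis, swing amplitude [deg], swing speed [deg/s])
    "relaxed":      (-60.0, "roll",  30.0,  60.0),  # tail hangs down, slow side-to-side wag (Fig. 8)
    "excited":      (  0.0, "pitch", 90.0, 180.0),  # repeated up-down curl (Fig. 9)
    "high_spirits": ( 60.0, "roll",  45.0, 240.0),  # tail curved up, fast side-to-side wag (Fig. 10)
    "frightened":   (-90.0, "roll",   0.0,   0.0),  # tail rotated and curled downward (Fig. 11)
}

def tail_motion_for(emotion: str):
    """Return the tail-motion parameter tuple for an emotion label."""
    return TAIL_MOTIONS[emotion]
```

A behavior planner could look up this table when a target emotion is selected and hand the parameters to the tail drive unit.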

  <自律移動体11の機能構成例>
 次に、図12を参照して、自律移動体11の機能構成例について説明する。自律移動体11は、入力部201、通信部202、情報処理部203、駆動部204、出力部205、及び、記憶部206を備える。
<Example of Functional Configuration of Autonomous Moving Body 11>
Next, an example of the functional configuration of the autonomous mobile body 11 will be described with reference to Fig. 12. The autonomous mobile body 11 includes an input unit 201, a communication unit 202, an information processing unit 203, a driving unit 204, an output unit 205, and a storage unit 206.

 入力部201は、図4に示される各種のセンサ等を備え、ユーザや周囲の状況に関する各種のセンサデータを収集する機能を備える。また、入力部201は、例えば、スイッチ、ボタン等の入力デバイスを備える。入力部201は、収集したセンサデータ、及び、入力デバイスを介して入力される入力データを情報処理部203に供給する。 The input unit 201 includes various sensors as shown in FIG. 4 and has a function of collecting various sensor data related to the user and the surrounding conditions. The input unit 201 also includes input devices such as switches and buttons. The input unit 201 supplies the collected sensor data and input data input via the input devices to the information processing unit 203.

 通信部202は、他の自律移動体11、スマートフォン等の情報処理端末(不図示)、及び、情報処理サーバ(不図示)と通信を行い、各種のデータの送受信を行う。通信部202は、受信したデータを情報処理部203に供給し、送信するデータを情報処理部203から取得する。 The communication unit 202 communicates with other autonomous mobile bodies 11, information processing terminals such as smartphones (not shown), and information processing servers (not shown), and transmits and receives various types of data. The communication unit 202 supplies the received data to the information processing unit 203, and obtains data to be transmitted from the information processing unit 203.

 なお、通信部202の通信方式は、特に限定されず、仕様や運用に応じて柔軟に変更することが可能である。 The communication method of the communication unit 202 is not particularly limited and can be flexibly changed according to the specifications and operation.

 情報処理部203は、例えば、CPU(Central Processing Unit)等のプロセッサ等を備え、各種の情報処理を行ったり、自律移動体11の各部の制御を行ったりする。情報処理部203は、認識部221、学習部222、行動計画部223、及び、動作制御部224を備える。 The information processing unit 203 includes, for example, a processor such as a CPU (Central Processing Unit), and performs various types of information processing and controls each part of the autonomous mobile body 11. The information processing unit 203 includes a recognition unit 221, a learning unit 222, an action planning unit 223, and an operation control unit 224.

 認識部221は、入力部201から供給されるセンサデータ及び入力データ、通信部202から供給される受信データ、並びに、駆動部204から供給される、尾部24に対して外部から与えられる外力の推定結果を示すデータ(以下、外力推定データと称する)に基づいて、自律移動体11が置かれている状況の認識を行う。自律移動体11が置かれている状況は、例えば、自分及び周囲の状況を含む。自分の状況は、例えば、自律移動体11の状態及び動きを含む。周囲の状況は、例えば、ユーザ等の周囲の人の状態、動き、及び、指示、ペット等の周囲の生物の状態及び動き、周囲の物体の状態及び動き、時間、場所、並びに、周囲の環境等を含む。周囲の物体は、例えば、他の自律移動体を含む。 The recognition unit 221 recognizes the situation in which the autonomous mobile body 11 is placed, based on the sensor data and input data supplied from the input unit 201, the received data supplied from the communication unit 202, and data (hereinafter referred to as external force estimation data) supplied from the drive unit 204 indicating the estimated results of the external force applied to the tail unit 24 from the outside. The situation in which the autonomous mobile body 11 is placed includes, for example, the situation of itself and its surroundings. The situation of itself includes, for example, the state and movement of the autonomous mobile body 11. The surrounding situation includes, for example, the state, movement, and instructions of people in the vicinity such as a user, the state and movement of surrounding living things such as pets, the state and movement of surrounding objects, time, place, and the surrounding environment. The surrounding objects include, for example, other autonomous mobile bodies.

 また、認識部221は、状況を認識するために、例えば、人識別、表情や視線の認識、感情認識、物体認識、動作認識、空間領域認識、色認識、形認識、マーカ認識、障害物認識、段差認識、明るさ認識、温度認識、音声認識、単語理解、位置推定、姿勢推定等を行う。 In addition, in order to recognize the situation, the recognition unit 221 performs, for example, person identification, facial expression and gaze recognition, emotion recognition, object recognition, action recognition, spatial area recognition, color recognition, shape recognition, marker recognition, obstacle recognition, step recognition, brightness recognition, temperature recognition, voice recognition, word understanding, position estimation, posture estimation, etc.

 さらに、認識部221は、認識した各種の情報に基づいて、状況を推定し、理解する機能を備える。この際、認識部221は、事前に記憶される知識を用いて総合的に状況の推定を行ってもよい。 Furthermore, the recognition unit 221 has a function of estimating and understanding the situation based on the various recognized information. At this time, the recognition unit 221 may make a comprehensive estimation of the situation using knowledge stored in advance.

 認識部221は、状況の認識結果又は推定結果を示すデータ(以下、状況データと称する)を学習部222及び行動計画部223に供給する。また、認識部221は、状況データを、記憶部206に記憶されている行動履歴データに登録する。 The recognition unit 221 supplies data indicating the result of the recognition or estimation of the situation (hereinafter referred to as situation data) to the learning unit 222 and the action planning unit 223. In addition, the recognition unit 221 registers the situation data in the action history data stored in the memory unit 206.

 行動履歴データは、自律移動体11の行動の履歴を示すデータである。行動履歴データは、例えば、行動を開始した日時、行動を終了した日時、行動を実行したきっかけ、行動が指示された場所(ただし、場所が指示された場合)、行動したときの状況、行動を完了したか(行動を最後まで実行したか)否かの項目を含む。 The behavior history data is data that indicates the history of the behavior of the autonomous mobile body 11. The behavior history data includes, for example, items such as the date and time when the behavior started, the date and time when the behavior ended, the trigger for performing the behavior, the location where the behavior was instructed (if a location was instructed), the situation when the behavior was performed, and whether the behavior was completed (whether the behavior was performed to the end).
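The items of the behavior history data listed above can be sketched as a record type. The field names below are illustrative assumptions; the specification only names the items, not their representation.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class BehaviorRecord:
    """One entry of the behavior history data (field names are illustrative)."""
    started_at: str          # date and time the behavior started
    ended_at: str            # date and time the behavior ended
    trigger: str             # what triggered the behavior
    location: Optional[str]  # instructed location, if a location was instructed
    situation: str           # situation when the behavior was performed
    completed: bool          # whether the behavior was executed to the end
```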

 学習部222は、入力部201から供給されるセンサデータ及び入力データ、通信部202から供給される受信データ、認識部221から供給される状況データ、行動計画部223から供給される自律移動体11の行動に関するデータ、及び、記憶部206に記憶されている行動履歴データに基づいて、状況と行動、及び、当該行動による環境への作用を学習する。例えば、学習部222は、パターン認識学習を行ったり、ユーザの躾に対応する行動パターンの学習を行ったりする。 The learning unit 222 learns the situation, behavior, and the effect of the behavior on the environment based on the sensor data and input data supplied from the input unit 201, the received data supplied from the communication unit 202, the situation data supplied from the recognition unit 221, the data related to the behavior of the autonomous mobile body 11 supplied from the behavior planning unit 223, and the behavior history data stored in the memory unit 206. For example, the learning unit 222 performs pattern recognition learning and learns behavior patterns corresponding to the user's discipline.

 例えば、学習部222は、深層学習(Deep Learning)等の機械学習アルゴリズムを用いて、上記の学習を実現する。なお、学習部222が採用する学習アルゴリズムは、上記の例に限定されず、適宜設計可能である。 For example, the learning unit 222 realizes the above learning by using a machine learning algorithm such as deep learning. Note that the learning algorithm adopted by the learning unit 222 is not limited to the above example, and can be designed as appropriate.

 学習部222は、学習結果を示すデータ(以下、学習結果データと称する)を行動計画部223に供給したり、記憶部206に記憶させたりする。 The learning unit 222 supplies data indicating the learning results (hereinafter referred to as learning result data) to the action planning unit 223 and stores the data in the memory unit 206.

 行動計画部223は、認識又は推定された状況、学習結果データ、及び、動作制御部224から供給される自律移動体11の状態を示すフィードバックデータに基づいて、自律移動体11が行う行動(例えば、振る舞い)を計画する。行動計画部223は、計画した行動を示すデータ(以下、行動計画データと称する)を動作制御部224に供給する。また、行動計画部223は、自律移動体11の行動に関するデータを学習部222に供給したり、記憶部206に記憶されている行動履歴データに登録したりする。 The action planning unit 223 plans an action (e.g., behavior) to be performed by the autonomous mobile body 11 based on the recognized or estimated situation, the learning result data, and feedback data indicating the state of the autonomous mobile body 11 supplied from the operation control unit 224. The action planning unit 223 supplies data indicating the planned action (hereinafter referred to as action plan data) to the operation control unit 224. The action planning unit 223 also supplies data regarding the action of the autonomous mobile body 11 to the learning unit 222, and registers the data in the action history data stored in the memory unit 206.

 動作制御部224は、行動計画データに基づいて、駆動部204及び出力部205を制御することにより、計画された行動を実行するように自律移動体11の動作を制御する。動作制御部224は、例えば、行動計画に基づいて、アクチュエータ71及びアクチュエータ123の回転制御や、ディスプレイ51の表示制御、スピーカによる音声出力制御などを行う。動作制御部224は、自律移動体11の動作の制御後の状態を示すフィードバックデータを行動計画部223に供給する。 The operation control unit 224 controls the operation of the autonomous mobile body 11 to execute the planned action by controlling the drive unit 204 and the output unit 205 based on the action plan data. For example, the operation control unit 224 controls the rotation of the actuators 71 and 123, the display of the display 51, and the audio output from the speaker based on the action plan. The operation control unit 224 supplies feedback data indicating the state of the autonomous mobile body 11 after the control of its operation to the action plan unit 223.

 駆動部204は、関節駆動部231及び尾部駆動部232を備える。 The drive unit 204 includes a joint drive unit 231 and a tail drive unit 232.

 関節駆動部231は、動作制御部224による制御に基づいて、各関節部が備えるアクチュエータ71を駆動することにより、自律移動体11が有する複数の関節部を屈伸させる。 The joint driving unit 231 drives the actuators 71 provided in each joint based on the control of the motion control unit 224, thereby bending and extending the multiple joints of the autonomous mobile body 11.

 尾部駆動部232は、動作制御部224による制御に基づいて、自律移動体11の尾部24を駆動する。尾部駆動部232は、曲げ伸ばし駆動部232a及び回転駆動部232bを備える。 The tail drive unit 232 drives the tail 24 of the autonomous mobile body 11 based on the control of the motion control unit 224. The tail drive unit 232 includes a bending and straightening drive unit 232a and a rotation drive unit 232b.

 曲げ伸ばし駆動部232aは、動作制御部224による制御に基づいて、ピッチ軸アクチュエータ123aを含む曲げ伸ばし機構を駆動することにより、尾部24を曲げたり伸ばしたりする。曲げ伸ばし駆動部232aは、尾部24に加えられる曲げ伸ばし方向の外力を推定し、外力の推定結果を示す外力推定データを情報処理部203に供給する。 The bending and straightening drive unit 232a bends and straightens the tail section 24 by driving a bending and straightening mechanism including the pitch axis actuator 123a based on the control of the motion control unit 224. The bending and straightening drive unit 232a estimates the external force applied to the tail section 24 in the bending and straightening direction, and supplies external force estimation data indicating the estimated external force to the information processing unit 203.

 回転駆動部232bは、動作制御部224による制御に基づいて、ロール軸アクチュエータ123bを含む回転機構を駆動することにより、尾部24をロール軸回りに回転させる。回転駆動部232bは、尾部24に加えられる回転方向の外力を推定し、外力の推定結果を示す外力推定データを情報処理部203に供給する。 The rotation drive unit 232b drives a rotation mechanism including the roll axis actuator 123b under the control of the motion control unit 224 to rotate the tail section 24 around the roll axis. The rotation drive unit 232b estimates the external force in the rotational direction applied to the tail section 24, and supplies external force estimation data indicating the estimated external force to the information processing unit 203.

 なお、以下、曲げ伸ばし駆動部232a及び回転駆動部232bを個々に区別する必要がない場合、尾部駆動部232と総称する。 In the following, when there is no need to distinguish between the bending and straightening drive unit 232a and the rotation drive unit 232b, they will be collectively referred to as the tail drive unit 232.

 出力部205は、例えば、ディスプレイ51、スピーカ、ハプティクスデバイス等を備え、動作制御部224による制御に基づいて、視覚情報、聴覚情報、触覚情報等の出力を行う。 The output unit 205 includes, for example, a display 51, a speaker, a haptic device, etc., and outputs visual information, auditory information, tactile information, etc. based on the control of the operation control unit 224.

 記憶部206は、例えば、不揮発性メモリ及び揮発性メモリを備え、各種のプログラム及びデータを記憶する。 The storage unit 206 includes, for example, non-volatile memory and volatile memory, and stores various programs and data.

  <尾部駆動部232の構成例>
 図13は、尾部駆動部232(曲げ伸ばし駆動部232a及び回転駆動部232b)の構成例を示している。
<Configuration example of tail drive unit 232>
FIG. 13 shows an example of the configuration of the tail drive unit 232 (the bending and straightening drive unit 232a and the rotation drive unit 232b).

 尾部駆動部232は、力制御器261、位置制御器262、指示値合成器263、ドライバ264、及び、力推定器265を備える。 The tail drive unit 232 includes a force controller 261, a position controller 262, an instruction value combiner 263, a driver 264, and a force estimator 265.

 力制御器261は、アクチュエータ123により尾部24に加える力(以下、駆動力と称する)の目標値を示すデータを動作制御部224から取得する。力制御器261は、尾部24に対して外部から与えられる外力の推定値を示すデータを力推定器265から取得する。力制御器261は、駆動力の目標値、及び、外力の推定値に基づいて、尾部24に加える駆動力の指示値を算出する。力制御器261は、尾部24に加える駆動力の指示値を示すデータを指示値合成器263に供給する。 The force controller 261 obtains data indicating a target value of the force (hereinafter referred to as driving force) to be applied to the tail section 24 by the actuator 123 from the motion control section 224. The force controller 261 obtains data indicating an estimated value of an external force applied to the tail section 24 from the force estimator 265. The force controller 261 calculates a command value of the driving force to be applied to the tail section 24 based on the target value of the driving force and the estimated value of the external force. The force controller 261 supplies data indicating the command value of the driving force to be applied to the tail section 24 to the command value combiner 263.

 例えば、曲げ伸ばし駆動部232aの力制御器261は、尾部24に対する曲げ伸ばし方向の駆動力の目標値及び曲げ伸ばし方向の外力の推定値に基づいて、尾部24に対する曲げ伸ばし方向の駆動力の指示値を算出する。曲げ伸ばし駆動部232aの力制御器261は、尾部24に対する曲げ伸ばし方向の駆動力の指示値を示すデータを指示値合成器263に供給する。 For example, the force controller 261 of the bending and straightening drive unit 232a calculates an instruction value of the driving force in the bending and straightening direction for the tail 24 based on a target value of the driving force in the bending and straightening direction for the tail 24 and an estimated value of the external force in the bending and straightening direction. The force controller 261 of the bending and straightening drive unit 232a supplies data indicating the instruction value of the driving force in the bending and straightening direction for the tail 24 to the instruction value combiner 263.

 例えば、回転駆動部232bの力制御器261は、尾部24に対する回転方向の駆動力の目標値及び回転方向の外力の推定値に基づいて、尾部24の回転方向の駆動力の指示値を算出する。回転駆動部232bの力制御器261は、尾部24に対する回転方向の駆動力の指示値を示すデータを指示値合成器263に供給する。 For example, the force controller 261 of the rotational drive unit 232b calculates a command value for the driving force in the rotational direction of the tail section 24 based on a target value for the driving force in the rotational direction for the tail section 24 and an estimated value for the external force in the rotational direction. The force controller 261 of the rotational drive unit 232b supplies data indicating the command value for the driving force in the rotational direction for the tail section 24 to the command value combiner 263.

 位置制御器262は、アクチュエータ123により尾部24を動かす位置(又は速度)の目標値を示すデータを動作制御部224から取得する。位置制御器262は、尾部24の位置(又は速度)の検出値を示すデータを位置センサ251から取得する。位置制御器262は、尾部24の位置(又は速度)の目標値及び検出値に基づいて、尾部24の位置(又は速度)の指示値を算出する。位置制御器262は、尾部24の位置(又は速度)の指示値を示すデータを指示値合成器263に供給する。 The position controller 262 obtains data indicating a target value of the position (or speed) at which the actuator 123 moves the tail section 24 from the motion control section 224. The position controller 262 obtains data indicating a detected value of the position (or speed) of the tail section 24 from the position sensor 251. The position controller 262 calculates an instruction value for the position (or speed) of the tail section 24 based on the target value and the detected value of the position (or speed) of the tail section 24. The position controller 262 supplies data indicating the instruction value for the position (or speed) of the tail section 24 to the instruction value combiner 263.

 例えば、曲げ伸ばし駆動部232aの位置制御器262は、尾部24の曲げ伸ばし方向の位置(又は速度)の目標値及び検出値に基づいて、尾部24の曲げ伸ばし方向の位置(又は速度)の指示値を算出する。曲げ伸ばし駆動部232aの位置制御器262は、尾部24の曲げ伸ばし方向の位置(又は速度)の指示値を示すデータを指示値合成器263に供給する。 For example, the position controller 262 of the bending and straightening drive unit 232a calculates an instruction value for the position (or speed) in the bending and straightening direction of the tail section 24 based on a target value and a detected value of the position (or speed) in the bending and straightening direction of the tail section 24. The position controller 262 of the bending and straightening drive unit 232a supplies data indicating the instruction value for the position (or speed) in the bending and straightening direction of the tail section 24 to the instruction value combiner 263.

 例えば、回転駆動部232bの位置制御器262は、尾部24の回転方向の位置(又は速度)の目標値及び検出値に基づいて、尾部24の回転方向の位置(又は速度)の指示値を算出する。回転駆動部232bの位置制御器262は、尾部24の回転方向の位置(又は速度)の指示値を示すデータを指示値合成器263に供給する。 For example, the position controller 262 of the rotation drive unit 232b calculates an instruction value for the position (or speed) in the rotation direction of the tail section 24 based on a target value and a detected value of the position (or speed) in the rotation direction of the tail section 24. The position controller 262 of the rotation drive unit 232b supplies data indicating the instruction value for the position (or speed) in the rotation direction of the tail section 24 to the instruction value combiner 263.

 指示値合成器263は、尾部24に加える駆動力の指示値、及び、尾部24の位置(又は速度)の指示値に基づいて、アクチュエータ123の回転量及び回転速度の目標値を算出する。指示値合成器263は、アクチュエータ123の回転量及び回転速度の目標値を達成するための電流値を算出する。指示値合成器263は、算出した電流値を示す電流指示値を示すデータをドライバ264及び力推定器265に供給する。 The instruction value combiner 263 calculates target values for the amount of rotation and the rotation speed of the actuator 123 based on an instruction value for the driving force to be applied to the tail section 24 and an instruction value for the position (or speed) of the tail section 24. The instruction value combiner 263 calculates a current value for achieving the target values for the amount of rotation and the rotation speed of the actuator 123. The instruction value combiner 263 supplies data indicating a current instruction value indicating the calculated current value to the driver 264 and the force estimator 265.

 例えば、曲げ伸ばし駆動部232aの指示値合成器263は、尾部24の曲げ伸ばし方向の駆動力の指示値、及び、尾部24の曲げ伸ばし方向の位置(又は速度)の指示値に基づいて、ピッチ軸アクチュエータ123aの回転量及び回転速度の目標値を算出する。曲げ伸ばし駆動部232aの指示値合成器263は、ピッチ軸アクチュエータ123aの回転量及び回転速度の目標値を達成するための電流値を算出する。曲げ伸ばし駆動部232aの指示値合成器263は、算出した電流値を示す電流指示値を示すデータをドライバ264及び力推定器265に供給する。 For example, the instruction value combiner 263 of the bending and straightening drive unit 232a calculates target values for the rotation amount and rotation speed of the pitch axis actuator 123a based on an instruction value for the driving force in the bending and straightening direction of the tail section 24 and an instruction value for the position (or speed) in the bending and straightening direction of the tail section 24. The instruction value combiner 263 of the bending and straightening drive unit 232a calculates a current value for achieving the target values for the rotation amount and rotation speed of the pitch axis actuator 123a. The instruction value combiner 263 of the bending and straightening drive unit 232a supplies data indicating a current instruction value indicating the calculated current value to the driver 264 and the force estimator 265.

 例えば、回転駆動部232bの指示値合成器263は、尾部24の回転方向の駆動力の指示値、及び、尾部24の回転方向の位置(又は速度)の指示値に基づいて、ロール軸アクチュエータ123bの回転量及び回転速度の目標値を算出する。回転駆動部232bの指示値合成器263は、ロール軸アクチュエータ123bの回転量及び回転速度の目標値を達成するための電流値を算出する。回転駆動部232bの指示値合成器263は、算出した電流値を示す電流指示値を示すデータをドライバ264及び力推定器265に供給する。 For example, the instruction value combiner 263 of the rotation drive unit 232b calculates target values for the rotation amount and rotation speed of the roll axis actuator 123b based on an instruction value for the driving force in the rotation direction of the tail section 24 and an instruction value for the position (or speed) in the rotation direction of the tail section 24. The instruction value combiner 263 of the rotation drive unit 232b calculates a current value for achieving the target values for the rotation amount and rotation speed of the roll axis actuator 123b. The instruction value combiner 263 of the rotation drive unit 232b supplies data indicating a current instruction value indicating the calculated current value to the driver 264 and the force estimator 265.
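The role of the instruction value combiner 263 — merging a driving-force instruction and a position instruction into a single current instruction — can be sketched as follows. This is a minimal sketch under assumed values: the gain `kp` and the torque constant `torque_per_amp` are placeholders, and the actual combiner also handles rotation amount and rotation speed targets.

```python
def combine_commands(force_cmd_nm: float,
                     pos_cmd_rad: float,
                     pos_now_rad: float,
                     kp: float = 5.0,
                     torque_per_amp: float = 0.2) -> float:
    """Merge a force command and a position command into one current command.

    The position error is converted into a corrective torque, summed with
    the commanded driving torque, and divided by an assumed motor torque
    constant to obtain the current to apply to the actuator.
    """
    corrective_torque = kp * (pos_cmd_rad - pos_now_rad)  # position branch
    total_torque = force_cmd_nm + corrective_torque       # force branch added
    return total_torque / torque_per_amp                  # current command [A]
```

With no position error, the current command is proportional to the force command alone; with no force command, it behaves as a proportional position controller.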

 ドライバ264は、指示値合成器263からの電流指示値に基づく電流をアクチュエータ123に印加する。ドライバ264は、アクチュエータ123の電流応答値を示すデータを力推定器265に供給する。 The driver 264 applies a current based on the current command value from the command value combiner 263 to the actuator 123. The driver 264 supplies data indicating the current response value of the actuator 123 to the force estimator 265.

 例えば、曲げ伸ばし駆動部232aのドライバ264は、指示値合成器263からの電流指示値に基づく電流をピッチ軸アクチュエータ123aに印加する。曲げ伸ばし駆動部232aのドライバ264は、ピッチ軸アクチュエータ123aの電流応答値を示すデータを力推定器265に供給する。 For example, the driver 264 of the bending and straightening drive unit 232a applies a current based on the current command value from the command value combiner 263 to the pitch axis actuator 123a. The driver 264 of the bending and straightening drive unit 232a supplies data indicating the current response value of the pitch axis actuator 123a to the force estimator 265.

 例えば、回転駆動部232bのドライバ264は、指示値合成器263からの電流指示値に基づく電流をロール軸アクチュエータ123bに印加する。回転駆動部232bのドライバ264は、ロール軸アクチュエータ123bの電流応答値を示すデータを力推定器265に供給する。 For example, the driver 264 of the rotation drive unit 232b applies a current based on the current command value from the command value combiner 263 to the roll axis actuator 123b. The driver 264 of the rotation drive unit 232b supplies data indicating the current response value of the roll axis actuator 123b to the force estimator 265.

 力推定器265は、アクチュエータ123の回転方向の位置(又は速度)の検出結果を示すデータを位置センサ251から取得する。力推定器265は、ドライバ264の電流応答値(アクチュエータ123への電流値)と、アクチュエータ123の回転方向の位置(又は速度)の検出値とに基づいて、尾部24に対する外力を推定する。例えば、力推定器265は、ドライバ264の電流応答値から想定されるアクチュエータ123の回転方向の位置(又は速度)と、実際に検出されたアクチュエータ123の回転方向の位置(又は速度)との差に基づいて、尾部24に対する外力の大きさ及び方向を推定する。力推定器265は、外力の推定値を示す外力推定データを力制御器261及び情報処理部203に供給する。 The force estimator 265 acquires data indicating the detection result of the position (or speed) of the actuator 123 in the rotational direction from the position sensor 251. The force estimator 265 estimates the external force on the tail section 24 based on the current response value of the driver 264 (current value to the actuator 123) and the detection value of the position (or speed) of the actuator 123 in the rotational direction. For example, the force estimator 265 estimates the magnitude and direction of the external force on the tail section 24 based on the difference between the position (or speed) of the actuator 123 in the rotational direction assumed from the current response value of the driver 264 and the actually detected position (or speed) of the actuator 123 in the rotational direction. The force estimator 265 supplies external force estimation data indicating an estimate of the external force to the force controller 261 and the information processing section 203.

 例えば、曲げ伸ばし駆動部232aの力推定器265は、ドライバ264の電流応答値(ピッチ軸アクチュエータ123aへの電流値)と、ピッチ軸アクチュエータ123aの回転方向の位置(又は速度)の検出値とに基づいて、尾部24に対する曲げ伸ばし方向の外力を推定する。力推定器265は、曲げ伸ばし方向の外力の推定値を示す外力推定データを力制御器261及び情報処理部203に供給する。 For example, the force estimator 265 of the bending and straightening drive unit 232a estimates the external force in the bending and straightening direction on the tail section 24 based on the current response value of the driver 264 (current value to the pitch axis actuator 123a) and the detected value of the position (or speed) in the rotational direction of the pitch axis actuator 123a. The force estimator 265 supplies external force estimation data indicating an estimated value of the external force in the bending and straightening direction to the force controller 261 and the information processing unit 203.

 回転駆動部232bの力推定器265は、ドライバ264の電流応答値(ロール軸アクチュエータ123bへの電流値)と、ロール軸アクチュエータ123bの回転方向の位置(又は速度)の検出値とに基づいて、尾部24に対する回転方向の外力を推定する。力推定器265は、回転方向の外力の推定値を示す外力推定データを力制御器261及び情報処理部203に供給する。 The force estimator 265 of the rotational drive unit 232b estimates the external force in the rotational direction on the tail section 24 based on the current response value of the driver 264 (current value to the roll axis actuator 123b) and the detected value of the position (or speed) in the rotational direction of the roll axis actuator 123b. The force estimator 265 supplies external force estimation data indicating an estimated value of the external force in the rotational direction to the force controller 261 and the information processing unit 203.

 なお、力推定器265は、アクチュエータ123の駆動性能が良い場合、ドライバ264の電流応答値の代わりに、ドライバ264への電流指示値を用いて、尾部24に対する外力を推定することも可能である。 In addition, if the driving performance of the actuator 123 is good, the force estimator 265 can also estimate the external force on the tail 24 using the current command value to the driver 264 instead of the current response value of the driver 264.
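The estimation principle of the force estimator 265 — comparing the actuator position expected from the applied current with the position actually detected — can be sketched as follows. The linear `stiffness` scaling and the `model` callable (mapping a current value to the no-load position) are assumptions for illustration.

```python
def estimate_external_force(current_response_a: float,
                            measured_pos_rad: float,
                            model,
                            stiffness_nm_per_rad: float = 2.0) -> float:
    """Estimate the external torque on the tail from the gap between the
    actuator position predicted from the current response value and the
    position detected by the position sensor.

    `model(current)` returns the position the actuator should reach with no
    external load; the linear stiffness scaling is a placeholder.
    """
    expected_pos = model(current_response_a)
    # Positive result: an external force pushed the tail past the expected
    # position; negative: an external force held it back.
    return stiffness_nm_per_rad * (measured_pos_rad - expected_pos)
```

The sign of the result gives the direction of the external force and its magnitude gives the strength, matching the description above.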

 位置センサ251は、例えば、図12の入力部201の一部を構成し、ピッチ軸アクチュエータ123a及びロール軸アクチュエータ123bに対してそれぞれ設けられる。位置センサ251は、ピッチ軸アクチュエータ123a又はロール軸アクチュエータ123bの物理的な回転量を検出することにより、ピッチ軸アクチュエータ123a又はロール軸アクチュエータ123bの回転方向の位置(又は速度)を検出する。位置センサ251は、ピッチ軸アクチュエータ123a又はロール軸アクチュエータ123bの回転方向の位置(又は速度)の検出値を示すデータを力推定器265及び情報処理部203に供給する。 The position sensor 251, for example, constitutes part of the input unit 201 in FIG. 12, and is provided for each of the pitch axis actuator 123a and the roll axis actuator 123b. The position sensor 251 detects the rotational position (or speed) of the pitch axis actuator 123a or the roll axis actuator 123b by detecting the physical amount of rotation of the pitch axis actuator 123a or the roll axis actuator 123b. The position sensor 251 supplies data indicating the detected value of the rotational position (or speed) of the pitch axis actuator 123a or the roll axis actuator 123b to the force estimator 265 and the information processing unit 203.

 ここで、図14を参照して、力推定器265による外力の推定方法の例について説明する。 Now, with reference to FIG. 14, we will explain an example of a method for estimating an external force using the force estimator 265.

 例えば、図14に示されるように、自律移動体11が、尾部24を上方向に曲げた状態で保持している場合、尾部24に外力が加わると、ピッチ軸アクチュエータ123aの消費電流が増減する。例えば、後ろ方向に引く力が尾部24に加わると、尾部24の位置を保持するために必要なピッチ軸アクチュエータ123aの消費電流が大きくなる。一方、例えば、前方向に押す力が尾部24に加わると、尾部24の位置を保持するために必要なピッチ軸アクチュエータ123aの消費電流が小さくなる。 For example, as shown in FIG. 14, when the autonomous mobile body 11 holds the tail section 24 in an upwardly bent state, the current consumption of the pitch axis actuator 123a increases or decreases when an external force is applied to the tail section 24. For example, when a pulling force in the backward direction is applied to the tail section 24, the current consumption of the pitch axis actuator 123a required to hold the position of the tail section 24 increases. On the other hand, for example, when a pushing force in the forward direction is applied to the tail section 24, the current consumption of the pitch axis actuator 123a required to hold the position of the tail section 24 decreases.

 これに対して、例えば、曲げ伸ばし駆動部232aの力推定器265は、ドライバ264の電流応答値から想定されるピッチ軸アクチュエータ123aの回転方向の位置(又は速度)と、実際に検出されたピッチ軸アクチュエータ123aの回転方向の位置(又は速度)との差に基づいて、尾部24への曲げ伸ばし方向の外力の大きさ及び方向を推定する。 In response to this, for example, the force estimator 265 of the bending and straightening drive unit 232a estimates the magnitude and direction of the external force in the bending and straightening direction on the tail section 24 based on the difference between the rotational position (or speed) of the pitch axis actuator 123a estimated from the current response value of the driver 264 and the actually detected rotational position (or speed) of the pitch axis actuator 123a.

 また、認識部221は、尾部24への外力の大きさ及び方向の推定結果に基づいて、自律移動体11の後方にある障害物を認識することができる。 The recognition unit 221 can also recognize obstacles behind the autonomous mobile body 11 based on the estimated magnitude and direction of the external force on the tail 24.

 例えば、自律移動体11がバックしている場合に、図15に示されるように、尾部24が壁301に接触したとき、壁301から尾部24に外力が加わる。 For example, when the autonomous mobile body 11 is backing up, as shown in FIG. 15, when the tail 24 comes into contact with the wall 301, an external force is applied from the wall 301 to the tail 24.

 これに対して、例えば、曲げ伸ばし駆動部232a又は回転駆動部232bの力推定器265は、上述した方法により、壁301による尾部24への外力の大きさ及び方向を推定する。また、認識部221は、尾部24への外力の大きさ及び方向の推定結果に基づいて、自律移動体11の後方に何らかの障害物が存在することを認識する。 In response to this, for example, the force estimator 265 of the bending and straightening drive unit 232a or the rotation drive unit 232b estimates the magnitude and direction of the external force on the tail 24 by the wall 301 using the method described above. In addition, the recognition unit 221 recognizes that there is an obstacle behind the autonomous moving body 11 based on the estimation result of the magnitude and direction of the external force on the tail 24.

 さらに、例えば、認識部221は、尾部24に加わる外力の大きさ及び方向の推定結果に基づいて、自律移動体11の後方の障害物を探査することができる。 Furthermore, for example, the recognition unit 221 can search for obstacles behind the autonomous mobile body 11 based on the estimated magnitude and direction of the external force acting on the tail 24.

 例えば、自律移動体11が、尾部24を上方向に少し曲げた状態でロール軸回りに両方向の交互に回転させることにより、尾部24を左右に振りながらバックする。この場合、尾部24が何らかの障害物に接触したとき、障害物により尾部24に外力が加わる。 For example, the autonomous mobile body 11 rotates the tail section 24 alternately in both directions around the roll axis with the tail section 24 bent slightly upward, thereby backing up while swinging the tail section 24 from side to side. In this case, when the tail section 24 comes into contact with an obstacle, an external force is applied to the tail section 24 by the obstacle.

 これに対して、例えば、曲げ伸ばし駆動部232a又は回転駆動部232bの力推定器265は、上述した方法により、障害物による尾部24への外力の大きさ及び方向を推定する。また、認識部221は、尾部24への外力の大きさ及び方向の推定結果に基づいて、自律移動体11の後方に何らかの障害物が存在することを認識する。 In response to this, for example, the force estimator 265 of the bending and straightening drive unit 232a or the rotation drive unit 232b estimates the magnitude and direction of the external force on the tail section 24 caused by the obstacle using the method described above. In addition, the recognition unit 221 recognizes that some kind of obstacle is present behind the autonomous moving body 11 based on the estimation result of the magnitude and direction of the external force on the tail section 24.
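The obstacle check while backing up — deciding that something is behind the body when the estimated tail force exceeds a contact level — can be sketched as a threshold test. The threshold value is a placeholder assumption.

```python
def detect_rear_obstacle(force_samples, threshold_nm: float = 0.15) -> bool:
    """Return True if any external-force estimate collected while wagging
    the tail during reverse travel exceeds the contact threshold
    (threshold is an illustrative value)."""
    return any(abs(f) > threshold_nm for f in force_samples)
```

In practice the samples would come from the force estimator 265 at each control cycle, and the recognition unit 221 would make this judgment.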

 また、例えば、認識部221は、尾部24に加わる外力の大きさ及び方向の推定結果に基づいて、尾部24により物体を把持できたか否かを判定することができる。 In addition, for example, the recognition unit 221 can determine whether or not the tail portion 24 has been able to grasp an object based on the estimated magnitude and direction of the external force acting on the tail portion 24.

 例えば、図16に示されるように、自律移動体11が、尾部24を上方向に曲げて、骨型の玩具311を把持した場合、玩具311により尾部24に外力が加わる。 For example, as shown in FIG. 16, when the autonomous mobile body 11 bends the tail 24 upward to grasp a bone-shaped toy 311, an external force is applied to the tail 24 by the toy 311.

 これに対して、例えば、曲げ伸ばし駆動部232aの力推定器265は、上述した方法により、玩具311による尾部24への外力の大きさ及び方向を推定する。また、認識部221は、尾部24への外力の大きさ及び方向の推定結果に基づいて、自律移動体11が尾部24により玩具311を把持したか否かを判定する。 In response to this, for example, the force estimator 265 of the bending and straightening drive unit 232a estimates the magnitude and direction of the external force applied to the tail section 24 by the toy 311 using the method described above. Furthermore, the recognition unit 221 determines whether or not the autonomous mobile body 11 has grasped the toy 311 with the tail section 24 based on the estimation result of the magnitude and direction of the external force applied to the tail section 24.

  <自律移動体11の動作例>
 次に、図17乃至図23を参照して、自律移動体11の動作例について説明する。
<Operation Example of the Autonomous Moving Body 11>
Next, an example of the operation of the autonomous moving body 11 will be described with reference to FIGS. 17 to 23.

   <好意的インタラクション動作>
 まず、図17のフローチャートを参照して、自律移動体11の好意的インタラクション動作について説明する。
<Positive interaction behavior>
First, the favorable interaction operation of the autonomous moving body 11 will be described with reference to the flowchart of FIG. 17.

 この処理は、例えば、行動計画部223により自律移動体11の動作モードが好意モードに設定されたとき開始される。好意モードとは、例えば、自律移動体11がユーザ等にかまってほしい場合に対応するモードである。 This process is started, for example, when the behavior planning unit 223 sets the operation mode of the autonomous mobile body 11 to the friendly mode. The friendly mode is, for example, a mode that corresponds to a case where the autonomous mobile body 11 wants attention from a user, etc.

 ステップS1において、自律移動体11は、プレイバウを実行する。プレイバウとは、例えば、図18のAに模式的に示されるように、自律移動体11が頭部21を低くし、尻部を高く上げた姿勢である。例えば、プレイバウは、自律移動体11がユーザと遊んでほしい場合等に行われる。 In step S1, the autonomous mobile body 11 executes a play bow. A play bow is, for example, a posture in which the autonomous mobile body 11 lowers its head 21 and raises its hindquarters high, as shown schematically in A of FIG. 18. For example, the play bow is performed when the autonomous mobile body 11 wants the user to play with it.

 具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、自律移動体11にプレイバウを実行させる。 Specifically, the operation control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform a play bow.

 ステップS2において、認識部221は、尾部24にユーザが接触したか否かを判定する。具体的には、認識部221は、尾部駆動部232の力推定器265からの外力推定データに基づいて、上述した方法により、尾部24にユーザが接触したか否かを判定する。尾部24にユーザが接触していないと判定された場合、処理はステップS3に進む。 In step S2, the recognition unit 221 determines whether or not the user has touched the tail 24. Specifically, the recognition unit 221 determines whether or not the user has touched the tail 24 by the method described above, based on the external force estimation data from the force estimator 265 of the tail drive unit 232. If it is determined that the user has not touched the tail 24, the process proceeds to step S3.

 ステップS3において、認識部221は、入力部201から供給されるセンサデータ及び入力データに基づいて、体にユーザが接触したか否かを判定する。この場合の体は、自律移動体11の尾部24以外の部位を対象とする。体にユーザが接触していないと判定された場合、処理はステップS4に進む。 In step S3, the recognition unit 221 determines whether or not the user has come into contact with the body based on the sensor data and input data supplied from the input unit 201. In this case, the body refers to any part of the autonomous mobile body 11 other than the tail 24. If it is determined that the user has not come into contact with the body, the process proceeds to step S4.

 ステップS4において、行動計画部223は、認識部221により認識又は推定された状況に基づいて、好意モードを継続するか否かを判定する。好意モードを継続すると判定された場合、処理はステップS1に戻る。 In step S4, the action planning unit 223 determines whether or not to continue the favor mode based on the situation recognized or estimated by the recognition unit 221. If it is determined that the favor mode should be continued, the process returns to step S1.

 その後、ステップS2において、尾部24にユーザが接触したと判定されるか、ステップS3において、体にユーザが接触したと判定されるか、ステップS4において、好意モードを終了すると判定されるまで、ステップS1乃至ステップS4の処理が繰り返し実行される。これにより、自律移動体11はプレイバウを継続する。 Then, the processing of steps S1 to S4 is repeatedly executed until it is determined in step S2 that the user has come into contact with the tail 24, in step S3 that the user has come into contact with the body, or in step S4 that the favor mode is to be ended. As a result, the autonomous mobile body 11 continues the play bow.

 一方、ステップS2において、尾部24にユーザが接触したと判定された場合、処理はステップS5に進む。 On the other hand, if it is determined in step S2 that the user has touched the tail 24, processing proceeds to step S5.

 ステップS5において、自律移動体11は、じゃれる。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、じゃれる動作を自律移動体11に実行させる。 In step S5, the autonomous mobile body 11 plays with the user. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform a playful action.

 例えば、自律移動体11は、ゆっくり一定の力で尾部24をユーザの手等に巻き付けることにより、じゃれる動作を表現する。この場合、例えば、曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を曲げることにより、ゆっくり一定の力で尾部24をユーザの手等に巻き付ける。 For example, the autonomous mobile body 11 expresses a playful motion by wrapping the tail 24 around the user's hand or the like with a slow, constant force. In this case, for example, the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail 24, thereby wrapping the tail 24 around the user's hand or the like with a slow, constant force.

 ステップS6において、認識部221は、ステップS2と同様の処理により、尾部24にユーザが接触しているか否かを判定する。尾部24にユーザが接触していると判定された場合、処理はステップS5に戻る。 In step S6, the recognition unit 221 determines whether or not the user is touching the tail 24 by the same process as in step S2. If it is determined that the user is touching the tail 24, the process returns to step S5.

 その後、ステップS6において、尾部24へのユーザの接触が終了したと判定されるまで、ステップS5及びステップS6の処理が繰り返し実行される。これにより、自律移動体11がじゃれる動作が継続される。 Then, in step S6, the processes of steps S5 and S6 are repeatedly executed until it is determined that the user's contact with the tail 24 has ended. This allows the autonomous moving body 11 to continue its playful action.

 一方、ステップS6において、尾部24へのユーザの接触が終了したと判定された場合、処理はステップS7に進む。 On the other hand, if it is determined in step S6 that the user's contact with the tail 24 has ended, processing proceeds to step S7.

 また、ステップS3において、体にユーザが接触したと判定された場合、処理はステップS7に進む。 If it is determined in step S3 that the user has come into contact with the body, the process proceeds to step S7.

 ステップS7において、自律移動体11は、喜びを表現する。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、喜びを表現する動作を自律移動体11に実行させる。 In step S7, the autonomous mobile body 11 expresses joy. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform an action that expresses joy.

 例えば、自律移動体11は、図18のBに模式的に示されるように、尾部24を高々と上げて小刻みに激しく左右に振ることにより喜びを表現する。この場合、例えば、曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を上方向に大きく曲げることにより持ち上げる。回転駆動部232bは、ロール軸アクチュエータ123bを駆動し、尾部24をロール軸回りに両方向に交互に小刻みに回転させることにより、尾部24を持ち上げた状態で左右に小刻みに激しく振る。 For example, the autonomous mobile body 11 expresses joy by raising the tail 24 high and vigorously swinging it from side to side, as shown diagrammatically in FIG. 18B. In this case, for example, the bending and straightening drive unit 232a drives the pitch axis actuator 123a to lift the tail 24 by bending it significantly upward. The rotation drive unit 232b drives the roll axis actuator 123b to rotate the tail 24 alternately in both directions around the roll axis, causing the tail 24 to swing vigorously from side to side in a lifted state.
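The alternating roll-axis rotation that produces this wag can be sketched as a periodic angle command. The sinusoidal profile and the amplitude and frequency values below are illustrative assumptions only; the specification does not fix them:

```python
import math

def wag_roll_command_deg(t, amplitude_deg=30.0, freq_hz=3.0):
    """Roll-axis angle command at time t [s] for a rapid side-to-side wag.

    The pitch-axis (bending) actuator first bends the tail upward to
    lift it; the roll-axis actuator then alternates direction, modelled
    here as a sinusoid so the reversal points are smooth.
    """
    return amplitude_deg * math.sin(2.0 * math.pi * freq_hz * t)
```

At the assumed 3 Hz, the command reverses direction six times per second, reaching +30 degrees a twelfth of a second into each cycle and -30 degrees a quarter second in.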

 その後、好意的インタラクション動作は終了する。 Then the positive interaction action ends.

 一方、ステップS4において、行動計画部223は、例えば、好意モードと異なるモードに遷移する状況が認識又は推定された場合、好意モードを終了すると判定し、好意的インタラクション動作は終了する。 On the other hand, in step S4, if the behavior planning unit 223 recognizes or estimates, for example, a situation in which a transition to a mode other than the favorable mode occurs, it determines to end the favorable mode, and the favorable interaction operation ends.
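As a compact summary, the branching of steps S1 through S7 above can be sketched as a loop over recognition results. The event tuples stand in for the recognition unit's outputs and the action strings for the motion control unit's commands; all names are illustrative:

```python
def friendly_interaction(events):
    """Trace the favorable-interaction flow of FIG. 17 (steps S1 to S7).

    events: iterable of (tail_touched, body_touched, keep_mode) tuples,
    one per control tick. Returns the list of commanded actions.
    """
    trace = []
    it = iter(events)
    for tail, body, keep in it:
        trace.append("play_bow")             # S1: head down, rear up
        if tail:                             # S2: user touched the tail
            trace.append("wrap_tail")        # S5: play (wrap the tail)
            for tail2, _, _ in it:           # S5/S6: keep playing while
                if not tail2:                #       the tail is touched
                    break
                trace.append("wrap_tail")
            trace.append("express_joy")      # S7
            return trace
        if body:                             # S3: user touched the body
            trace.append("express_joy")      # S7
            return trace
        if not keep:                         # S4: leave the friendly mode
            return trace
    return trace
```

For example, a tick with no touch, then a tail touch that ends on the following tick, yields the bow-play-joy sequence of the flowchart.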

   <敵対的インタラクション動作>
 次に、図19のフローチャートを参照して、自律移動体11の敵対的インタラクション動作について説明する。
<Adversarial interaction behavior>
Next, the hostile interaction operation of the autonomous mobile body 11 will be described with reference to the flowchart of FIG. 19.

 この処理は、例えば、行動計画部223により自律移動体11の動作モードが敵対モードに設定されたとき開始される。敵対モードとは、例えば、自律移動体11がユーザ等にかまってほしくない場合に対応するモードである。 This process is started, for example, when the behavior planning unit 223 sets the operation mode of the autonomous mobile body 11 to hostile mode. The hostile mode is, for example, a mode that corresponds to a case where the autonomous mobile body 11 does not want the user or the like to pay attention to it.

 ステップS51において、自律移動体11は、警戒姿勢をとる。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、自律移動体11に警戒姿勢をとらせる。 In step S51, the autonomous mobile body 11 takes an alert posture. Specifically, the operation control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to take an alert posture.

 例えば、図20のAに模式的に示されるように、自律移動体11は、尾部24を下げて、ゆっくり左右に振る。この場合、例えば、曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を伸ばすことにより下げる。回転駆動部232bは、ロール軸アクチュエータ123bを駆動し、尾部24をロール軸回りに両方向に交互にゆっくり回転させることにより、尾部24を下げた状態でゆっくり左右に振る。 For example, as shown schematically in A of FIG. 20, the autonomous mobile body 11 lowers the tail 24 and swings it slowly from side to side. In this case, for example, the bending and straightening drive unit 232a drives the pitch axis actuator 123a and lowers the tail 24 by straightening it. The rotation drive unit 232b drives the roll axis actuator 123b to slowly rotate the tail 24 alternately in both directions around the roll axis, causing the tail 24 to swing slowly from side to side in its lowered state.

 ステップS52において、図17のステップS2の処理と同様に、尾部24にユーザが接触したか否かが判定される。尾部24にユーザが接触していないと判定された場合、処理はステップS53に進む。 In step S52, similar to the process in step S2 of FIG. 17, it is determined whether or not the user has touched the tail 24. If it is determined that the user has not touched the tail 24, the process proceeds to step S53.

 ステップS53において、図17のステップS3の処理と同様に、体にユーザが接触したか否かが判定される。体にユーザが接触していないと判定された場合、処理はステップS54に進む。 In step S53, similar to the process in step S3 of FIG. 17, it is determined whether or not the user has touched the body. If it is determined that the user has not touched the body, the process proceeds to step S54.

 ステップS54において、行動計画部223は、認識部221により認識又は推定された状況に基づいて、敵対モードを継続するか否かを判定する。敵対モードを継続すると判定された場合、処理はステップS51に戻る。 In step S54, the action planning unit 223 determines whether or not to continue the hostile mode based on the situation recognized or estimated by the recognition unit 221. If it is determined that the hostile mode should be continued, the process returns to step S51.

 その後、ステップS52において、尾部24にユーザが接触したと判定されるか、ステップS53において、体にユーザが接触したと判定されるか、ステップS54において、敵対モードを終了すると判定されるまで、ステップS51乃至ステップS54の処理が繰り返し実行される。これにより、自律移動体11は警戒姿勢を継続する。 Then, the processing of steps S51 to S54 is repeatedly executed until it is determined in step S52 that the user has come into contact with the tail 24, in step S53 that the user has come into contact with the body, or in step S54 that the hostile mode has been terminated. As a result, the autonomous mobile body 11 continues its alert posture.

 一方、ステップS52において、尾部24にユーザが接触したと判定された場合、処理はステップS55に進む。 On the other hand, if it is determined in step S52 that the user has touched the tail 24, processing proceeds to step S55.

 ステップS55において、自律移動体11は、尾部24を隠す。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、図20のBに模式的に示されるように、尾部24を隠す動作を自律移動体11に実行させる。 In step S55, the autonomous mobile body 11 hides the tail 24. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to execute the action of hiding the tail 24, as shown diagrammatically in B of FIG. 20.

 この場合、例えば、回転駆動部232bは、ロール軸アクチュエータ123bを駆動し、尾部24をロール軸回りに180度回転させる。曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を曲げ、後脚部(脚部23HLと脚部23HR)の間に尾部24の先端を差し入れることにより、尾部24を隠す。 In this case, for example, the rotation drive unit 232b drives the roll axis actuator 123b to rotate the tail section 24 180 degrees around the roll axis. The bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24 and hide it by inserting the tip of the tail section 24 between the rear legs (legs 23HL and 23HR).

 その後、敵対的インタラクション動作は終了する。 Then the hostile interaction action ends.

 一方、ステップS53において、体にユーザが接触したと判定された場合、処理はステップS56に進む。 On the other hand, if it is determined in step S53 that the user has come into contact with the body, processing proceeds to step S56.

 ステップS56において、自律移動体11は、ユーザから逃げ、尾部24で怒りを表現する。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、ユーザから逃げ、尾部24で怒りを表現する動作を自律移動体11に実行させる。例えば、自律移動体11は、図20のCに模式的に示されるように、尾部24を高々と上げ、前のめりの姿勢をとる。 In step S56, the autonomous mobile body 11 runs away from the user and expresses anger with the tail 24. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action planning unit 223, thereby causing the autonomous mobile body 11 to execute the action of running away from the user and expressing anger with the tail 24. For example, as shown diagrammatically in C of FIG. 20, the autonomous mobile body 11 raises the tail 24 high and assumes a forward-leaning posture.

 この場合、例えば、曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を曲げることにより、尾部24を高々と持ち上げる。 In this case, for example, the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24, thereby lifting the tail section 24 high.

 その後、敵対的インタラクション動作は終了する。 Then the hostile interaction action ends.

 一方、ステップS54において、行動計画部223は、例えば、敵対モードと異なるモードに遷移する状況が認識又は推定された場合、敵対モードを終了すると判定し、敵対的インタラクション動作は終了する。 On the other hand, in step S54, if the behavior planning unit 223 recognizes or estimates a situation in which the mode will transition to a mode other than the hostile mode, it determines to end the hostile mode, and the hostile interaction operation ends.

   <後退動作>
 次に、図21のフローチャートを参照して、自律移動体11の後退動作について説明する。
<Reverse movement>
Next, the backward movement of the autonomous moving body 11 will be described with reference to the flowchart of FIG. 21.

 ステップS101において、自律移動体11は、尾部24を水平に振りながら後退する。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、尾部24を水平に振りながら後退する動作を自律移動体11に実行させる。 In step S101, the autonomous mobile body 11 moves backward while swinging the tail 24 horizontally. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform the action of moving backward while swinging the tail 24 horizontally.

 この場合、例えば、曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を曲げることにより、尾部24を略水平方向に持ち上げる。回転駆動部232bは、ロール軸アクチュエータ123bを駆動し、尾部24をロール軸回りに両方向に交互に回転させることにより、尾部24を略水平方向に持ち上げた状態で左右に振る。 In this case, for example, the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24, thereby lifting the tail section 24 in a substantially horizontal direction. The rotation drive unit 232b drives the roll axis actuator 123b to rotate the tail section 24 alternately in both directions around the roll axis, thereby swinging the tail section 24 from side to side while lifting it in a substantially horizontal direction.

 ステップS102において、図17のステップS2と同様の処理により、尾部24が何かに接触したか否かを判定する。尾部24が何かに接触していないと判定された場合、処理はステップS103に進む。 In step S102, a determination is made as to whether or not the tail 24 has come into contact with anything, using a process similar to that of step S2 in FIG. 17. If it is determined that the tail 24 has not come into contact with anything, the process proceeds to step S103.

 ステップS103において、認識部221は、入力部201から供給されるセンサデータ及び入力データに基づいて、目的地に到着したか否かを判定する。目的地に到着していないと判定された場合、処理はステップS101に戻る。 In step S103, the recognition unit 221 determines whether or not the destination has been reached based on the sensor data and input data supplied from the input unit 201. If it is determined that the destination has not been reached, the process returns to step S101.

 その後、ステップS102において、尾部24が何かに接触したと判定されるか、ステップS103において、目的地に到着したと判定されるまで、ステップS101乃至ステップS103の処理が繰り返し実行される。これにより、自律移動体11は、尾部24を水平に振りながら後退する動作を継続する。 Then, the processing of steps S101 to S103 is repeatedly executed until it is determined in step S102 that the tail section 24 has come into contact with something, or until it is determined in step S103 that the destination has been reached. As a result, the autonomous mobile body 11 continues to move backward while swinging the tail section 24 horizontally.

 一方、ステップS102において、尾部24が何かに接触したと判定された場合、処理はステップS104に進む。 On the other hand, if it is determined in step S102 that the tail 24 has come into contact with something, processing proceeds to step S104.

 ステップS104において、自律移動体11は、尾部24を振って障害物の位置を探る。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、立ち止まって尾部24を振って障害物の位置を探る動作を自律移動体11に実行させる。 In step S104, the autonomous mobile body 11 swings its tail 24 to search for the location of an obstacle. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, thereby causing the autonomous mobile body 11 to perform an action of stopping and swinging its tail 24 to search for the location of an obstacle.

 この場合、例えば、回転駆動部232bは、ロール軸アクチュエータ123bを駆動し、尾部24をロール軸回りに両方向に交互にゆっくり回転させることにより、尾部24を略水平方向に持ち上げた状態で、障害物の位置を探るようにゆっくり左右に振る。 In this case, for example, the rotation drive unit 232b drives the roll axis actuator 123b to slowly rotate the tail section 24 alternately in both directions around the roll axis, so that the tail section 24 is lifted in a substantially horizontal direction and slowly swings from side to side as if searching for the location of an obstacle.

 認識部221は、尾部駆動部232からの外力推定データにより示される尾部24への外力の大きさ及び方向に基づいて、障害物の位置を検出する。認識部221は、障害物の位置の検出結果を示すデータを行動計画部223に供給する。 Based on the external force estimation data from the tail drive unit 232, the recognition unit 221 detects the position of the obstacle from the magnitude and direction of the external force on the tail 24. The recognition unit 221 supplies data indicating the detection result of the obstacle position to the action planning unit 223.

 ステップS105において、行動計画部223は、障害物の位置の検出結果に基づいて、障害物を回避可能であるか否かを判定する。障害物を回避可能であると判定された場合、処理はステップS106に進む。 In step S105, the action planning unit 223 determines whether the obstacle can be avoided based on the detection result of the obstacle's position. If it is determined that the obstacle can be avoided, the process proceeds to step S106.

 ステップS106において、自律移動体11は、進行方向を変えて、障害物を避ける。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、自律移動体11が後退した場合に障害物を回避できる方向に、自律移動体11の進行方向を変更する。 In step S106, the autonomous mobile body 11 changes its traveling direction to avoid the obstacle. Specifically, the operation control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action planning unit 223, thereby changing the traveling direction of the autonomous mobile body 11 to a direction in which the autonomous mobile body 11 can avoid the obstacle when the autonomous mobile body 11 retreats.

 その後、処理はステップS101に戻り、ステップS101以降の処理が実行される。 Then, the process returns to step S101, and the processes from step S101 onwards are executed.

 一方、ステップS105において、障害物を回避できないと判定された場合、処理はステップS107に進む。 On the other hand, if it is determined in step S105 that the obstacle cannot be avoided, the process proceeds to step S107.

 ステップS107において、自律移動体11は、後退を中断し、障害物を見るモードに遷移する。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御することにより、後退を中断する。また、行動計画部223は、障害物を見るモードに自律移動体11のモードを変更する。 In step S107, the autonomous mobile body 11 stops retreating and transitions to a mode for viewing obstacles. Specifically, the operation control unit 224 stops retreating by controlling the drive unit 204 and the output unit 205 based on the action plan data supplied from the action planning unit 223. In addition, the action planning unit 223 changes the mode of the autonomous mobile body 11 to a mode for viewing obstacles.

 その後、後退動作は終了する。 Then the retreating motion ends.

 一方、ステップS103において、目的地に到着したと判定された場合、後退動作は終了する。 On the other hand, if it is determined in step S103 that the destination has been reached, the reverse operation ends.
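The loop of steps S101 through S107 can be summarized as follows. The per-tick dictionaries stand in for the recognition results (tail contact, arrival, avoidability of the probed obstacle) and the action strings for the motion control unit's commands; all names are illustrative:

```python
def retreat(ticks):
    """Trace the backward-movement flow of FIG. 21 (steps S101 to S107).

    ticks: per control tick, a dict with illustrative sensing results:
      'contact'   -- the tail touched something        (S102)
      'arrived'   -- the destination was reached       (S103)
      'avoidable' -- the probed obstacle is avoidable  (S105)
    Returns (outcome, action_trace).
    """
    trace = []
    for t in ticks:
        trace.append("back_up_wagging")           # S101
        if t.get("contact"):                      # S102
            trace.append("probe_with_tail")       # S104: stop and probe
            if t.get("avoidable"):                # S105
                trace.append("change_direction")  # S106, then keep going
                continue
            trace.append("stop")                  # S107
            return "look_at_obstacle", trace
        if t.get("arrived"):                      # S103
            return "arrived", trace
    return "interrupted", trace
```

An avoidable contact thus re-enters the backing-up loop, while an unavoidable one ends the retreat and hands control to the obstacle-viewing mode.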

   <物体把持動作>
 次に、図22のフローチャートを参照して、自律移動体11の物体把持動作について説明する。
<Object grasping action>
Next, the object grasping operation of the autonomous mobile body 11 will be described with reference to the flowchart of FIG. 22.

 なお、以下、図23のA乃至Cに示されるように、骨型の玩具である物体351を把持する場合の動作について説明する。 Below, we will explain the operation when grasping object 351, which is a bone-shaped toy, as shown in Figures 23A to 23C.

 ステップS151において、自律移動体11は、物体351に尻部を向ける。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御し、尻部が物体351の方向を向くように自律移動体11の向きを変える。 In step S151, the autonomous mobile body 11 faces its rear end toward the object 351. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, and changes the orientation of the autonomous mobile body 11 so that its rear end faces the object 351.

 ステップS152において、自律移動体11は、尾部24をまっすぐ下ろしたまま後進する。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御し、図23のAに模式的に示されるように、尾部24をまっすぐ下ろした状態で自律移動体11を後退させる。 In step S152, the autonomous mobile body 11 moves backward with the tail 24 held straight down. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, and moves the autonomous mobile body 11 backward with the tail 24 held straight down, as shown diagrammatically in FIG. 23A.

 ステップS153において、認識部221は、尾部駆動部232から供給される外力推定データにより示される尾部24への外力の大きさ及び方向に基づいて、尾部24の背中側に物体351が接触したか否かを判定する。図23のBに模式的に示されるように、尾部24の背中側に物体351が接触したと判定された場合、処理はステップS154に進む。 In step S153, the recognition unit 221 determines whether or not an object 351 has come into contact with the back of the tail 24, based on the magnitude and direction of the external force on the tail 24 indicated by the external force estimation data supplied from the tail drive unit 232. As shown diagrammatically in FIG. 23B, if it is determined that an object 351 has come into contact with the back of the tail 24, the process proceeds to step S154.

 ステップS154において、自律移動体11は、尾部24を物体351を掬い上げる方向に曲げる。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御し、図23のCに模式的に示されるように、尾部24を物体351を掬い上げる方向に曲げる。 In step S154, the autonomous mobile body 11 bends the tail 24 in a direction to scoop up the object 351. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223, and bends the tail 24 in a direction to scoop up the object 351, as shown diagrammatically in C of FIG. 23.

 この場合、例えば、曲げ伸ばし駆動部232aは、ピッチ軸アクチュエータ123aを駆動し、尾部24を上方向に曲げることにより、尾部24を物体351を掬い上げる方向に曲げる。 In this case, for example, the bending and straightening drive unit 232a drives the pitch axis actuator 123a to bend the tail section 24 upward, thereby bending the tail section 24 in a direction to scoop up the object 351.

 ステップS155において、認識部221は、尾部駆動部232から供給される外力推定データにより示される尾部24への外力の大きさ及び方向に基づいて、物体351の重さが尾部24にかかったか否かを判定する。物体351の重さが尾部24にかかっていないと判定された場合、処理はステップS156に進む。 In step S155, the recognition unit 221 determines whether or not the weight of the object 351 is acting on the tail 24, based on the magnitude and direction of the external force on the tail 24 indicated by the external force estimation data supplied from the tail drive unit 232. If it is determined that the weight of the object 351 is not acting on the tail 24, the process proceeds to step S156.

 一方、ステップS153において、尾部24の背中側に物体351が接触していないと判定された場合、ステップS154及びステップS155の処理はスキップされ、処理はステップS156に進む。 On the other hand, if it is determined in step S153 that the object 351 is not in contact with the back side of the tail 24, the processes in steps S154 and S155 are skipped and the process proceeds to step S156.

 ステップS156において、認識部221は、試行回数を完了したか否かを判定する。認識部221は、尾部24により物体を掬い上げる動作の試行回数が所定の回数に達していない場合、試行回数を完了していないと判定し、処理はステップS157に進む。 In step S156, the recognition unit 221 determines whether or not the number of trials has been completed. If the number of trials of the action of scooping up an object with the tail 24 has not reached a predetermined number, the recognition unit 221 determines that the number of trials has not been completed, and the process proceeds to step S157.

 ステップS157において、自律移動体11は、物体351から離れて尻部を向け直す。具体的には、動作制御部224は、行動計画部223から供給される行動計画データに基づいて、駆動部204及び出力部205を制御し、自律移動体11を物体351から離し、再度尻部を物体351の方向に向け直す。 In step S157, the autonomous mobile body 11 moves away from the object 351 and turns its hindquarters toward it again. Specifically, the motion control unit 224 controls the drive unit 204 and the output unit 205 based on the action plan data supplied from the action plan unit 223 to move the autonomous mobile body 11 away from the object 351 and turn its hindquarters toward the object 351 again.

 その後、処理はステップS152に戻り、ステップS155において、物体351の重さが尾部24にかかったと判定されるか、ステップS156において、試行回数を完了したと判定されるまで、ステップS152乃至ステップS157の処理が繰り返し実行される。これにより、物体351の把持に成功するか、又は、試行回数を完了するまで、尾部24により物体351を掬い上げる動作が繰り返される。 Then, the process returns to step S152, and steps S152 to S157 are repeatedly executed until it is determined in step S155 that the weight of the object 351 is applied to the tail 24, or it is determined in step S156 that the number of trials has been completed. As a result, the action of scooping up the object 351 with the tail 24 is repeated until the object 351 is successfully grasped or the number of trials has been completed.

 一方、ステップS155において、物体351の重さが尾部24にかかったと判定された場合、処理はステップS158に進む。 On the other hand, if it is determined in step S155 that the weight of the object 351 is acting on the tail 24, the process proceeds to step S158.

 ステップS158において、認識部221は、物体351の把持に成功したと判定する。 In step S158, the recognition unit 221 determines that the object 351 has been successfully grasped.

 その後、物体把持動作は終了する。 The object grasping operation then ends.

 一方、ステップS156において、認識部221は、尾部24で物体を掬い上げる動作の試行回数が所定の回数に達した場合、試行回数を完了したと判定し、処理はステップS159に進む。 On the other hand, in step S156, if the number of attempts to scoop up an object with the tail 24 reaches a predetermined number, the recognition unit 221 determines that the number of attempts has been completed, and the process proceeds to step S159.

 ステップS159において、認識部221は、物体351の把持に失敗したと判定する。 In step S159, the recognition unit 221 determines that grasping of the object 351 has failed.

 その後、物体把持動作は終了する。 The object grasping operation then ends.

 以上のようにして、自律移動体11の表現力を向上させることができる。特に、尾部24の形状や動作がより自然かつ躍動的になり、尾部24による自律移動体11の表現力が向上する。 In this way, the expressiveness of the autonomous moving body 11 can be improved. In particular, the shape and movement of the tail 24 become more natural and dynamic, improving the expressiveness of the autonomous moving body 11 by the tail 24.

 例えば、尾部24をロール軸回りに回転させ、尾部24をワイヤ121により曲げ伸ばしすることにより、尾部24をロール軸に垂直な全ての方向に曲げ伸ばしすることが可能になる。また、例えば、ワイヤを2本用いて尾部を上下左右に曲げ伸ばしできるようにした場合と比較して、尾部24の安定性が向上する。これにより、尾部24の動きの自由度が増え、躍動感のある滑らかな動きが可能になる。 For example, by rotating the tail section 24 around the roll axis and bending and straightening the tail section 24 with the wire 121, it becomes possible to bend and straighten the tail section 24 in all directions perpendicular to the roll axis. Also, compared to a case where, for example, two wires are used to enable the tail section to bend and straighten up, down, left and right, the stability of the tail section 24 is improved. This increases the degree of freedom of movement of the tail section 24, enabling smooth, dynamic movement.

 例えば、尾部24は弾性体111内にワイヤ121を1本挿入しただけなので、弾性体111の柔らかな触感が保持される。すなわち、柔軟かつコシのあるしなやかな尾部24が実現される。また、尾部24の根元を細くできる。これにより、より現実のイヌの尾に近い尾部24が実現される。その結果、例えば、尾部24に触ってみたい衝動をユーザに与えるとともに、ユーザが触れたときに心地よい触感を与えることができる。 For example, the tail 24 is made by simply inserting a single wire 121 into the elastic body 111, so the soft feel of the elastic body 111 is maintained. In other words, a flexible, firm and supple tail 24 is realized. The base of the tail 24 can also be made thin. This allows the tail 24 to resemble a real dog's tail. As a result, for example, the user is given the urge to touch the tail 24, and a pleasant feel is provided when the user touches it.

 また、例えば、上述したようにワイヤを2本用いた場合、常に強い張力で尾部を動かすことになり、尾部の摩耗が促進されることが想定される。一方、尾部24をロール軸回りに回転可能にすることにより、必要に応じてワイヤ121の張力を緩めることができ、尾部24の耐久性が向上する。 Also, for example, if two wires were used as described above, the tail would always be moved with strong tension, which is expected to accelerate wear on the tail. On the other hand, by making the tail 24 rotatable around the roll axis, the tension on the wire 121 can be loosened as needed, improving the durability of the tail 24.

 さらに、上述したように、機械的なセンサを設けずに、尾部24に対する外力の大きさ及び方向を推定したり、自律移動体11の後方の物体を検出したりすることが可能になる。また、外力の推定結果及び物体の検出結果に基づいて、自律移動体11の動作をリアルタイムに制御することが可能になる。 Furthermore, as described above, it becomes possible to estimate the magnitude and direction of the external force on the tail 24 and detect objects behind the autonomous mobile body 11 without providing any mechanical sensors. It also becomes possible to control the operation of the autonomous mobile body 11 in real time based on the results of the estimation of the external force and the results of the detection of the object.

 <<2.変形例>>
 以下、上述した本技術の実施の形態の変形例について説明する。
<<2. Modified Examples>>
Below, a modification of the above-described embodiment of the present technology will be described.

  <柔軟能動機構に関する変形例>
 図24及び図25は、柔軟能動機構の変形例を模式的に示している。図24は、柔軟能動機構の断面図を模式的に示している。図25のAは、柔軟能動機構の斜視図を模式的に示している。図25のBは、柔軟能動機構のワイヤ121の巻取り機構の上面図を模式的に示している。なお、図24において、図を分かりやすくするために、断面の一部において斜線のパターンが省略されている。また、図25のAでは、ベアリング122の図示が省略されている。
<Modifications of the Flexible Active Mechanism>
Figs. 24 and 25 schematically show a modified example of the flexible active mechanism. Fig. 24 schematically shows a cross-sectional view of the flexible active mechanism. Fig. 25A schematically shows a perspective view of the flexible active mechanism. Fig. 25B schematically shows a top view of the winding mechanism for the wire 121 of the flexible active mechanism. Note that in Fig. 24, the diagonal hatching pattern is omitted in part of the cross section to make the figure easier to understand. Also, in Fig. 25A, the bearing 122 is not shown.

 なお、図中、図6及び図7と対応する部分には同じ符号を付しており、その説明は適宜省略する。 In addition, parts in the figure that correspond to those in Figures 6 and 7 are given the same reference numerals, and their explanation will be omitted as appropriate.

 図24及び図25の柔軟能動機構は、図6及び図7の柔軟能動機構と比較して、尾部24を備える点が一致し、駆動機構101の代わりに駆動機構401を備える点が異なる。駆動機構401は、駆動機構101と比較して、ワイヤ121、ベアリング122、ピッチ軸アクチュエータ123a、ロール軸アクチュエータ123b、及び、ギア125を備える点で一致し、巻取り部124の代わりにピニオン411を備え、巻取り部412が追加されている点が異なる。すなわち、駆動機構401は、駆動機構101と比較して、ワイヤ121の巻取り機構の構成が異なっている。 The flexible active mechanism of Figs. 24 and 25 matches the flexible active mechanism of Figs. 6 and 7 in that it includes the tail section 24, and differs in that it includes a drive mechanism 401 instead of the drive mechanism 101. The drive mechanism 401 matches the drive mechanism 101 in that it includes the wire 121, the bearing 122, the pitch axis actuator 123a, the roll axis actuator 123b, and the gear 125, but differs in that it includes a pinion 411 in place of the winding section 124 and adds a winding section 412. In other words, the drive mechanism 401 differs from the drive mechanism 101 in the configuration of the winding mechanism for the wire 121.

 ピッチ軸アクチュエータ123aの先端には、ピニオン411が接続されている。ピニオン411は、曲線状のラックからなる巻取り部412と嵌合している。巻取り部412の尾部24から遠い方の端部付近に、ワイヤ121の一端が接続されている。 A pinion 411 is connected to the tip of the pitch axis actuator 123a. The pinion 411 is engaged with a winding section 412 consisting of a curved rack. One end of the wire 121 is connected to the end of the winding section 412 that is farther from the tail section 24.

 ピッチ軸アクチュエータ123aは、ピニオン411を回転させることにより、図24及び図25の矢印で示される方向及びその逆方向に、巻取り部412をスライドさせることができる。巻取り部412がスライドすることにより、巻取り部412に接続されているワイヤ121が巻き取られたり繰り出されたりする。これにより、図6及び図7の柔軟能動機構と同様に、ワイヤ121が引っ張られたり押し出されたりして、弾性体111(尾部24)がピッチ軸方向に変形する。 The pitch axis actuator 123a can slide the winding section 412 in the direction indicated by the arrows in Figures 24 and 25 and in the opposite direction by rotating the pinion 411. As the winding section 412 slides, the wire 121 connected to the winding section 412 is wound or unwound. As a result, the wire 121 is pulled or pushed out, and the elastic body 111 (tail section 24) is deformed in the pitch axis direction, similar to the flexible active mechanism in Figures 6 and 7.

 In the above description, an example was shown in which the flexible active mechanism includes only one wire, but it may include two or more wires. For example, in the flexible active mechanism of Figs. 6 and 7, one more set of a wire and a wire winding mechanism may be added so that the tail section 24 can be bent and straightened in the direction perpendicular to the arrow A1 (the yaw direction).
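 With two orthogonal wires, pitch and yaw pulls combine into a single bend direction. A geometric sketch of that combination, under a small-deflection assumption of our own (the function and variable names are illustrative, not from the specification):

```python
import math

def bend_direction(pitch_pull_m: float, yaw_pull_m: float):
    """Combine pitch-wire and yaw-wire pulls into one resultant bend.

    Under a small-deflection assumption, the tail bends toward the resultant
    of the two orthogonal wire displacements. Returns the bend azimuth
    (radians, 0 = pitch direction) and its magnitude. This geometric model
    is an illustrative assumption.
    """
    azimuth = math.atan2(yaw_pull_m, pitch_pull_m)
    magnitude = math.hypot(pitch_pull_m, yaw_pull_m)
    return azimuth, magnitude

# Equal pitch and yaw pulls bend the tail diagonally (45 degrees).
az, mag = bend_direction(0.003, 0.003)
```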

  <Modifications regarding sensing of tail section 24>
 For example, a sensor that detects force may be provided on the tail section 24, and the recognition unit 221 may detect an external force acting on the tail section 24 based on sensor data from the sensor.

 For example, a sensor that detects shape may be provided on the tail section 24, and the recognition unit 221 may detect the shape or deformation of the tail section 24 based on sensor data from the sensor.

 For example, a touch sensor may be provided on the tail section 24, and the recognition unit 221 may detect contact with the tail section 24 based on sensor data from the touch sensor. This improves the accuracy of detecting contact that applies almost no external force to the tail section 24, such as a gentle stroke.
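 The touch sensor complements force estimation: a gentle stroke produces almost no measurable force but still trips the touch sensor. One way to combine the two signals is sketched below; the threshold value and the OR-combination are illustrative assumptions, not logic taken from the specification:

```python
def recognize_contact(estimated_force_n: float,
                      touch_active: bool,
                      force_threshold_n: float = 0.5) -> bool:
    """Sketch of contact recognition for the tail section.

    Combines an external-force estimate (which may miss very light touches)
    with a touch sensor reading (which catches gentle strokes). The threshold
    and combination rule are illustrative assumptions.
    """
    return touch_active or abs(estimated_force_n) >= force_threshold_n

# A gentle stroke: negligible force, but the touch sensor fires.
stroke_detected = recognize_contact(0.02, touch_active=True)
# A firm push with no touch-sensor reading still registers.
push_detected = recognize_contact(1.2, touch_active=False)
```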

  <Other Modifications>
 The present technology can be applied not only to the dog-type quadruped robot described above but also to moving bodies such as pet-type robots to which a flexible active mechanism is applicable. For example, the present technology can be applied to animal-type robots other than dogs that have a tail. The present technology can also be applied to animal-type robots that have a part other than a tail that can be flexibly bent and straightened. Examples of such parts include ears and a jaw.

 The moving body to which the present technology can be applied may also be, for example, a moving body of which only a part moves while the moving body as a whole does not move. For example, the present technology can be applied to a moving body in which a flexible active mechanism is provided as a tail on a cushion or the like and only the tail moves.

 <<3. Others>>
  <Example of computer configuration>
 The above-described series of processes can be executed by hardware or by software. When the series of processes is executed by software, the programs constituting the software are installed on a computer. Here, the computer includes a computer built into dedicated hardware and, for example, a general-purpose personal computer capable of executing various functions by installing various programs.

 Fig. 26 is a block diagram showing an example of the hardware configuration of a computer that executes the above-described series of processes using a program.

 In a computer 1000, a CPU (Central Processing Unit) 1001, a ROM (Read Only Memory) 1002, and a RAM (Random Access Memory) 1003 are interconnected by a bus 1004.

 An input/output interface 1005 is further connected to the bus 1004. An input unit 1006, an output unit 1007, a storage unit 1008, a communication unit 1009, and a drive 1010 are connected to the input/output interface 1005.

 The input unit 1006 includes input switches, buttons, a microphone, an image sensor, and the like. The output unit 1007 includes a display, a speaker, and the like. The storage unit 1008 includes a hard disk, a non-volatile memory, and the like. The communication unit 1009 includes a network interface and the like. The drive 1010 drives a removable medium 1011 such as a magnetic disk, an optical disc, a magneto-optical disc, or a semiconductor memory.

 In the computer 1000 configured as described above, the CPU 1001 performs the above-described series of processes by, for example, loading a program recorded in the storage unit 1008 into the RAM 1003 via the input/output interface 1005 and the bus 1004 and executing the program.

 The program executed by the computer 1000 (CPU 1001) can be provided by being recorded on the removable medium 1011 as a package medium or the like, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.

 In the computer 1000, the program can be installed in the storage unit 1008 via the input/output interface 1005 by mounting the removable medium 1011 in the drive 1010. The program can also be received by the communication unit 1009 via a wired or wireless transmission medium and installed in the storage unit 1008. Alternatively, the program can be installed in advance in the ROM 1002 or the storage unit 1008.

 The program executed by the computer may be a program in which the processes are performed chronologically in the order described in this specification, or a program in which the processes are performed in parallel or at necessary timing such as when a call is made.

 In this specification, a system means a set of a plurality of components (devices, modules (parts), and the like), regardless of whether all the components are in the same housing. Therefore, a plurality of devices housed in separate housings and connected via a network, and one device in which a plurality of modules are housed in one housing, are both systems.

 Furthermore, the embodiments of the present technology are not limited to the above-described embodiments, and various modifications are possible without departing from the gist of the present technology.

 For example, the present technology can have a cloud computing configuration in which one function is shared and jointly processed by a plurality of devices via a network.

 Each step described in the above flowcharts can be executed by one device or shared and executed by a plurality of devices.

 Furthermore, when one step includes a plurality of processes, the plurality of processes included in that one step can be executed by one device or shared and executed by a plurality of devices.

  <Examples of configuration combinations>
 The present technology can also be configured as follows.

(1)
 A moving body including:
 a first part;
 a second part that is connected to the first part and includes an elastic body that can be bent and straightened;
 a bending and straightening mechanism that bends and straightens the second part by bending and straightening the elastic body using a wire inserted into the second part in a direction in which the second part extends from the first part; and
 a rotation mechanism that rotates the second part about a rotation axis parallel to a connection direction of the second part to the first part.
(2)
 The moving body according to (1), further including:
 a bending and straightening drive unit that drives the bending and straightening mechanism; and
 a rotation drive unit that drives the rotation mechanism.
(3)
 The moving body according to (2), wherein the bending and straightening drive unit estimates an external force in a bending and straightening direction applied to the second part based on a current value to a first actuator included in the bending and straightening mechanism and a position or a speed of the first actuator in a rotational direction.
(4)
 The moving body according to (3), wherein the rotation drive unit estimates an external force in a rotational direction applied to the second part based on a current value to a second actuator included in the rotation mechanism and a position or a speed of the second actuator in the rotational direction.
(5)
 The moving body according to (4), further including a recognition unit that recognizes contact of an object with the second part based on at least one of an estimation result of the external force in the bending and straightening direction and an estimation result of the external force in the rotational direction.
(6)
 The moving body according to (5), wherein
 the moving body is an animal-type robot,
 the first part is a body part,
 the second part is a tail, and
 the moving body retreats while swinging the tail and searching for an object behind the moving body.
(7)
 The moving body according to any one of (3) to (6), wherein
 the second part is capable of grasping an object by bending, and
 the moving body further includes a recognition unit that determines whether the second part is grasping the object based on an estimation result of the external force in the bending and straightening direction.
(8)
The moving body according to (2), wherein the rotation drive unit estimates an external force in a rotational direction applied to the second part based on a current value to an actuator provided in the rotation mechanism and a position of the actuator in the rotational direction.
(9)
 The moving body according to any one of (2) to (8), wherein
 the first part is a body part, and
 the second part is a tail.
(10)
The moving body according to (9), further comprising an action control unit that controls the bending and straightening drive unit and the rotation drive unit to express emotions of the moving body by bending and straightening and rotating the tail.
(11)
 The moving body according to (9) or (10), wherein the bending and straightening mechanism and the rotation mechanism are disposed within the body part.
(12)
 The moving body according to any one of (1) to (11), wherein the wire is connected within the second part near a tip of the second part, extends within the elastic body in a longitudinal direction of the elastic body, and is inserted into the first part.
(13)
The movable body according to (12), wherein the bending and straightening mechanism bends and straightens the elastic body by pulling and pushing the wire.
(14)
The movable body according to (13), wherein the bending and straightening mechanism includes a winding mechanism that winds and unwinds the wire.
(15)
 The moving body according to any one of (1) to (14), wherein the elastic body is capable of curving.
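 Configurations (3) to (5) above estimate the external force on the tail from the actuator's drive current and its rotational position or speed. A minimal disturbance-observer-style sketch of that idea follows; the torque constant, friction model, and all numeric values are illustrative assumptions, not parameters from the specification:

```python
def estimate_external_torque(current_a: float,
                             velocity_rad_s: float,
                             torque_constant_nm_per_a: float = 0.05,
                             viscous_friction_nm_s: float = 0.001) -> float:
    """Estimate the external torque on an actuator from current and speed.

    Motor torque is approximated as torque_constant * current; subtracting a
    simple viscous-friction model leaves the torque attributable to an
    external force. All parameter values are illustrative assumptions.
    """
    motor_torque = torque_constant_nm_per_a * current_a
    friction_torque = viscous_friction_nm_s * velocity_rad_s
    return motor_torque - friction_torque

# With no external load, motor torque roughly balances friction.
residual = estimate_external_torque(0.02, 1.0)
# A current spike at the same speed suggests an external force on the tail.
loaded = estimate_external_torque(0.4, 1.0)
```

 A contact recognition unit such as the one in (5) could then compare the estimated torque against a threshold for each of the bending-and-straightening and rotational directions.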

 Note that the effects described in this specification are merely examples and are not limiting, and other effects may be present.

 11 Autonomous mobile body, 21 Head, 22 Body, 23FL to 23HR Legs, 24 Tail, 71 Actuator, 101 Drive mechanism, 111 Elastic body, 112 Cap, 113 Connection member, 113A Shaft, 121 Wire, 122 Bearing, 123a Pitch axis actuator, 123b Roll axis actuator, 124 Winding section, 125 Gear, 201 Input section, 203 Information processing section, 204 Drive section, 205 Output section, 221 Recognition section, 223 Action planning section, 224 Motion control section, 231 Joint drive section, 232 Tail drive section, 232a Bending and straightening drive section, 232b Rotation drive section, 261 Force controller, 262 Position controller, 263 Command value synthesizer, 264 Driver, 265 Force estimator, 401 Drive mechanism, 411 Pinion, 412 Winding section

Claims (15)

 A moving body comprising:
 a first part;
 a second part that is connected to the first part and includes an elastic body that can be bent and straightened;
 a bending and straightening mechanism that bends and straightens the second part by bending and straightening the elastic body using a wire inserted into the second part in a direction in which the second part extends from the first part; and
 a rotation mechanism that rotates the second part about a rotation axis parallel to a connection direction of the second part to the first part.
 The moving body according to claim 1, further comprising:
 a bending and straightening drive unit that drives the bending and straightening mechanism; and
 a rotation drive unit that drives the rotation mechanism.
 The moving body according to claim 2, wherein the bending and straightening drive unit estimates an external force in a bending and straightening direction applied to the second part based on a current value to a first actuator included in the bending and straightening mechanism and a position or a speed of the first actuator in a rotational direction.
 The moving body according to claim 3, wherein the rotation drive unit estimates an external force in a rotational direction applied to the second part based on a current value to a second actuator included in the rotation mechanism and a position or a speed of the second actuator in the rotational direction.
 The moving body according to claim 4, further comprising a recognition unit that recognizes contact of an object with the second part based on at least one of an estimation result of the external force in the bending and straightening direction and an estimation result of the external force in the rotational direction.
 The moving body according to claim 5, wherein
 the moving body is an animal-type robot,
 the first part is a body part,
 the second part is a tail, and
 the moving body retreats while swinging the tail and searching for an object behind the moving body.
 The moving body according to claim 3, wherein
 the second part is capable of grasping an object by bending, and
 the moving body further comprises a recognition unit that determines whether the second part is grasping the object based on an estimation result of the external force in the bending and straightening direction.
 The moving body according to claim 2, wherein the rotation drive unit estimates an external force in a rotational direction applied to the second part based on a current value to an actuator included in the rotation mechanism and a position of the actuator in the rotational direction.
 The moving body according to claim 2, wherein
 the first part is a body part, and
 the second part is a tail.
 The moving body according to claim 9, further comprising a motion control unit that controls the bending and straightening drive unit and the rotation drive unit to cause the moving body to express an emotion by bending and straightening and rotating the tail.
 The moving body according to claim 9, wherein the bending and straightening mechanism and the rotation mechanism are disposed within the body part.
 The moving body according to claim 1, wherein the wire is connected within the second part near a tip of the second part, extends within the elastic body in a longitudinal direction of the elastic body, and is inserted into the first part.
 The moving body according to claim 12, wherein the bending and straightening mechanism bends and straightens the elastic body by pulling and pushing out the wire.
 The moving body according to claim 13, wherein the bending and straightening mechanism includes a winding mechanism that winds and unwinds the wire.
 The moving body according to claim 1, wherein the elastic body is capable of curving.
PCT/JP2024/001640 2023-02-07 2024-01-22 Moving body Ceased WO2024166661A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2024576213A JPWO2024166661A1 (en) 2023-02-07 2024-01-22

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2023-016813 2023-02-07
JP2023016813 2023-02-07

Publications (1)

Publication Number Publication Date
WO2024166661A1 true WO2024166661A1 (en) 2024-08-15

Family

ID=92262370

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2024/001640 Ceased WO2024166661A1 (en) 2023-02-07 2024-01-22 Moving body

Country Status (2)

Country Link
JP (1) JPWO2024166661A1 (en)
WO (1) WO2024166661A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000237987A (en) * 1999-02-19 2000-09-05 Sony Corp Bending mechanism and robot
JP2002116100A (en) * 2000-10-11 2002-04-19 Sony Corp Contact detection sensor and toy
JP2003117859A (en) * 2001-10-12 2003-04-23 Omron Corp Curving mechanism and robot equipped therewith
JP2003136460A (en) * 2001-10-31 2003-05-14 Omron Corp Mounting structure of wire rod for operation of robot
JP2008100317A (en) * 2006-10-19 2008-05-01 Toyota Industries Corp Object handling device
JP2018171701A (en) * 2017-02-28 2018-11-08 キヤノン ユーエスエイ, インコーポレイテッドCanon U.S.A., Inc Apparatus of continuum robot
WO2019087567A1 (en) * 2017-10-31 2019-05-09 ソニー株式会社 Robot device
WO2020158642A1 (en) * 2019-01-31 2020-08-06 ソニー株式会社 Robot control device, robot control method, and program


Also Published As

Publication number Publication date
JPWO2024166661A1 (en) 2024-08-15

Similar Documents

Publication Publication Date Title
JP6572943B2 (en) Robot, robot control method and program
JP7747032B2 (en) Information processing device and information processing method
JP7400923B2 (en) Information processing device and information processing method
JP2024103652A (en) ROBOT CONTROL DEVICE, ROBOT CONTROL METHOD, AND PROGRAM
JP7559900B2 (en) Information processing device, information processing method, and program
JP7238796B2 (en) ANIMAL-TYPE AUTONOMOUS MOBILE BODY, OPERATION METHOD OF ANIMAL-TYPE AUTONOMOUS MOBILE BODY, AND PROGRAM
JP2003266351A (en) Robot apparatus and operation control method for robot apparatus
JP7259843B2 (en) Information processing device, information processing method, and program
JP2022169548A (en) Information processing device, information processing method, and program
JP2019168925A (en) Robot, robot control method and program
JP7156300B2 (en) Information processing device, information processing method, and program
JP2004130427A (en) Robot apparatus and operation control method for robot apparatus
WO2024166661A1 (en) Moving body
WO2019123744A1 (en) Information processing device, information processing method, and program
JP7029521B2 (en) Force-tactile transmission system, force-tactile transmission device, force-tactile transmission method and program
JP2003266364A (en) Robot device
JP3494408B2 (en) Electronic pet
WO2024203004A1 (en) Autonomous mobile body and operation control method
JP2002120171A (en) Motion expression device and toy
JP2004298976A (en) Robot apparatus and robot apparatus recognition control method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24753107

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2024576213

Country of ref document: JP

Kind code of ref document: A

WWE Wipo information: entry into national phase

Ref document number: 2024576213

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE