
WO2017119348A1 - Robot, robot control method, and program - Google Patents


Info

Publication number
WO2017119348A1
WO2017119348A1 (application PCT/JP2016/088777)
Authority
WO
WIPO (PCT)
Prior art keywords
robot
movable
unit
units
mode
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2016/088777
Other languages
English (en)
Japanese (ja)
Inventor
貴裕 井上
暁 本村
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sharp Corp
Original Assignee
Sharp Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp filed Critical Sharp Corp
Priority to JP2017560120A priority Critical patent/JP6568601B2/ja
Publication of WO2017119348A1 publication Critical patent/WO2017119348A1/fr


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00Controls for manipulators

Definitions

  • the present invention relates to a robot having a plurality of movable parts, a robot control method, and a program.
  • the conventional robot emergency stop technology stops all the movements of the entire robot.
  • the robot completely stops its operation at the time of emergency stop, and thus cannot communicate with the user at all.
  • An object of the present invention is to provide a robot, a robot control method, and a program that can perform appropriate gestures by the movement of some movable parts for communication while ensuring safety.
  • a robot according to one aspect of the present invention includes a plurality of movable parts and one or a plurality of drive units that individually drive the plurality of movable parts.
  • The robot has one or a plurality of preset modes in which some of the drive units are permitted to transmit force to some of the movable parts for a specific communication. The robot further includes a control unit that, when in a specific mode, permits the transmission of force from the part of the drive units set for that specific mode, among all the drive units, to the corresponding part of the movable parts, and stops the transmission of force from the remaining drive units to the remaining movable parts.
  • a robot control method according to one aspect of the present invention controls a robot that includes a plurality of movable parts and one or a plurality of drive units that individually drive the plurality of movable parts.
  • The robot has one or a plurality of preset modes in which a part of the drive units, among all the drive units, is permitted to transmit force to a predetermined part of the movable parts for a specific communication.
  • The method includes a control step of, when the robot is in a specific mode, permitting the transmission of force from the part of the drive units set for that specific mode to the corresponding part of the movable parts, and stopping the transmission of force from the remaining drive units to the remaining movable parts.
  • FIG. 1 is a block diagram showing an example of the main configuration of the robot according to Embodiment 1 of the present invention.
  • FIG. 2(a) is a front view showing an example of the external appearance of the robot according to Embodiment 1 of the present invention,
  • and FIG. 2(b) is a diagram showing an example of the skeleton of the robot shown in FIG. 2(a).
  • FIG. 4(a) is an example of the communication table that prescribes, for each communication mode, which joints have their torque turned off,
  • FIG. 4(b) is a diagram showing an example of the state of the robot in which the pocket mode shown in FIG. 4(a) can be selected,
  • and FIG. 4(c) is a diagram showing an example of the state of the robot in which the bag mode shown in FIG. 4(a) can be selected.
  • FIG. 7(a) is a flowchart showing an example of the flow of the process for switching the mode of the robot according to Embodiment 2 of the present invention,
  • and FIG. 7(b) is a flowchart showing a specific example of the process of S201 shown in FIG. 7(a).
  • FIG. 8(a) is a flowchart showing a specific example of the process of S204 shown in FIG. 7(a),
  • and FIG. 8(b) is a flowchart showing a specific example of the process of S205 shown in FIG. 7(a).
  • Embodiment 1: Embodiment 1 according to the present invention will be described below with reference to the drawings.
  • a case where the shape of the robot 1 according to the present invention has an appearance like a human body (that is, a humanoid robot) will be described as an example. That is, the robot 1 performs one or a plurality of posture (pose) changes and gestures (communication operation) to the user by individually moving movable parts such as the head, arms, hands, and feet.
  • The shape of the robot 1 is not limited to a human type; it may be an animal type such as a cat type or a dog type, or it may be an insect type, a cocoon type, a centipede type, a snake type, or the like.
  • FIG. 2A is a front view showing an example of the appearance of the robot 1 according to this embodiment.
  • the robot 1 includes a head 2 (movable part), a trunk part 3, a right arm part 4 (movable part), a left arm part 5 (movable part), a right leg part 6 (movable part), and a left leg part. 7 (movable part).
  • The side corresponding to the face on the head 2 of the robot 1 and the side corresponding to the abdomen on the trunk 3 are referred to as the “front”, and the side corresponding to the back of the head 2 and the back of the trunk 3 is referred to as the “back”.
  • the head 2 is provided with a voice input unit 20 (acquisition unit), an LED (Light Emitting Diode) 22, and a speaker 23.
  • the LEDs 22 are provided around both eyes of the robot 1.
  • the voice input unit 20 and the LEDs 22 are provided in pairs on the left and right, corresponding to the ears and eyes of the robot.
  • the right arm portion 4 includes an upper right arm portion 41, a right forearm portion 42, and a right hand portion 43. From the one end (base side) to the other end (tip side) of the right arm portion 4, the upper right arm portion 41, the right forearm portion 42, and the right hand portion 43 are arranged in this order. One end of the right arm 4 is connected to a location corresponding to the right shoulder side of the trunk 3.
  • the left arm part 5 includes a left upper arm part 51, a left forearm part 52, and a left hand part 53. A left upper arm portion 51, a left forearm portion 52, and a left hand portion 53 are arranged in this order from one end (base side) to the other end (tip side) of the left arm portion 5. One end of the left arm 5 is connected to a place corresponding to the left shoulder side of the trunk 3.
  • the right leg 6 is composed of a right thigh 61 and a right foot 62.
  • One end (base side) of the right thigh 61 is connected to a place corresponding to the waist side of the trunk 3, and the right foot 62 is connected to the other end (tip side) of the right thigh 61.
  • the left leg 7 is composed of a left thigh 71 and a left foot 72.
  • One end (base side) of the left thigh 71 is connected to a place corresponding to the waist side of the trunk 3, and the left foot 72 is connected to the other end (tip side) of the left thigh 71.
  • The means for accepting input of user instructions is not limited to voice; it may be a keyboard, a touch panel, or a light-receiving unit for infrared light or the like.
  • FIG. 2B is a diagram illustrating a skeleton configuration of the robot 1 according to the present embodiment.
  • the robot 1 further includes a neck roll 11a, a neck pitch 11b, a neck yaw 11c, a right shoulder pitch 12, as a drive unit 40 (see FIG. 1),
  • a left shoulder pitch 13, a right elbow roll 14, a left elbow roll 15, a right crotch pitch 16, a left crotch pitch 17, a right ankle pitch 18b, a right ankle roll 18a, a left ankle pitch 19b, and a left ankle roll 19a are provided.
  • The neck roll 11a through the left ankle roll 19a are all servo motors in this embodiment, and may be provided at each joint as shown in FIG. 2(b).
  • The name “neck roll 11a” indicates that the servo motor can rotate the movable part in the roll direction; the same naming applies to the other members such as the neck pitch 11b.
  • By instructing each drive unit 40 from the control unit 10 (see FIG. 1) described later, control is performed such that the drive unit 40 rotates to a specified angle or its torque is turned on or off. Thereby, the robot 1 can perform operations such as changing its posture or walking. Below, drive units 40 whose angle can be adjusted are referred to in particular as joints.
  • the neck roll 11a, the neck pitch 11b, and the neck yaw 11c are arranged at a location corresponding to the neck in the robot 1.
  • the control unit 10 can control the movement of the head 2 in the robot 1 by controlling these.
  • the right shoulder pitch 12 is arranged at a position corresponding to the right shoulder in the robot 1.
  • the control unit 10 can control the movement of the entire right arm unit 4 in the robot 1 by controlling this.
  • the left shoulder pitch 13 is disposed on the left shoulder of the robot 1.
  • the control unit 10 can control the movement of the entire left arm unit 5 in the robot 1 by controlling this.
  • the right elbow roll 14 is disposed at a location corresponding to the right elbow in the robot 1.
  • the control unit 10 can control the movement of the right forearm unit 42 and the right hand unit 43 in the robot 1 by controlling this.
  • the left elbow roll 15 is disposed at a location corresponding to the left elbow in the robot 1.
  • the control unit 10 can control the movement of the left forearm unit 52 and the left hand unit 53 in the robot 1 by controlling this.
  • the right crotch pitch 16 is disposed at a location corresponding to the right crotch in the robot 1.
  • the control unit 10 can control the movement of the entire right leg 6 in the robot 1 by controlling this.
  • the left crotch pitch 17 is disposed at a location corresponding to the left crotch in the robot 1.
  • the control unit 10 can control the movement of the entire left leg 7 in the robot 1 by controlling this.
  • the right ankle pitch 18b and the right ankle roll 18a are arranged at a location corresponding to the right ankle in the robot 1.
  • the control unit 10 can control the movement of the right foot 62 in the robot 1 by controlling these.
  • the left ankle pitch 19b and the left ankle roll 19a are disposed at a location corresponding to the left ankle in the robot 1.
  • the control unit 10 can control the movement of the left foot 72 in the robot 1 by controlling these.
  • The robot 1 may be provided with one drive unit 40 for driving one movable part (for example, the right crotch pitch 16 for the right leg 6), or with a plurality of drive units 40 for driving one movable part (for example, the neck roll 11a, the neck pitch 11b, and the neck yaw 11c for the head 2).
  • Each drive unit 40 can notify the control unit 10 of a status such as an angle at a predetermined interval.
  • the status notification is performed even when the torque of the servo motor is off.
  • the robot 1 can detect the operation of the movable part by the user.
  • the control unit 10 can recognize the angle of the servo motor by receiving the status notification.
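Because each servo keeps reporting its angle even while torque is off, an externally applied movement (the user physically moving a limb) shows up as an angle change that no drive command explains. The sketch below illustrates that idea; the function name and threshold are assumptions, not from the patent.

```python
# Hypothetical sketch: detecting user manipulation of a deforced
# (torque-off) movable part from periodic angle-status notifications.
# If consecutive reported angles differ by more than a threshold while
# no drive command was issued, the motion must be external.

def detect_user_motion(angle_reports, threshold_deg=2.0):
    """Return True if any consecutive angle reports differ by more
    than threshold_deg, indicating the part was moved externally."""
    for prev, cur in zip(angle_reports, angle_reports[1:]):
        if abs(cur - prev) > threshold_deg:
            return True
    return False
```

A real controller would also subtract any commanded motion before applying the threshold; this sketch assumes the servo is idle.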
  • the robot 1 shown in FIG. 2 has been described with an example in which the trunk 3 does not have a movable part.
  • the robot 1 may be configured so that the waist portion of the trunk 3 can be rotated to the left and right.
  • a jaw-like member (not shown) may be provided in a portion corresponding to the mouth of the head 2 and the jaw-like member may be configured to be movable according to the sound from the speaker 23.
  • an eyelid-like member (not shown) may be provided over the LEDs 22 so that a blinking movement of the member is possible.
  • the trunk 3 and the head 2 can also be movable parts.
  • Each movable part may also be configured to move in a manner different from the example shown in FIG. 2.
  • the right arm part 4, the left arm part 5, the right leg part 6, and the left leg part 7 may be configured to expand and contract in addition to rotating and bending and stretching.
  • FIG. 1 is a block diagram showing the configuration of the robot 1.
  • The robot 1 includes a plurality of movable parts. For example, when the robot 1 is housed in a pocket or a bag, control can be varied between the movable parts in the portion outside the pocket or bag (part of the movable parts) and the movable parts in the portion inside (the remaining movable parts).
  • the robot 1 includes a control unit 10, a voice input unit 20, a storage unit 30, and a drive unit 40.
  • the drive unit 40 is as described above with reference to FIG.
  • the control unit 10 controls the operation and processing of the robot 1 in an integrated manner.
  • The voice input unit 20 (acquisition unit) is a device for acquiring voice input from the user for the control unit 10.
  • For example, the voice input unit 20 is a microphone.
  • the storage unit 30 is a storage medium that stores various types of information for the control unit 10 to perform processing. Specific examples of the storage unit 30 include a hard disk or a flash memory.
  • the storage unit 30 stores a communication table 31 that defines which joint of the robot 1 is turned off for each communication mode (specific mode). The communication table 31 will be described later with a specific example.
  • The storage unit 30 also stores a voice table (not shown) that the robot 1 references to interpret a voice instruction when one is input by the user.
  • The “communication mode” is a mode in which the part of the drive units 40 that is permitted to transmit force (driving force) to part of the movable parts, among the plurality of movable parts, for a specific communication is set in advance.
  • the robot 1 permits transmission of force from some of the drive units 40 set to the communication mode to some of the movable units among all the drive units 40 (with torque turned on). Then, the transmission of force from the remaining drive unit 40 to the remaining movable unit is stopped (torque is turned off). As a result, the robot 1 performs one or more gestures by driving only some of the movable parts. That is, the communication mode is a mode for performing a gesture with operation restriction.
  • The robot 1 may also have a “normal mode”, in which all drive units 40 are permitted to transmit force to the corresponding movable parts (that is, no restriction is placed on driving any of the drive units 40) and gestures are performed.
  • The robot 1 may further have an “operation stop mode”, in which all drive units 40 stop the transmission of force to the corresponding movable parts (that is, driving of all drive units 40 is restricted) and no gesture is performed by moving the movable parts.
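The three drive-permission levels described above can be summarized as a small decision function. This is a sketch with assumed mode and unit names; the patent does not prescribe an implementation.

```python
# Hypothetical sketch of the three drive-permission modes:
# "normal" (no restriction), "operation_stop" (all drive units off),
# and any communication mode, which consults a per-mode table of
# drive units whose force transmission is stopped.

def drive_units_off(mode, all_units, communication_table):
    """Return the set of drive units whose force transmission is stopped."""
    if mode == "normal":
        return set()                       # every drive unit may transmit force
    if mode == "operation_stop":
        return set(all_units)              # no drive unit may transmit force
    return set(communication_table[mode])  # per-communication-mode restriction
```

The drive units not in the returned set are the "part of the drive units" whose torque stays on in that mode.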
  • the control unit 10 includes a voice recognition unit 101, a mode control unit 102, a drive control unit 103, and a function execution unit 104.
  • the voice recognition unit 101 recognizes (interprets) the voice (a predetermined instruction from the outside) input to the voice input unit 20.
  • the voice recognition unit 101 determines whether or not the input voice is a predetermined voice included in the voice table 32 of the storage unit 30. For example, it is assumed that the voice input from the voice input unit 20 is a voice instructing the start of the communication mode.
  • the voice recognition unit 101 refers to the voice table 32 of the storage unit 30 and recognizes that the robot 1 has been instructed to start a communication mode that is a mode corresponding to a voice instruction. Based on this recognition, the voice recognition unit 101 outputs an instruction to start the communication mode to the mode control unit 102.
  • the mode control unit 102 changes the mode of the robot 1 in accordance with an instruction from the voice recognition unit 101.
  • the mode control unit 102 also outputs information indicating that the mode has been changed to the drive control unit 103 and the function execution unit 104.
  • the drive control unit 103 refers to the communication table 31 and transmits force from a part of the drive units 40 set for each communication mode changed by the mode control unit 102 to a corresponding part of the movable units. While permitting, the transmission of force from the remaining drive section 40 to the remaining movable section is stopped.
  • the function execution unit 104 executes various functions provided in the robot 1 specified for each mode changed by the mode control unit 102.
  • the robot 1 may have various functions such as a voice call function and a schedule management function (for example, functions provided in a mobile terminal such as a smartphone).
  • the robot 1 may incorporate a computer including a control unit 10 that is a CPU and a storage unit 30.
  • FIG. 3 is a flowchart illustrating an example of a process flow for switching the mode of the robot 1.
  • the voice recognition unit 101 refers to the voice table 32 and recognizes that the voice “communication mode” is a keyword for shifting to the communication mode (YES in S101).
  • The mode control unit 102, having received the instruction to start the communication mode from the voice recognition unit 101, shifts (changes) the mode of the robot 1 to the communication mode, which is the mode corresponding to the instruction (S102, mode switching step).
  • As the voice instruction for changing the mode from the normal mode to the communication mode, the voice “communication mode” described above may be used, or a name of the robot 1 registered in advance by the user may be used. Note that the robot 1 continues to operate in the normal mode until a user instruction invoking the communication mode is input.
  • The drive control unit 103 refers to the communication table 31 stored in the storage unit 30 and, according to the situation and posture of the robot 1, permits the transmission of force from the part of the drive units 40 set in advance for each communication mode to the corresponding part of the movable parts, while stopping the transmission of force from the remaining drive units 40 to the remaining movable parts. That is, control is performed such that the part of the drive units 40 set in the communication table 31 is de-energized and not driven (S103, control step). Specifically, the torque of predetermined servo motors among the servo motors constituting the joints serving as drive units is turned off.
  • In other words, the drive control unit 103 causes an appropriate gesture to be performed by operating only some of the drive units 40 according to the situation and posture of the robot 1.
  • the robot 1 permits the transmission of force from a part of the drive units 40 set to the communication mode among all the drive units 40 to a part of the movable units, and the rest. The transmission of force from the drive unit 40 to the remaining movable units is stopped. Thereby, the robot 1 can perform an appropriate gesture while ensuring safety.
  • the function execution unit 104 executes operations of various functions of the robot 1 in response to the mode control unit 102 changing the mode from the normal mode to the communication mode (S104, control process). Note that the processing order of S103 and S104 is not limited to this, and may be reversed, for example.
  • a voice “end of communication mode” issued by the user to the robot 1 is input to the voice input unit 20.
  • the voice recognition unit 101 refers to the voice table 32 and recognizes that the voice “communication mode end” is a keyword for shifting from the communication mode to the normal mode (YES in S105).
  • the mode control unit 102 that has received the instruction to end the communication mode from the voice recognition unit 101 changes the mode of the robot 1 to the normal mode (ends the communication mode) (S106).
  • The drive control unit 103 then restores the drive units 40 that were de-energized in S103 to a drivable state. Specifically, the torque of the predetermined servo motors that had been turned off, among the servo motors constituting the joints serving as drive units 40, is turned on. Note that the robot 1 continues to operate in the communication mode until a user instruction to end the communication mode is input.
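The S101–S106 flow amounts to a small keyword-driven state machine: a recognized start keyword switches to the communication mode and de-energizes the tabled drive units, and an end keyword restores the normal mode and torque. The sketch below uses assumed keyword strings and names; the actual voice table contents are not specified here.

```python
# Hypothetical sketch of the S101-S106 mode-switching flow.
# Keyword strings, class and attribute names are assumptions.

class ModeController:
    def __init__(self):
        self.mode = "normal"      # the robot starts in the normal mode
        self.deforced = set()     # drive units currently torque-off

    def on_voice(self, utterance, off_units):
        # S101/S105: the recognized utterance is matched against keywords
        if utterance == "communication mode" and self.mode == "normal":
            self.mode = "communication"     # S102: mode switching step
            self.deforced = set(off_units)  # S103: torque off per the table
        elif utterance == "end of communication mode" and self.mode == "communication":
            self.mode = "normal"            # S106: end the communication mode
            self.deforced = set()           # restore torque to all drive units
        # any other utterance leaves the current mode unchanged
        return self.mode
```

S104 (running the mode's functions) would hang off the same transitions; it is omitted here for brevity.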
  • Since the robot 1 has the communication mode, it can permit the transmission of force from the part of the drive units 40 set for the communication mode to part of the movable parts, while stopping the transmission of force from the remaining drive units 40 to the remaining movable parts.
  • When the robot 1 is housed in a pocket or a bag, the drive units 40 in the housed portion are difficult or impossible to drive. Therefore, the robot 1 stops driving only the housed portions and keeps the non-housed portions drivable. With this configuration, even when the robot 1 is stored in a pocket, for example, it can perform one or more gestures by driving only the drive units 40 in the portions that are not stored.
  • In addition, since driving of the housed portion is stopped, it is possible to prevent the pocket from being damaged, or the robot 1 from failing, as a result of the robot 1 forcibly driving the drive units 40 inside the pocket.
  • the present invention is not limited to this.
  • the robot 1 may switch the mode to the communication mode even if the robot 1 can move freely on a desk, for example.
  • Intentionally limiting operation by switching the mode to the communication mode also has advantages such as the following.
  • Because switching the mode to the communication mode reduces the number of servo motors to be controlled, the battery consumption of the robot 1 can be suppressed. Even in the normal mode, the operations the robot 1 uses to perform gestures could be limited (reduced) to save battery. However, switching to the communication mode reduces the number of servo motors to be driven further, so battery consumption can be reduced more effectively, and the robot 1 can still perform appropriate gestures within that operation restriction.
  • FIG. 4(a) is an example of a communication table that defines, for each communication mode, which joints of the robot have their torque turned off.
  • FIG. 4(b) is a diagram showing an example of the state of the robot 1 in which the pocket mode shown in FIG. 4(a) can be selected,
  • and FIG. 4(c) is a diagram showing an example of the state of the robot 1 in which the bag mode shown in FIG. 4(a) can be selected.
  • A case where the robot 1 has a plurality of communication modes such as a “pocket mode” and a “bag mode” will be described below as an example, but the present invention is not limited to this.
  • the robot 1 may have one communication mode or three or more communication modes.
  • The robot 1 may accept from the user an input of a type designation instruction designating which communication mode to use, and switch the type of communication mode according to that instruction.
  • the voice recognition unit 101 when a voice “pocket” from the user is input from the voice input unit 20, the voice recognition unit 101 refers to the voice table 32, and the voice switches the communication mode to “pocket mode” (shifts). It is recognized that the voice is instructed to do so. Based on this recognition, the voice recognition unit 101 outputs an instruction to start the pocket mode to the mode control unit 102.
  • the voice recognition unit 101 when a voice “bag” from the user is input from the voice input unit 20, the voice recognition unit 101 refers to the voice table 32 so that the voice switches the communication mode to the “bag mode”. Recognize that the voice is instructed to. Based on this recognition, the voice recognition unit 101 outputs an instruction to start the bag mode to the mode control unit 102.
  • the mode control unit 102 changes the mode of the robot 1 to the pocket mode in accordance with an instruction from the voice recognition unit 101, and the drive control unit 103 starts control of the operation corresponding to each mode with reference to the communication table 31. .
  • the function execution unit 104 executes a function corresponding to each mode.
  • When the robot 1 is stored in a pocket as shown in FIG. 4(b), forcibly driving the joints of the arms 4 and 5 and the legs 6 and 7 could make the robot 1 jump out of the pocket, cause a joint failure, or damage the pocket. Therefore, for the robot 1 in the state shown in FIG. 4(b) to perform a gesture, it may change its facial expression without moving the joints of the arms 4 and 5 and the legs 6 and 7, or move the head 2 so as to shake it vertically and horizontally. Thereby, one or a plurality of appropriate gestures can be performed.
  • Specifically, the drive control unit 103 refers to the communication table 31 shown in FIG. 4(a) and, in the pocket mode, turns off the torque of the servo motors of the robot 1 “other than the neck roll, pitch, and yaw”.
  • In other words, in the pocket mode, the drive control unit 103 permits the transmission of force from the part of the drive units (neck roll 11a, neck pitch 11b, neck yaw 11c) to the part of the movable parts exposed outside the pocket (the head 2), among the plurality of movable parts (torque on).
  • The drive control unit 103 also stops the transmission of force from the remaining drive units 40 (the right shoulder pitch 12, the left shoulder pitch 13, etc.), which drive the remaining movable parts (right arm 4, left arm 5, right leg 6, left leg 7) housed in the pocket, to those movable parts (torque off).
  • Similarly, referring to the communication table 31, the drive control unit 103 turns off the torque of the servo motors of “both crotch pitches and both ankle pitches” of the robot 1 in the bag mode.
  • In other words, in the bag mode, the drive control unit 103 permits the transmission of force to the part of the movable parts exposed outside the bag (the head 2 and the arms 4 and 5) (torque on).
  • The drive control unit 103 also stops the transmission of force from the remaining drive units 40 (right crotch pitch 16, left crotch pitch 17), which drive the remaining movable parts (right leg 6, left leg 7) housed in the bag, to those movable parts (torque off).
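The FIG. 4(a) table can be expressed directly as data. This is a sketch with assumed joint names; the patent's table lists the torque-off joints per mode (pocket mode: everything other than the neck roll/pitch/yaw; bag mode: both crotch pitches and both ankle pitches).

```python
# Hypothetical encoding of the FIG. 4(a) communication table.
# Joint names are assumptions; each mode maps to a predicate that is
# True for joints whose torque is turned OFF in that mode.

TORQUE_OFF = {
    # pocket mode: every joint other than neck roll/pitch/yaw is off
    "pocket": lambda joint: not joint.startswith("neck_"),
    # bag mode: both crotch pitches and both ankle pitches are off
    "bag": lambda joint: joint in {
        "right_crotch_pitch", "left_crotch_pitch",
        "right_ankle_pitch", "left_ankle_pitch",
    },
}

def torque_on(mode, joint):
    """True if the joint keeps torque (may be driven) in the given mode."""
    return not TORQUE_OFF[mode](joint)
```

Reading the table this way makes the pocket/bag asymmetry explicit: in the pocket mode only the head is drivable, while in the bag mode the head and both arms remain drivable.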
  • the robot 1 may not have a communication table as shown in FIG.
  • For example, the robot 1 may be configured to simply turn off the torque of some predetermined movable parts so that they do not move (are de-energized) when performing a specific gesture, or to keep torque applied only to a predetermined part.
  • For example, suppose the robot 1 has a voice call function like that of a mobile terminal such as a smartphone. If the voice call function does not involve driving the drive units 40, it can be executed in the communication mode just as in the normal mode. However, if the voice call function involves driving the drive units 40, drive units 40 that can be driven in the normal mode may be undrivable in the communication mode.
  • For example, to notify the user of an incoming call, the robot 1 may use an operation of raising the right arm 4; the drive units 40 can be operated as determined in advance for such a notification.
  • However, when the robot 1 is operating in the pocket mode as shown in FIG. 4(b), it cannot notify the incoming call by raising its hand as described above. Therefore, the robot 1 operating in the pocket mode of the communication table 31 does not perform the operation of raising the right arm 4 even if it detects an incoming call.
  • Not only for the voice call function: also for the schedule management function and for gestures used for emotional expression in communication with the user, whether each drive unit 40 can be driven in the normal mode and in the communication mode may be determined in advance.
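Gating a function's gesture by mode reduces to checking whether the joints the gesture needs are all drivable in the current mode. The sketch below uses assumed gesture and joint names to illustrate why the incoming-call arm-raise is suppressed in the pocket mode.

```python
# Hypothetical sketch: a gesture runs only if every joint it needs
# keeps torque in the current mode. Gesture/joint names are assumed.

GESTURE_JOINTS = {
    "notify_incoming_call": {"right_shoulder_pitch"},  # raise the right arm
    "nod": {"neck_pitch"},                             # shake the head vertically
}

def can_perform(gesture, mode_off_joints):
    """True if none of the gesture's joints are torque-off in this mode."""
    return GESTURE_JOINTS[gesture].isdisjoint(mode_off_joints)
```

With the pocket mode's off-set (everything but the neck), the call notification is skipped while a nod remains available.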
  • Embodiment 2: Embodiment 2 according to the present invention will be described below with reference to the drawings.
  • members having the same functions as those described in the embodiment are given the same reference numerals, and descriptions thereof are omitted.
  • FIG. 6 is a front view showing an example of the appearance of the robot 1a according to the present embodiment.
  • The robot 1a differs from the robot 1 shown in FIG. 2 in that it includes an image input unit 24 (detection unit) on the head 2a, and a brightness detection unit 50 (detection unit), an obstacle detection unit 70 (detection unit), and the like on the trunk 3a.
  • The robot 1a recognizes what state it is in by acquiring surrounding information and external stimuli via the image input unit 24, the brightness detection unit 50, the obstacle detection unit 70, and the like.
  • The robot 1a can then drive the drive units 40 by autonomously switching to a mode corresponding to that state.
  • Specifically, when a predetermined condition is satisfied, the robot 1a permits the transmission of force from the part of the drive units 40 set for the communication mode, among all the drive units 40, to part of the movable parts, while stopping the transmission of force from the remaining drive units 40 to the remaining movable parts. That is, the robot 1a autonomously shifts to the communication mode corresponding to the detection result for the surrounding situation, driving the movable parts permitted to operate while keeping the remaining movable parts stopped.
  • A specific example of the predetermined condition under which the transmission of force from some drive units 40 to some movable parts is permitted and the transmission of force from the remaining drive units 40 to the remaining movable parts is stopped will be described later.
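One plausible form of such a predetermined condition combines the detectors: low brightness plus nearby objects around particular body parts suggests the robot is enclosed. The rule and thresholds below are illustrative assumptions, not the patent's actual conditions.

```python
# Hypothetical sketch of autonomous mode selection from detector
# readings. The 50-lux threshold and the part-based rules are assumed.

def select_mode(illuminance_lux, enclosed_parts):
    """enclosed_parts: set of body parts near which the obstacle
    detection unit reports an object (e.g. {"arms", "legs"})."""
    if illuminance_lux >= 50:
        return "normal"              # bright surroundings: move freely
    if {"arms", "legs"} <= enclosed_parts:
        return "pocket"              # arms and legs enclosed, head exposed
    if "legs" in enclosed_parts:
        return "bag"                 # only the legs enclosed
    return "normal"
```

A user correction such as the "redo mode" utterance described below would simply override the selected mode.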
  • the image input unit 24 is for inputting an image, and is composed of a camera.
  • the image input unit 24 may be disposed on the front surface of the head 2a of the robot 1a.
  • The image input unit 24 may be a digital still camera, or a digital video camera capable of capturing surrounding objects and the user's movements.
  • the brightness detection unit 50 is for detecting brightness, and is composed of, for example, an illuminance sensor.
  • the obstacle detection unit 70 is for detecting an object existing around the robot 1a, and includes a proximity sensor, a contact sensor, a distance measuring sensor, and the like.
  • Although the robot 1a has been illustrated above with a configuration including one brightness detection unit 50 and one obstacle detection unit 70 in the trunk 3a, the configuration is not limited to this.
  • For example, the robot 1a may be configured to include a plurality of brightness detection units 50.
  • For example, a brightness detection unit 50 may be provided in each of the head 2a, the trunk 3a, the arms 4 and 5 (movable parts), and the legs 6 and 7 (movable parts).
  • The robot 1a may be configured to include all of the image input unit 24, the brightness detection unit 50, and the obstacle detection unit 70, but is not limited thereto.
  • The robot 1a may be configured to include at least one of the image input unit 24, the brightness detection unit 50, and the obstacle detection unit 70.
  • Although FIG. 6 shows an example in which both the brightness detection unit 50 and the obstacle detection unit 70 are provided on the front side of the trunk 3a, the present invention is not limited to this.
  • The brightness detection unit 50 and the obstacle detection unit 70 may each be arranged at an arbitrary position. That is, at least one of the brightness detection unit 50 and the obstacle detection unit 70 may be provided on the back side of the trunk 3a.
  • The robot 1a may also be configured to accept a predetermined instruction from the outside, such as a voice instruction, a button press, or an operation on a touch panel, via the head 2a, the trunk 3a, or the like.
  • For example, when the robot 1a has autonomously switched to a mode that does not suit its actual state, the user may utter a voice instruction (predetermined external instruction) such as "redo mode", or operate a button, a touch panel, or the like, so that the robot 1a switches modes again.
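The "redo mode" override described above can be sketched in a few lines. This is an illustrative sketch only; the class and method names (`ModeController`, `handle_utterance`) are assumptions for illustration and not the patent's actual implementation.

```python
class ModeController:
    """Tracks the current mode and lets a user utterance undo the last switch."""

    def __init__(self):
        self.mode = "normal"
        self.history = []

    def switch(self, mode):
        # Record the previous mode so an autonomous switch can be reverted.
        self.history.append(self.mode)
        self.mode = mode

    def handle_utterance(self, text):
        # If the user says "redo mode", revert the last autonomous switch
        # so the recognition step can run again.
        if "redo mode" in text and self.history:
            self.mode = self.history.pop()
            return True
        return False


ctrl = ModeController()
ctrl.switch("communication")        # autonomous switch away from "normal"
ctrl.handle_utterance("redo mode")  # user override reverts to "normal"
```

In a real robot the revert would be followed by re-running the state recognition, possibly with the sensors re-sampled.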
  • FIG. 5 is a block diagram illustrating a configuration example of a main part of the robot according to the present embodiment.
  • The robot 1a includes the image input unit 24, the brightness detection unit 50, and the obstacle detection unit 70, which detect stimuli from the outside world as well as the state of the robot 1a.
  • With this configuration, the robot 1a determines which drive units 40 may be driven to perform a gesture, permits the transmission of force from those drive units 40 to the corresponding movable parts, and stops the transmission of force from the remaining drive units 40 to the remaining movable parts (communication mode).
  • the control unit 10a of the robot 1a includes an environment recognition unit 205.
  • The environment recognition unit 205 acquires information indicating the state of the robot 1a and stimuli from the outside world via the image input unit 24, the brightness detection unit 50, and the obstacle detection unit 70 described above.
  • the brightness detection unit 50 may be used as an alternative to the image input unit 24, and conversely, the image input unit 24 may be used as an alternative to the brightness detection unit 50.
  • For example, when the detected brightness is low, the environment recognition unit 205 may determine that the robot 1a is accommodated in a bag, a pocket, or the like. Based on this determination, the environment recognition unit 205 may instruct the mode control unit 102 to switch to a mode in which all the drive units 40 are stopped (operation stop mode).
  • When the image acquired by the image input unit 24 shows at least a certain level of brightness and includes the user's face, the environment recognition unit 205 may determine that the robot 1a is placed beside a bag or the like. Based on this determination, the environment recognition unit 205 may instruct the mode control unit 102 to change from the normal mode to the bag mode (see FIG. 4).
  • The environment recognition unit 205 is not limited to recognizing the states of the robot 1a described above; it can recognize various states based on combinations of the information acquired through the image input unit 24, the brightness detection unit 50, and the obstacle detection unit 70.
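The combined recognition just described can be illustrated with a minimal sketch. The threshold value and the state names below are assumptions for illustration, not values from the patent.

```python
DARK_THRESHOLD = 10  # assumed brightness value for "stored in a bag/pocket"


def recognize_state(brightness, face_in_image):
    """Classify the robot's situation from combined sensor information."""
    if brightness <= DARK_THRESHOLD:
        # Dark surroundings: the robot is likely accommodated in a bag
        # or pocket, so all drive units should be stopped.
        return "operation_stop"
    if face_in_image:
        # Bright surroundings and the user's face is visible:
        # the robot is likely placed beside a bag (bag mode).
        return "bag"
    return "normal"
```

A real implementation would combine more inputs (obstacle sensors, image analysis) and feed the result to the mode control unit 102.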
  • FIG. 7A is a flowchart illustrating an example of a process flow for switching the mode of the robot 1a.
  • FIG. 7B is a flowchart illustrating an example of the processing flow of S201 illustrated in FIG. 7A.
  • FIG. 8A is a flowchart illustrating an example of the processing flow of S204 illustrated in FIG. 7A.
  • FIG. 8B is a flowchart illustrating an example of the processing flow of S205 illustrated in FIG. 7A.
  • Members identical to those already described are denoted by the same reference signs, and their description is omitted.
  • As described above, the robot 1a includes the brightness detection unit 50 that detects ambient brightness, the image input unit 24 that can detect the presence or absence of an obstacle in front of the head 2a, and the obstacle detection unit 70 that can detect the presence or absence of an obstacle in front of the trunk 3a, and it can autonomously switch modes according to the detection results.
  • When the corresponding condition is detected, the mode control unit 102 shifts the mode to the operation stop mode (S202) or the communication mode (S102).
  • In the operation stop mode, the drive control unit 103 turns off the torque of the servo motors of the drive units 40 over the whole body of the robot 1a (S203).
  • The robot 1a that has shifted to the operation stop mode may be provided with means of expression that do not use the operation of the drive units 40, such as voice utterance or lighting and blinking of the LED 22, instead of gestures.
  • the mode control unit 102 ends the current mode (operation stop mode) (S106).
  • In the communication mode, the drive control unit 103 refers to the communication table 31 as shown in FIG. 4A, for example, and turns off the torque of the servo motors of the drive units 40 other than those for the aforementioned neck roll, pitch, and yaw (S103, control process).
  • Thus, the robot 1a can still perform an appropriate gesture by driving the drive units 40 of the neck.
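This selective torque control can be sketched as follows. The `Servo` class, the table contents, and the joint names are illustrative assumptions, not the patent's actual API; they only show the pattern of consulting a per-mode table and enabling torque for the listed drive units alone.

```python
class Servo:
    """Minimal stand-in for a servo motor with a torque on/off switch."""

    def __init__(self, name):
        self.name = name
        self.torque_on = True

    def set_torque(self, on):
        self.torque_on = on


# Per-mode table of drive units allowed to move (cf. communication table 31).
COMMUNICATION_TABLE = {
    "communication": {"neck_roll", "neck_pitch", "neck_yaw"},
    "operation_stop": set(),  # no drive unit may move
}


def apply_mode(servos, mode):
    # Unknown modes (e.g. "normal") leave every drive unit enabled.
    allowed = COMMUNICATION_TABLE.get(mode, {s.name for s in servos})
    for servo in servos:
        servo.set_torque(servo.name in allowed)
```

For example, `apply_mode(servos, "communication")` keeps torque on only for the three neck servos and turns it off for the arms and legs.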
  • The function execution unit 104 executes the operations of various functions of the robot 1a in response to the mode control unit 102 changing the mode from the normal mode to the communication mode (S104, control process).
  • When a situation in which the robot should shift to another mode is detected (YES in S205), the mode control unit 102 ends the current mode (communication mode) (S106). On the other hand, when such a situation is not detected (NO in S205), the process returns to S104 and the operation in the communication mode continues.
  • The environment recognition unit 205 first acquires information indicating brightness from the brightness detection unit 50 and determines whether the brightness around the robot 1a is at or below a predetermined brightness (S210). If the surroundings of the robot 1a are darker than the predetermined brightness (YES in S210), the robot 1a can recognize that it is stored in a bag or the like and is not in a situation (state) in which it should switch to the communication mode. That is, even if the robot 1a drives the drive units 40, it is highly likely either that the user cannot see the movement because the surroundings are dark, or that the robot 1a cannot move freely because its whole body is accommodated in a bag or the like; it is therefore desirable to stop driving the drive units 40.
  • In this case, the robot 1a determines that it is difficult to perform a gesture by driving the drive units 40, and the process proceeds to YESa in S201 described above.
  • For example, the predetermined brightness may be set to the brightness measured when the whole body of the robot 1a is stored in a bag.
  • If the surroundings are brighter than the predetermined brightness (NO in S210), the environment recognition unit 205 next determines whether there is an obstacle in front of the head 2a (S211). The environment recognition unit 205 detects (determines) whether an obstacle is present in front of the head 2a based on the image acquired from the image input unit 24 (by performing image analysis or the like). If there is an obstacle in front of the head 2a (YES in S211), the process proceeds to YESa in S201 described above.
  • Otherwise, the environment recognition unit 205 detects (determines) whether there is an obstacle in front of the trunk 3a based on the information acquired from the obstacle detection unit 70 (S212). If there is an obstacle in front of the trunk 3a (YES in S212), the process proceeds to YESb in S201 described above.
  • In this case, the robot 1a is considered to be in a state in which, although the head 2a is out of the pocket, part of the trunk 3a and the legs 6 and 7 remain stored in the pocket or the like. Therefore, the robot 1a shifts to the communication mode.
  • If the brightness around the robot 1a is at or below the predetermined value (YES in S220), or if an obstacle is present in front of the head 2a (YES in S221), the mode control unit 102 determines that a situation in which the current mode (operation stop mode) of the robot 1a should end has not been detected (NO in S204). On the other hand, when the brightness around the robot 1a is higher (brighter) than the predetermined value (NO in S220) and no obstacle is present in front of the head 2a (NO in S221), the mode control unit 102 determines that such a situation has been detected (YES in S204). In other words, S204 yields YES when the conditions leading to YESa in S201 no longer hold, and NO when they still hold.
  • If the brightness around the robot 1a is at or below the predetermined value (YES in S220), the mode control unit 102 determines that a situation in which the current mode (communication mode) of the robot 1a should end has been detected (YES in S205). Alternatively, if the brightness around the robot 1a is higher (brighter) than the predetermined value (NO in S220) and no obstacle is present in front of the trunk 3a (NO in S222), the mode control unit 102 likewise determines that such a situation has been detected (YES in S205).
  • Otherwise, the mode control unit 102 determines that a situation in which the current mode (communication mode) of the robot 1a should end has not been detected (NO in S205).
  • In other words, S205 yields YES when the conditions leading to YESb in S201 no longer hold, and NO when they still hold.
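The decision flow of steps S210 to S212 can be condensed into one function. This is a sketch under assumptions: the function name, the threshold value, and the mode strings are illustrative, and a real controller would read its sensors rather than take booleans as arguments.

```python
PREDETERMINED_BRIGHTNESS = 50  # assumed threshold for "dark enough to be stored"


def select_mode(brightness, obstacle_ahead_head, obstacle_ahead_trunk):
    # S210: too dark -> the whole body is likely stored; stop all drive units.
    if brightness <= PREDETERMINED_BRIGHTNESS:
        return "operation_stop"  # corresponds to YESa in S201
    # S211: an obstacle directly in front of the head -> also stop.
    if obstacle_ahead_head:
        return "operation_stop"  # corresponds to YESa in S201
    # S212: an obstacle only in front of the trunk -> the head is free,
    # so drive the neck alone (communication mode).
    if obstacle_ahead_trunk:
        return "communication"   # corresponds to YESb in S201
    return "normal"
```

Running the same checks periodically gives the end-of-mode decisions of S204 and S205: the current mode ends as soon as `select_mode` stops returning it.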
  • the robot 1a can always autonomously select an appropriate mode by accurately recognizing the current situation based on the external stimulus and the surrounding environment.
  • the control blocks (particularly the voice recognition unit 101, the mode control unit 102, the drive control unit 103, and the function execution unit 104) of the robot 1 are realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like. Alternatively, it may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the robot 1 includes a CPU that executes the instructions of a program, which is software realizing each function; a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by a computer (or CPU); a RAM (Random Access Memory) into which the program is loaded; and the like.
  • The object of the present invention is achieved when the computer (or CPU) reads the program from the recording medium and executes it.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • The present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • A robot (1, 1a) according to aspect 1 of the present invention includes a plurality of movable parts (heads 2 and 2a, right arm part 4, left arm part 5, right leg part 6, and left leg part 7) and one or more drive units (40) that individually drive each of the movable parts, and further includes a control unit (10, 10a) that has one or more preset modes and that, when in a specific mode, permits the transmission of force to some of the movable parts from some of the drive units set for the specific mode among all the drive units, while stopping the transmission of force from the remaining drive units to the remaining movable parts.
  • the robot can stop the other movable parts while driving some of the movable parts in the specific mode.
  • For example, when the robot is partially stored in a pocket, the movable parts (leg parts, etc.) stored in the pocket can be stopped while the movable parts (head, etc.) that are not stored can be driven.
  • In this case, movable parts such as the legs that are not visible to the user stop, while movable parts visible to the user, such as the head, are driven. Therefore, the robot can perform an appropriate gesture such as nodding (affirmation) while stopping the legs or other parts that might damage the robot itself or the pocket if operated.
  • Thus, the robot according to aspect 1 can perform an appropriate gesture for communication through the movement of some of the movable parts while ensuring safety.
  • A robot according to aspect 2 of the present invention, in aspect 1, further includes an acquisition unit (speech input unit 20) that acquires a predetermined instruction from the outside, and when the instruction is acquired, the control unit shifts to the mode corresponding to the instruction, permits the transmission of force from some of the drive units to some of the movable parts, and stops the transmission of force from the remaining drive units to the remaining movable parts.
  • A robot according to aspect 3 of the present invention, in aspect 1, further includes a detection unit (image input unit 24, brightness detection unit 50, obstacle detection unit 70) that detects stimuli from the outside world to the robot, and the control unit shifts to the mode corresponding to the detection result by the detection unit, permits the transmission of force from some of the drive units to some of the movable parts, and stops the transmission of force from the remaining drive units to the remaining movable parts.
  • A robot according to aspect 4 of the present invention is the robot according to any one of aspects 1 to 3, wherein, when the robot is partially stored in a storage place, the control unit permits the transmission of force to the movable parts exposed outside the storage place from the drive units that drive them, and stops the transmission of force to the movable parts stored in the storage place from the remaining drive units that drive them.
  • According to the above configuration, in a situation where the robot is partially stored in a storage place such as a pocket, the robot can perform a gesture such as nodding (affirmation) using only the operation of the drive units for the movable parts exposed from the storage place, while stopping the movable parts (e.g., leg parts) stored in the storage place.
  • A robot according to aspect 5 of the present invention is the robot according to aspect 4, wherein each of the drive units is a servo motor, and the control unit turns on the torque of the servo motors that drive the movable parts exposed outside the storage place, and turns off the torque of the remaining servo motors that drive the movable parts stored in the storage place.
  • the robot can be realized with a simple configuration.
  • A robot control method according to an aspect of the present invention is a method of controlling a robot (1, 1a) that includes a plurality of movable parts (heads 2 and 2a, right arm part 4, left arm part 5, right leg part 6, and left leg part 7) and one or more drive units (40) for individually driving each of the movable parts, the method including a control step of, in a specific mode, permitting the transmission of force from some of the drive units to some of the movable parts predetermined for specific communication among the plurality of movable parts, while stopping the transmission of force from the remaining drive units to the remaining movable parts.
  • The robot described above may be realized by a computer. In that case, a control program that causes the computer to operate as each unit of the robot, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)
  • Toys (AREA)

Abstract

Provided is a robot that performs appropriate gestures using the movement of some of its movable parts for communication while ensuring safety. The robot (1) is provided with a control unit (10) that, during a specific mode, permits the transmission of force from the part of the drive units (40) set for that mode to the part of the movable parts, among the plurality of movable parts, intended for specific communication, and stops the transmission of force from the remaining drive units (40) to the remaining movable parts.
PCT/JP2016/088777 2016-01-07 2016-12-26 Robot, procédé de commande de robot, et programme Ceased WO2017119348A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2017560120A JP6568601B2 (ja) 2016-01-07 2016-12-26 ロボット、ロボットの制御方法、およびプログラム

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016001755 2016-01-07
JP2016-001755 2016-01-07

Publications (1)

Publication Number Publication Date
WO2017119348A1 true WO2017119348A1 (fr) 2017-07-13

Family

ID=59274563

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/088777 Ceased WO2017119348A1 (fr) 2016-01-07 2016-12-26 Robot, procédé de commande de robot, et programme

Country Status (2)

Country Link
JP (1) JP6568601B2 (fr)
WO (1) WO2017119348A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109719738A (zh) * 2017-10-30 2019-05-07 索尼公司 信息处理装置、信息处理方法以及程序

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001328091A (ja) * 2000-05-22 2001-11-27 Sony Corp バッテリ駆動の脚式移動ロボット及びその制御方法
JP2003071763A (ja) * 2001-09-04 2003-03-12 Sony Corp 脚式移動ロボット
JP2004050383A (ja) * 2002-07-24 2004-02-19 Fujitsu Ltd 移動型ロボットのための電源制御装置および方法
JP2007152446A (ja) * 2005-12-01 2007-06-21 Mitsubishi Heavy Ind Ltd ロボットシステム

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004306251A (ja) * 2003-03-23 2004-11-04 Sony Corp ロボット装置及びその制御方法
JP2009050958A (ja) * 2007-08-27 2009-03-12 Fanuc Ltd 停止監視機能を備えたロボット制御装置


Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109719738A (zh) * 2017-10-30 2019-05-07 索尼公司 信息处理装置、信息处理方法以及程序
WO2019087484A1 (fr) * 2017-10-30 2019-05-09 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, et programme
JPWO2019087484A1 (ja) * 2017-10-30 2020-11-26 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
JP7173031B2 (ja) 2017-10-30 2022-11-16 ソニーグループ株式会社 情報処理装置、情報処理方法、およびプログラム
US11709476B2 (en) 2017-10-30 2023-07-25 Sony Corporation Information processing apparatus, information processing method and program
US20230305530A1 (en) * 2017-10-30 2023-09-28 Sony Corporation Information processing apparatus, information processing method and program

Also Published As

Publication number Publication date
JP6568601B2 (ja) 2019-09-04
JPWO2017119348A1 (ja) 2018-08-30

Similar Documents

Publication Publication Date Title
JP7747032B2 (ja) 情報処理装置及び情報処理方法
JP7400923B2 (ja) 情報処理装置および情報処理方法
CA2945885C (fr) Robot humanoide dote d'une capacite de vie autonome
EP3456487A2 (fr) Robot, son procédé de commande et programme
US12204338B2 (en) Information processing apparatus, information processing method, and program
JP2015526309A (ja) 安全ロボット動作のためのシステムおよび方法
JP2003266351A (ja) ロボット装置及びロボット装置の動作制御方法
US11938625B2 (en) Information processing apparatus, information processing method, and program
Wong et al. Touch semantics for intuitive physical manipulation of humanoids
JP6568601B2 (ja) ロボット、ロボットの制御方法、およびプログラム
JP2013099800A (ja) ロボット、ロボットの制御方法及び制御プログラム
Nho et al. Emotional interaction with a mobile robot using hand gestures
JP2004114285A (ja) ロボット装置及びその行動制御方法
JP7374581B2 (ja) ロボット、画像処理方法及びプログラム
Balaji et al. Smart phone accelerometer sensor based wireless robot for physically disabled people
US20240367065A1 (en) Autonomous mobile body, information processing method, and program
Alves et al. Assisted robot navigation based on speech recognition and synthesis
KR101402908B1 (ko) 행위기반 로봇 제어장치 및 그 제어방법
JP2003071755A (ja) ロボット装置及びロボット装置の衝撃吸収方法
JP4411503B2 (ja) ロボット装置及びその制御方法
KR102672085B1 (ko) 스마트폰을 이용한 로봇
WO2025037529A1 (fr) Dispositif de commande et procédé de commande
JPWO2017104199A1 (ja) ロボット、ロボットの制御方法、およびプログラム
US20240367066A1 (en) Autonomous mobile body, information processing method, and program
JP2002205290A (ja) 脚式移動ロボットの制御装置及び制御方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16883847

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017560120

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16883847

Country of ref document: EP

Kind code of ref document: A1