GB2573790A - Robot development system - Google Patents
- Publication number
- GB2573790A (application GB1808043.2A / GB201808043A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- robot
- configurations
- virtual robot
- operable
- virtual
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1602—Programme controls characterised by the control system, structure, architecture
- B25J9/1605—Simulation of manipulator lay-out, design, modelling of manipulator
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1671—Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B62—LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
- B62D—MOTOR VEHICLES; TRAILERS
- B62D57/00—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track
- B62D57/02—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
- B62D57/032—Vehicles characterised by having other propulsion or other ground- engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid
Abstract
For identifying stable configurations of a virtual robot comprising a plurality of components, a system comprises: a control unit 610 to control movement of the virtual robot; a configuration assessment unit 620 to identify, for one or more configurations of the virtual robot generated by movement controlled by the control unit 610, whether each configuration is stable; and an information generating unit 630 to generate information describing configurations identified as being unstable. A virtual robot generating unit 600 may provide the virtual robot. The information generating unit 630 may define at least one volume around the robot, or around each component of the robot, the volume indicating one or more boundaries; when a component intersects such a boundary, it is determined that the robot is in an unstable configuration 810. The information generating unit 630 may identify alternative motions 830, 840 to avoid unstable configurations 810. The system may be used in the design of robots.
Description
ROBOT DEVELOPMENT SYSTEM
This disclosure relates to a robot development system and method.
As robots have become more complex, the range of motion that it is possible to provide to a robot has increased substantially. For example, this may be achieved by both increasing the number of movable components and the types of motion that are available. While this provides many advantages in terms of the interactivity that is provided by the robot, the variety of different motions may lead to problems.
One such problem is the increased likelihood of a broken part, which is inherent in a system with a greater number of moving parts. Another is the increased cost of production in manufacturing more complex robots - this cost is passed on to the consumer, and so purchasing robotic devices may be seen as less appealing.
However, the problem that is considered in the present disclosure is that of a decreased stability that may be associated with more complex devices. As the range of motion increases, the likelihood of the robot becoming unbalanced or unstable may be increased significantly. This may cause the robot to fall over, which can interfere with interactions as well as increase the risk of causing damage to the robot.
One method for addressing this problem is to modify the physical design of the robot so as to improve stability. For example, both providing a wider base and lowering the centre of gravity may each contribute to an increase in stability of the robot. Conversely, having a high centre of gravity over a smaller base is likely to make for a much less stable robot.
However, by making such modifications to increase the stability of the robot, the design freedom enjoyed by the designer may be greatly restricted. It is therefore desirable to avoid placing such constraints on the design of robots when addressing the problem of stability in a robot.
It is in the context of the above problems that the present invention arises.
This disclosure is defined by claims 1 and 12.
Further respective aspects and features of the disclosure are defined in the appended claims.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram showing front and rear elevations of a robot, in accordance with embodiments of the present invention;
Figure 2 is a schematic diagram showing front and rear elevations of points of articulation of a robot, in accordance with embodiments of the present invention;
Figure 3 is a schematic diagram illustrating degrees of freedom at respective points of articulation of a robot, in accordance with embodiments of the present invention;
Figure 4 is a schematic diagram of a control system for a robot, in accordance with embodiments of the present invention;
Figure 5 is a schematic diagram of an interactive robot system in accordance with embodiments of the present invention;
Figure 6 schematically illustrates a system for identifying stable configurations of a virtual robot;
Figure 7 schematically illustrates examples of bounding volumes;
Figure 8 schematically illustrates an example of determining an alternative transition between configurations to be applied; and
Figure 9 schematically illustrates a method for identifying stable configurations of a virtual robot.
A robot platform 100 for implementing embodiments of the present invention may take the form of any suitable robotic device, or simulation of a robotic device, as applicable.
The robot platform may have any suitable physical features. Hence movement, where required, may be achieved by wheels, tracks, articulated limbs, internal mass displacement or any other suitable means. Manipulation, where required, may be achieved by one or more of a mechanical hand, pincer or any other hooking or gripping system, such as a suction or electromagnetic attachment mechanism or a hook or clip, and any further optional articulation such as one or more jointed arms. Vision, where required, may be achieved by optical camera and/or infra-red camera/detector, mounted on the robot and/or located within the environment navigated by the robot. Other situational awareness systems such as ultrasound echolocation, or detection of metal tracks and/or electrically charged tracks, and proximity systems such as whiskers coupled to sensors, or pressure pads, may also be considered. Control of the robot may be provided by running suitable software instructions on a processor of the robot and/or a processor of a remote computer communicating with the robot, for example via a wireless protocol.
Figure 1 illustrates front and rear views of an exemplary legged locomotive robot platform 100. As shown, the robot includes a body, head, right and left upper limbs, and right and left lower limbs for legged movement. A control unit 80 (not shown in Figure 1) within the body provides a control system for the robot.
Each of the right and left lower limbs includes a thigh, knee joint, second thigh (calf/shin), ankle and foot. Each lower limb is coupled by a hip joint to the bottom of the trunk. Each of the right and left upper limbs includes an upper arm, elbow joint and forearm. Each upper limb is coupled by a shoulder joint to the corresponding upper edge of the trunk. Meanwhile, the head is coupled by a neck joint near to the upper end centre of the trunk.
Figure 2 illustrates front and rear views of the robot, showing its points of articulation (other than the hands).
Figure 3 then illustrates the degrees of freedom available for each point of articulation.
Referring to these Figures, a neck joint for supporting the head 1 has 3 degrees of freedom: a neck-joint yaw-axis 2, a neck-joint pitch-axis 3, and a neck-joint roll-axis 4. Meanwhile each arm has 7 degrees of freedom: a shoulder-joint pitch-axis 8, a shoulder-joint roll-axis 9, an upper-arm yaw-axis 10, an elbow-joint pitch-axis 11, a forearm yaw-axis 12, a wrist-joint pitch-axis 13, a wrist-joint roll-axis 14, and a hand 15. Typically the hand 15 also has a multi-joint, multi-degree-of-freedom structure including a plurality of fingers. However, these are omitted for simplicity of explanation. The trunk has 3 degrees of freedom: a trunk pitch-axis 5, a trunk roll-axis 6, and a trunk yaw-axis 7. Each leg constituting the lower limbs has 6 degrees of freedom: a hip-joint yaw-axis 16, a hip-joint pitch-axis 17, a hip-joint roll-axis 18, a knee-joint pitch-axis 19, an ankle-joint pitch-axis 20, an ankle-joint roll-axis 21, and a foot 22. In the exemplary robot platform, the cross point between the hip-joint pitch-axis 17 and the hip-joint roll-axis 18 defines a hip-joint location of the legged walking robot 100 according to the embodiment. Again for simplicity it is assumed that the foot itself has no degrees of freedom, but of course this is non-limiting. As a result the exemplary robot 100 has 32 (= 3 + 7×2 + 3 + 6×2) degrees of freedom in total. It will be appreciated however that this is merely exemplary, and other robot platforms may have more or fewer degrees of freedom.
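The degree-of-freedom tally above can be checked with a short sketch (group names and counts are taken from the description; the dictionary layout is merely illustrative):

```python
# Degrees of freedom per articulation group of the exemplary robot platform.
DOF = {
    "neck": 3,   # neck-joint yaw, pitch, roll
    "arm": 7,    # per arm: shoulder pitch/roll, upper-arm yaw, elbow pitch,
                 # forearm yaw, wrist pitch/roll
    "trunk": 3,  # trunk pitch, roll, yaw
    "leg": 6,    # per leg: hip yaw/pitch/roll, knee pitch, ankle pitch/roll
}

# Two arms and two legs, one neck and one trunk.
total = DOF["neck"] + 2 * DOF["arm"] + DOF["trunk"] + 2 * DOF["leg"]
print(total)  # 3 + 7*2 + 3 + 6*2 = 32
```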
Each degree of freedom of the exemplary legged locomotive robot platform 100 is implemented by using an actuator. For example, a small AC servo actuator that is directly coupled to a gear and that houses a one-chip servo-system may be used, although any suitable actuator may be considered, such as a linear servo, electroactive polymer muscle, pneumatic, piezoelectric, or the like.
It will be appreciated that any desired action that the robot platform is capable of may be implemented by control signals issued by a control system to one or more of the actuators of the robot (or to simulated actuators in a simulation, as applicable), to adjust the pose of the robot within its available degrees of freedom.
Figure 4 schematically illustrates an exemplary control system for the robot platform 100.
A control unit 80 operates to co-ordinate the overall motion / actions of the robot. The control unit 80 has a main control unit 81 including main circuit components (not shown) such as a CPU (central processing unit) and a memory, and typically a periphery circuit 82 including an interface (not shown) for sending and receiving data and/or commands to and from a power supply circuit (not shown) and each component of the robot. The control unit may comprise a communication interface and communication device for receiving data and/or commands via remote control. The control unit can be located anywhere suitable within the robot.
As shown in Figure 4, the robot has logical units 30 (head), 40 (torso), and 50R/L and 60R/L each representing the corresponding one of four human limbs. The degrees-of-freedom of the robot 100 shown in Fig. 3 are implemented by the corresponding actuator within each unit. Hence the head unit 30 has a neck-joint yaw-axis actuator A2, a neck-joint pitch-axis actuator A3, and a neck-joint roll-axis actuator A4 disposed therein for representing the neck-joint yaw-axis 2, the neck-joint pitch-axis 3, and the neck-joint roll-axis 4, respectively. Meanwhile the trunk unit 40 has a trunk pitch-axis actuator A5, a trunk roll-axis actuator A6, and a trunk yaw-axis actuator A7 disposed therein for representing the trunk pitch-axis 5, the trunk roll-axis 6, and the trunk yaw-axis 7, respectively. Similarly the arm units 50R/L are broken down into upper-arm units 51R/L, elbow-joint units 52R/L, and forearm units 53R/L. Each of the arm units 50R/L has a shoulder-joint pitch-axis actuator A8, a shoulder-joint roll-axis actuator A9, an upper-arm yaw-axis actuator A10, an elbow-joint pitch-axis actuator A11, a forearm yaw-axis actuator A12, a wrist-joint pitch-axis actuator A13, and a wrist-joint roll-axis actuator A14 disposed therein for representing the shoulder-joint pitch-axis 8, the shoulder-joint roll-axis 9, the upper-arm yaw-axis 10, the elbow-joint pitch-axis 11, the forearm yaw-axis 12, the wrist-joint pitch-axis 13, and the wrist-joint roll-axis 14, respectively. Finally the leg units 60R/L are broken down into thigh units 61R/L, knee units 62R/L, and second-thigh units 63R/L.
Each of the leg units 60R/L has a hip-joint yaw-axis actuator A16, a hip-joint pitch-axis actuator A17, a hip-joint roll-axis actuator A18, a knee-joint pitch-axis actuator A19, an ankle-joint pitch-axis actuator A20, and an ankle-joint roll-axis actuator A21 disposed therein for representing the hip-joint yaw-axis 16, the hip-joint pitch-axis 17, the hip-joint roll-axis 18, the knee-joint pitch-axis 19, the ankle-joint pitch-axis 20, and the ankle-joint roll-axis 21, respectively. Optionally the head unit 30, the trunk unit 40, the arm units 50, and the leg units 60 may have sub-controllers 35, 45, 55, and 65 for driving the corresponding actuators disposed therein.
Hence by issuing appropriate commands, the main controller (81) can control the driving of the joint actuators included in the robot 100 to implement the desired action. For example, the controller may implement a walking action by implementing successive phases, as follows:
(1) Single support phase (left leg) with the right leg off the walking surface;
(2) Double support phase with the right foot touching the walking surface;
(3) Single support phase (right leg) with the left leg off the walking surface; and
(4) Double support phase with the left foot touching the walking surface.
Each phase in turn comprises the control of a plurality of actuators, both within the relevant leg and potentially elsewhere in the robot, for example moving the opposing arm and/or attitude of the torso to maintain the centre of gravity of the robot over the supporting foot or feet.
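The four-phase cycle above can be sketched as a simple repeating sequence (the phase names and the generator are hypothetical, not from the patent; a real controller would drive the relevant actuators during each phase):

```python
# The walking cycle as the succession of support phases described above.
WALK_CYCLE = [
    "single_support_left",    # (1) right leg off the walking surface
    "double_support",         # (2) right foot touching the walking surface
    "single_support_right",   # (3) left leg off the walking surface
    "double_support",         # (4) left foot touching the walking surface
]

def walk_phases(steps):
    """Yield the support phase for each of `steps` successive steps."""
    for i in range(steps):
        yield WALK_CYCLE[i % len(WALK_CYCLE)]

print(list(walk_phases(6)))
```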
Optionally, to detect the manner and/or extent of a physical interaction with an object and/or the environment, physical sensors may be provided.
Hence in the exemplary robot, the feet 22 have grounding detection sensors 91 and 92 (e.g. a proximity sensor or microswitch) for detecting the grounding of the feet 22 mounted on legs 60R and 60L respectively, and the torso is provided with an attitude sensor 93 (e.g. an acceleration sensor and/or a gyro-sensor) for measuring the trunk attitude. Outputs of the grounding detection sensors 91 and 92 are used to determine whether each of the right and left legs is in a standing state or a swinging state during the walking action, whilst an output of the attitude sensor 93 is used to detect an inclination and an attitude of the trunk. Other sensors may also be provided, for example on a gripping component of the robot, to detect that an object is being held.
The robot may also be equipped with sensors to provide additional senses. Hence for example the robot may be equipped with one or more cameras, enabling the control unit (or a remote system to which sensor-based data is sent) to recognise a user of the robot, or a target object for retrieval. Similarly one or more microphones may be provided to enable voice control or interaction by a user. Any other suitable sensor may be provided, according to the robot’s intended purpose. For example, a security robot intended to patrol a property may include heat and smoke sensors, and GPS.
Hence more generally, a robot platform may comprise any suitable form factor and comprise those degrees of freedom necessary to perform an intended task or tasks, achieved by the use of corresponding actuators that respond to control signals from a local or remote controller that in turn operates under suitable software instruction to generate a series of control signals corresponding to a performance of the intended task(s).
In order to provide software instruction to generate such control signals, a robot software development system may be provided for developing control sequences for desired actions, and/or for developing decision making logic to enable the robot control system to respond to user commands and/or environmental features.
As part of this development system, a virtual robot (i.e. a simulation) may be used in order to simplify the process of implementing test software (for example by avoiding the need to embed test software within robot hardware that may not have simple user-serviceable parts, or to simulate an environment or action where a mistake in the software could damage a real robot). The virtual robot may be characterised by the dimensions and degrees of freedom of the robot, etc., and an interpreter or API operable to respond to control signals to adjust the state of the virtual robot accordingly.
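A minimal sketch of such a virtual robot characterisation is given below, assuming a simple joint-angle API (all class and method names are hypothetical): the model stores the robot's degrees of freedom with their limits, and an interpreter-style method adjusts the simulated state in response to control signals.

```python
# Hypothetical sketch of a virtual robot that mirrors a real robot's
# degrees of freedom and responds to control signals.
class VirtualRobot:
    def __init__(self, joint_limits):
        # joint_limits: joint name -> (min_angle, max_angle) in degrees
        self.joint_limits = joint_limits
        self.pose = {name: 0.0 for name in joint_limits}

    def apply_control(self, joint, angle):
        """Respond to a control signal by updating the simulated pose,
        clamping to the joint's physical range of motion."""
        lo, hi = self.joint_limits[joint]
        self.pose[joint] = max(lo, min(hi, angle))
        return self.pose[joint]

robot = VirtualRobot({"elbow_pitch": (-5.0, 145.0)})
robot.apply_control("elbow_pitch", 200.0)  # out-of-range command is clamped
print(robot.pose["elbow_pitch"])           # 145.0
```

Testing against such a model avoids the risk of a software mistake damaging real hardware, as noted above.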
Control software and/or scripts to use with such software may then be developed using, and to use, any suitable techniques, including rule-based / procedural methods, and/or machine learning / neural network based methods.
Referring to Figure 5, in an exemplary usage scenario a (toy) real robot crane 260 and a corresponding simulation (virtual robot crane 262) interact for entertainment purposes, for example mirroring each other’s actions or behaving in a complementary manner, and/or using sensor data from the real or virtual robot to control actions of the other. The virtual robot may be graphically embellished compared to the real robot, for example having a face, or resembling an object or creature only approximated by the real robot.
In this example, the robot platform 260 has motorised wheels 266a-d and one articulated arm with actuators 264a-c. However it will be appreciated that any suitable form factor may be chosen, such as for example the humanoid robot 100 of Figure 1, or a dog-shaped robot (not shown) or a spheroidal robot (not shown).
In Figure 5, control of both the virtual and real robots is performed by a general purpose computer (110) operating under suitable software instructions, such as the Sony® PlayStation 4®. A user can interact with the PlayStation and hence optionally indirectly interact with one or both of the real and virtual robots using any suitable interface, such as a videogame controller 143. The PlayStation can detect the state of the real robot by receiving telemetry and other status data from the robot, and/or from analysis of an image of the real robot captured by a video camera 141. Alternatively or in addition the PlayStation can assume the state of the real robot based on expected outcomes of the commands sent to it. Hence for example, the PlayStation may analyse captured images of the real robot in expected final poses to determine its position and orientation, but assume the state of the robot during intermediate states such as transitions between poses.
In the example scenario, the user provides inputs to control the real robot via the PlayStation (for example indicating an amount and direction of travel with one joystick, and a vertical and horizontal position of the arm end with another joystick). These inputs are interpreted by the PlayStation into control signals for the robot. Meanwhile the virtual simulation of the robot may also be controlled in a corresponding or complementary manner using the simulation technique described above, according to the mode of play.
Alternatively or in addition, the user may directly control the real robot via its own interface or by direct manipulation, and the state of the robot may be detected by the PlayStation (e.g. via image analysis and/or telemetry data from the robot as described previously) and used to set a corresponding state of the virtual robot.
It will be appreciated that the virtual robot may not be displayed at all, but may merely act as a proxy for the real robot within a virtual environment. Hence for example the image of the real robot may be extracted from a captured video image and embedded within a generated virtual environment in an augmented reality application, and then actions of the real robot can be made to appear to have an effect in the virtual environment by virtue of those interactions occurring with a corresponding virtual robot in the environment mirroring the state of the real robot.
Alternatively, a virtual robot may not be used at all, and the PlayStation may simply provide control and/or state analysis for the real robot. Hence for example the PlayStation may monitor the robot via the camera, and cause it to pick up a ball or other target object placed within the camera’s field of view by the user.
Hence more generally, a robot platform may interact with a general purpose computer such as the Sony ® PlayStation 4 ® to obtain a series of control signals relating to setting a state of the robot, for the purposes of control by a user and/or control by the PlayStation to achieve a predetermined task or goal. Optionally the state, task or goal may be at least in part defined within or in response to a virtual environment, and may make use of a simulation of the robot.
The present disclosure relates to a system and method for use in the design of such robots as those described above, or for use when developing scripts or programs for movements, behaviours, activities and the like for an existing robot. In particular, the present disclosure relates to identifying when motion of a robot would cause the robot to become unstable, and to either prevent this motion being performed, identify the instability to a designer (or equally a developer), or to identify alternatives to the motion that would not cause the same degree of instability.
The stability of a robot may be measured in many different manners. For example, a set of forces could be considered to be applied to the robot in a given configuration; the stability may be expressed as a success rate of maintaining the configuration once exposed to the force. Alternatively, or in addition, the stability may be a function of the amount of force required to cause the robot to lose the intended configuration (such as by falling over).
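The second measure can be sketched as follows (a minimal illustration, not from the patent text: the failure predicate stands in for whatever the simulation uses to decide that a configuration is lost, and the numbers are arbitrary):

```python
# Stability as the smallest applied force that causes the robot to lose
# the intended configuration (e.g. by falling over).
def stability_score(configuration_fails, candidate_forces):
    """Return the smallest force (in newtons) among `candidate_forces`
    for which `configuration_fails(force)` is True, or None if the
    configuration survives all of them."""
    for force in sorted(candidate_forces):
        if configuration_fails(force):
            return force
    return None

# Toy failure model: this pose topples at 12 N or more.
print(stability_score(lambda f: f >= 12.0, [5.0, 10.0, 15.0, 20.0]))  # 15.0
```

A higher score (or no failure at all) indicates a more stable configuration; the first measure, a success rate over a set of applied forces, could be computed from the same predicate.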
Figure 6 schematically illustrates a system for identifying stable configurations of a virtual robot. This system comprises a virtual robot generating unit 600, a control unit 610, a configuration assessment unit 620 and an information generating unit 630. While shown as a single device, these functions may of course be distributed between two or more devices as appropriate.
The virtual robot generating unit 600 is operable to generate the virtual robot being designed and/or tested. For example, this may comprise generating a virtual robot according to user inputs (for example, defining the robot using a user interface of the virtual robot generating unit), or using images captured of a real robot to identify components so as to allow a virtual reconstruction of a real robot, or using computer-aided design (CAD) models of the robot or other similar specifications, such as 3D printing files of the robot body. The generation of the virtual robot may include generating a visual representation for display to a user, or may simply comprise generating a model of a robot that may be used as an input for a testing process. The virtual robot will thus typically comprise a model having dimensions and degrees of freedom corresponding to the (real or intended) physical robot. Optionally, at least an approximation of the mass distribution of the robot is also included, either by calculating mass as a function of volume of various components of the robot, or by the user specifying masses at one or more points within the robot. For example a mass may be in the form of a lumped mass in the robot's torso/main body, and optionally in the head and/or articulated limbs, if it has them. It will be appreciated that the number and distribution of lumped masses will depend upon the form of the robot, and optionally upon the required fidelity of the model, which in turn may depend upon the tests being performed.
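The lumped-mass approximation can be sketched as a weighted average of point masses (the masses and positions below are illustrative, not taken from the patent):

```python
# Approximate a robot's centre of mass from lumped point masses.
def centre_of_mass(lumps):
    """lumps: list of (mass_kg, (x, y, z)) tuples; returns (x, y, z)."""
    total = sum(mass for mass, _ in lumps)
    return tuple(
        sum(mass * pos[i] for mass, pos in lumps) / total for i in range(3)
    )

lumps = [
    (6.0, (0.0, 0.0, 0.5)),   # lumped mass in the torso/main body
    (1.0, (0.0, 0.0, 0.9)),   # head
    (0.5, (0.3, 0.0, 0.5)),   # an extended arm shifts the centre sideways
]
print(centre_of_mass(lumps))
```

Adding more lumps (one per limb segment, say) raises the fidelity of the model at the cost of more bookkeeping, matching the trade-off noted above.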
In some embodiments, the virtual robot generating unit 600 may be omitted, as it is possible that a virtual model may be supplied as an input without having to be newly generated. For example, when used as a part of a quality control process a designer may provide a pre-generated virtual model of their robot for testing by a third party (such as an accreditation body that is able to certify that the robot meets particular standards).
The control unit 610 is operable to control movement of the virtual robot; this may include movement of the robot as a whole (for example, a walking motion) or the motion of one or more individual components (such as moving only an arm). This movement may be determined based upon user inputs, or the movement may be generated independently of a user. Of course, this motion is considered to be virtual when manipulating a virtual robot.
For example, a testing script to be applied to a robot that defines a plurality of different configurations may be provided. A separate script may be generated individually for that robot, or for each type of robot (for example, ‘vehicular’ or ‘humanoid’) so as to ensure that an appropriate test is applied. These scripts may include configurations representing common uses of the robot (such as walking, for a humanoid robot), or may be more thorough in that a greater number of configurations are tested.
In some examples, the testing scripts are based upon the determined range of motion of the virtual robot for one or more of the robot’s components - this may be advantageous in that every possible configuration may be tested, or a representative set of configurations may be identified that relate to the more ‘extreme’ configurations (that is, those considered most likely to cause instability). This is an example of the control unit 610 being operable to identify movements of the virtual robot that are commonly associated with unstable configurations, and to control the movement of the virtual robot to mimic these configurations.
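One way to derive the 'extreme' configurations from the determined range of motion is to test every combination of each joint at its limits, sketched below (joint names and limits are illustrative):

```python
# Generate extreme test configurations: every combination of each joint
# at its minimum or maximum limit.
from itertools import product

def extreme_configurations(joint_limits):
    """joint_limits: joint name -> (min, max).
    Yields dicts mapping each joint to one of its limit values."""
    names = sorted(joint_limits)
    limit_pairs = (joint_limits[n] for n in names)
    for combo in product(*limit_pairs):
        yield dict(zip(names, combo))

limits = {"hip_pitch": (-30, 90), "knee_pitch": (0, 130)}
configs = list(extreme_configurations(limits))
print(len(configs))  # 2 joints at 2 limits each -> 4 combinations
```

Note that the number of combinations doubles with each joint, so for many-jointed robots a representative subset would be sampled rather than the full set.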
The configuration assessment unit 620 is operable to identify, for one or more configurations of the virtual robot generated by movement controlled by the control unit 610, whether each configuration is stable. For example, after a movement is applied by the control unit 610, the configuration assessment unit 620 may be operable to identify if the virtual robot is still standing (or in another defined configuration indicative of stability). Hence for example the configuration assessment unit may calculate whether the robot’s combined centre of gravity remains within an area of support bounded by the feet or other grounded support elements of the virtual robot. If so, then the robot should remain standing. The configuration assessment unit may also consider how resilient the virtual robot would be to external forces. For instance, while a robot may still be standing it may be the case that only a small push would be required to cause it to fall over, and therefore the configuration may be regarded as unstable. The testing process may therefore comprise the application of forces to the robot in one or more different configurations. For example, a succession of virtual forces in a plurality of directions could be applied to the robot in a candidate pose, with forces successively increasing until a force value either reaches a maximum or causes the virtual robot to fall over. The resilience of the candidate pose, optionally as a function of force direction may then be calculated based on the level of force required to topple (or not topple) the robot in that pose.
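The support-area calculation mentioned above can be sketched as a 2D point-in-convex-polygon test: the configuration is treated as statically stable when the ground projection of the combined centre of gravity lies inside the region bounded by the grounded feet (a minimal sketch assuming a convex support region with vertices listed anti-clockwise; coordinates are illustrative):

```python
# Is the centre-of-gravity projection inside the support polygon?
def inside_support_polygon(point, polygon):
    """point: (x, y) ground projection of the centre of gravity.
    polygon: convex support-area vertices in anti-clockwise order."""
    px, py = point
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # For an anti-clockwise polygon, the point must be on the left of
        # (or on) every edge; a negative cross product means it is outside.
        if (x2 - x1) * (py - y1) - (y2 - y1) * (px - x1) < 0:
            return False
    return True

# Rectangular support area spanned by two grounded feet.
feet_area = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(inside_support_polygon((0.5, 0.5), feet_area))  # True: stays standing
print(inside_support_polygon((1.5, 0.5), feet_area))  # False: would topple
```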
The configuration assessment unit 620 may further be operable to identify whether the robot experiences any instability during the motion (which may be considered to be a succession of different configurations) required to move between two configurations. For example, the start and end configurations for a motion may each be entirely stable, but to transition between them the robot must temporarily be in an unstable configuration. In view of this, it may be necessary to assess the stability of numerous transitions between different pairs of the same set of configurations.
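Treating a motion as a succession of intermediate configurations suggests a simple check along the lines of the following sketch; straight-line interpolation in joint space and the `is_stable` predicate are assumptions for illustration, not details from the source:

```python
def transition_is_stable(start, end, is_stable, steps=20):
    """Check a straight-line joint-space interpolation between two
    configurations: the motion is treated as a succession of
    intermediate configurations, each of which must itself be stable.

    start/end: dicts of joint name -> angle; is_stable: pose predicate.
    """
    for i in range(steps + 1):
        t = i / steps
        pose = {j: (1 - t) * start[j] + t * end[j] for j in start}
        if not is_stable(pose):
            return False
    return True
```

Note that both endpoints may pass the predicate while an intermediate pose fails, which is exactly the situation the description warns about.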
The configuration assessment unit 620 may be operable to assess one or more of the configurations of the virtual robot a plurality of times under different environmental conditions. For example, repeated tests may be performed with differing weather/external conditions (for example, being subject to a continual force in a predetermined direction, equivalent to a constant breeze), or with a shape, orientation and/or hardness of a surface upon which the virtual robot is placed being different between tests (for example, having a ground surface similar to a carpet, which deforms as a function of weight distributed upon it and so tends to increase instability when the robot’s mass is off centre).
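Repeated assessment under different environmental conditions can be illustrated with a deliberately toy model, in which a constant 'breeze' displaces the effective centre of mass and a soft, carpet-like surface (modelled as a lower stiffness) amplifies the displacement. The model and all numbers here are invented for illustration:

```python
def stable_under(com_x, half_support_width, wind_force, stiffness):
    """Toy model: a constant lateral 'breeze' shifts the effective
    centre of mass by wind_force / stiffness; a softer surface
    (lower stiffness) amplifies the shift and so reduces stability."""
    effective_com = com_x + wind_force / stiffness
    return abs(effective_com) <= half_support_width

# The same pose assessed a plurality of times under different conditions.
conditions = {
    "calm":          {"wind_force": 0.0, "stiffness": 10.0},
    "breeze":        {"wind_force": 2.0, "stiffness": 10.0},
    "breeze+carpet": {"wind_force": 2.0, "stiffness": 4.0},
}
results = {name: stable_under(0.05, 0.3, **p) for name, p in conditions.items()}
```

A pose that is stable in calm conditions may thus be found unstable once the breeze and the deformable surface are combined, which is the kind of condition-dependent result the information generating unit can record.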
The information generating unit 630 is operable to generate information describing configurations identified as being unstable (or within a threshold amount of stability, which may be measured in any suitable manner), based upon the results generated by the configuration assessment unit 620. In some embodiments, this information comprises information identifying a set of configurations that are regarded as being unstable; alternatively, or in addition, the information generating unit 630 is operable to define a volume about the virtual robot that indicates a boundary, wherein it is determined that if a component intersects this boundary then the robot is in an unstable configuration.
The use of the bounding volume may be advantageous in that it becomes easy to categorise new motions/configurations as being unstable by simply determining whether any components move outside of the volume at any time. In some embodiments, the information generating unit 630 is operable to define a separate volume for each component of the virtual robot (or at least a number of the components). This may be useful in the case that if one component (such as an arm) moves to a certain location then the robot is stable, but if another component (such as a leg) moves to the same location then the robot would fall.
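A per-component volume check might look like the following sketch. The source does not specify a volume representation; axis-aligned boxes are used here purely for illustration, and the component names are hypothetical:

```python
class BoundingVolume:
    """Axis-aligned box standing in for a stability boundary: if a
    component's position leaves its box at any time, the configuration
    is flagged as unstable."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def contains(self, point):
        return all(l <= p <= h for l, p, h in zip(self.lo, point, self.hi))

def configuration_is_stable(component_positions, volumes):
    """Per-component check: each named component must remain inside its
    own volume; components without a dedicated volume are unchecked."""
    for name, pos in component_positions.items():
        vol = volumes.get(name)
        if vol is not None and not vol.contains(pos):
            return False
    return True
```

With separate volumes per component, the same spatial location can be permitted for the arm but forbidden for the leg, as in the example above.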
In some embodiments, the information generating unit 630 may be operable to identify groups of components that may be moved to counterbalance one another. For example, if two mirrored unstable configurations are identified (such as left arm fully extended and right arm fully extended) then it may be considered that performing both at the same time would lead to a stable configuration. In some embodiments, tests of mirrored configurations such as this may be performed intentionally so as to identify groups of components that may be moved to counterbalance one another.
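The mirrored-configuration test could be sketched as follows. The pairing of left/right joints and the rule for combining two poses (taking the larger extension of each joint) are assumptions made for this illustration:

```python
def mirror_pose(pose, axis_pairs):
    """Swap left/right joint values to obtain the mirrored configuration.

    axis_pairs: list of (left_joint, right_joint) name pairs.
    """
    mirrored = dict(pose)
    for left, right in axis_pairs:
        mirrored[left], mirrored[right] = pose[right], pose[left]
    return mirrored

def counterbalancing_pairs(unstable_poses, axis_pairs, is_stable):
    """If two unstable poses are mirror images of one another and
    performing both movements at once yields a stable pose, the paired
    components can counterbalance one another."""
    pairs = []
    for i, a in enumerate(unstable_poses):
        for b in unstable_poses[i + 1:]:
            if mirror_pose(a, axis_pairs) == b:
                combined = {j: max(a[j], b[j]) for j in a}  # both at once
                if is_stable(combined):
                    pairs.append((a, b))
    return pairs
```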
Hence, in a similar manner to the generation of bounding volumes for separate components of the virtual robot, the information generating unit 630 may be operable to define conditional volumes that are only used when certain criteria are met. This may be useful, for example, in the case that if one component (such as a leg) moves to a certain location then the robot would fall, but if another component (such as an arm) moved in an opposing direction, then the robot would remain balanced.
Hence more generally, a set of bounding volumes may be generated for each of a plurality of initial baseline/common stable poses of the robot from which variants may be derived, such as standing neutrally, standing on one leg, standing with one or both arms stretched forward, out of the side or back, crouching, or in a walking pose (such as for example one or more of the support phases described previously herein during a walking action).
Alternatively, or in addition, it may be considered that the information generation unit 630 is operable to identify groups of components based upon information about the range of motion and respective weights or the like of each component comprising the virtual robot.
One use for such information is that of enabling the information generating unit 630 to generate information relating to alternative motion so as to avoid unstable configurations. For example, target configurations may be modified so as to generate a similar, but more stable, pose. Alternatively, or in addition, alternative motion paths between configurations may be identified that would provide greater stability by avoiding any unstable configurations.
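The search for an alternative motion path could be sketched as below: try the direct interpolation first, and if any intermediate configuration is unstable, try detours through candidate waypoints. The waypoint list and the stability predicate are illustrative assumptions; a real system might instead use a motion planner:

```python
def segment_stable(a, b, is_stable, steps=10):
    """Is every intermediate configuration along a straight-line
    joint-space interpolation from a to b stable?"""
    for i in range(steps + 1):
        t = i / steps
        if not is_stable({j: (1 - t) * a[j] + t * b[j] for j in a}):
            return False
    return True

def find_stable_path(start, end, is_stable, waypoints):
    """Prefer the direct motion; otherwise return the first detour
    through a candidate waypoint whose every leg is stable."""
    if segment_stable(start, end, is_stable):
        return [start, end]
    for w in waypoints:
        if segment_stable(start, w, is_stable) and segment_stable(w, end, is_stable):
            return [start, w, end]
    return None  # no stable alternative found among the candidates
```

This mirrors the arm-raising example discussed later in the description: the direct sideways motion is rejected, and a detour (raising the arm forwards) is adopted instead.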
Figure 7 schematically illustrates examples of bounding volumes, as applied to a humanoid figure that represents a virtual robot. The bounding volumes shown in this Figure may be two- or three-dimensional, as appropriate for the degrees of freedom of the robot (e.g. the actuators that control the motion of the components of the robot).
A first bounding volume 700 is defined surrounding the whole of the robot; this is the simplest application of the bounding volume, as it has a simple shape that covers the whole of the robot. However, such a simplified volume may be much smaller than necessary as it may be the case that stable configurations exist outside of the volume 700 but are omitted to preserve the simplicity of the volume 700.
A second bounding volume 710 is defined surrounding only the top portion of the robot. This may be a more suitable volume than the volume 700, as it is only defined for the arms and so fewer stable configurations for the arms may be omitted. Such a volume 710 may be particularly useful in embodiments in which the configuration of a part of the robot does not matter - for example, with a mobile crane robot (such as that shown in Figure 5) it is the position of the arm that leads to instability rather than the position of the wheels.
A third bounding volume 720 is defined such that a single, more complex shape (or a pair of simple shapes) may be used to represent the volume in which stable configurations may be found. This approach may be useful in eliminating overlapping areas in which a position would be stable if an arm were moved there but not if a leg were (as discussed above).
A fourth bounding volume 730 is shown that defines a separate volume for each limb (analogous to a general robot component). While this may be the most complex approach, it may be the most effective.
Whilst not shown, any of these bounding volumes may similarly be used as part of a set of bounding volumes from which one is conditionally selected in response to the respective core pose of the robot, as described previously herein.
By using a more tailored approach, the size and shape of each bounding volume may be defined so as to more precisely represent the boundary between stable and unstable configurations. This is advantageous in that the number of false-negatives (that is, determinations of instability when in fact the configuration would be stable) may be reduced. The individual volumes may overlap, but this is not shown for the sake of clarity.
Figure 8 schematically illustrates an example of determining an alternative transition between configurations to be applied. In this Figure, an initial configuration 800 (arms lowered) and a final configuration 820 (one arm raised) are shown.
In order to move between the two configurations 800 and 820, initially the virtual robot is caused to move its arm between the two positions in the most direct manner possible. As is clear from the configuration 810, this causes the arm to violate the bounding volume and therefore this is regarded as an unstable configuration (for example, as the uneven weight distribution may cause the robot to lean too far to one side). It is therefore considered that this is not a suitable motion, and that an alternative transition should be used.
The configuration 830 shows the robot directing its arm in a forwards direction while raising it, so as to not cause the same sideways extension as in 810. This alternative transition is an example of an alternative that no longer violates the bounding volume. Many such modifications may be possible, as different components may be able to be moved in a number of different directions and may be able to adjust their length (such as in the case of an extendable component, or by moving components relative to one another - an example of this being the bending of the elbow to reduce shoulder-hand separation).
The configuration 840 illustrates an alternative approach; here, the violation of the bounding volume is mirrored by the robot’s other arm so as to counterbalance the motion. This is achieved by raising both arms between the configurations 800 and 840; to achieve the target configuration 820 the robot then raises/lowers the arms as appropriate. This alternative transition is an example of moving components to counterbalance one another, and shows that boundary violations may not need to be entirely avoided. It will be appreciated that alternatively or in addition, due to the possibly conditional nature of stability within a given pose of the robot as described previously herein, a different bounding volume calculated for a different core pose of the robot may be selected when the robot’s current pose falls within its boundaries. Hence as a robot implements a transition from one state to another, it may pass through one or more conditional bounding volumes that each provide a more tailored model of the robot’s stability for its current, transitional pose.
Of course, in the counterbalancing example the movement of the counterbalancing component need not be performed at the same time or to the same degree. For example, raising the other arm a small amount may provide sufficient stability without raising it all the way to the level shown in configuration 840. Similarly, the counterbalancing component may be moved in response to a detection that the instability is about to cause a fall.
Information identifying one or more such alternative transitions may be output by the information generating unit 630, and may be either automatically implemented in the robot design or movement script/program, or suggested to a user so as to guide the robot design/motion authoring process.
Figure 9 schematically illustrates a method for identifying stable configurations of a virtual robot. Such a method may be used to automatically update or otherwise guide a user’s design of a robot or scripting/programming of a robot’s movements, for example by modifying configurations or changing transitions between configurations so as to improve stability (or recommending/highlighting the need for such modifications/changes to the user).
A step 900 comprises generating a virtual robot; as noted above this step may be optional, as a virtual robot may be provided by other means.
A step 910 comprises controlling movement of the virtual robot; this is to cause the virtual robot to assume one or more different configurations. The movement of the robot may be controlled so as to repeat a number of configurations, in order to determine whether the transitions between different configurations would cause instability.
A step 920 comprises identifying, for one or more configurations of the virtual robot generated by movement controlled by the control unit, whether each configuration is stable. This may also comprise an assessment of the transitions between the different configurations; indeed, the transition itself may be regarded as a series of different configurations that represent intermediate stages between an initial configuration and a target configuration.
A step 930 comprises generating information describing configurations identified as being unstable. As discussed above, this may be a list or other set of information detailing unstable configurations, or it may take the form of a bounding volume that defines a set of stable configurations, for example. This information may also comprise environmental data that influences the stability - for example, noting that a particular configuration is only stable/unstable under particular conditions (such as on a particular surface type, or in certain weather conditions).
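Steps 900-930 can be summarised as a simple pipeline. This is a structural sketch only; the three callbacks stand in for the virtual robot generator, the movement controller, and the stability assessment, and are hypothetical:

```python
def identify_stable_configurations(generate_robot, control_movements, is_stable):
    """Pipeline sketch of the method of Figure 9.

    generate_robot:    step 900 - produce (or receive) the virtual robot.
    control_movements: step 910 - yield the configurations to assess.
    is_stable:         step 920 - assess each configuration.
    Returns the unstable configurations (the basis of step 930's report).
    """
    robot = generate_robot()                   # step 900 (optional)
    unstable = []
    for config in control_movements(robot):    # step 910
        if not is_stable(robot, config):       # step 920
            unstable.append(config)            # collected for step 930
    return unstable
```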
The techniques described above may be implemented in hardware, software or combinations of the two. In the case that a software-controlled data processing apparatus is employed to implement one or more features of the embodiments, it will be appreciated that such software, and a storage or transmission medium such as a non-transitory machine-readable storage medium by which such software is provided, are also considered as embodiments of the disclosure.
Claims (15)
1. A system for identifying stable configurations of a virtual robot comprising a plurality of components, the system comprising:
a control unit operable to control movement of the virtual robot;
a configuration assessment unit operable to identify, for one or more configurations of the virtual robot generated by movement controlled by the control unit, whether each configuration is stable; and
an information generating unit operable to generate information describing configurations identified as being unstable.
2. A system according to claim 1, comprising a virtual robot generating unit operable to generate the virtual robot.
3. A system according to claim 1, wherein the control unit is operable to control movement of the virtual robot based upon user inputs.
4. A system according to claim 1, wherein the control unit is operable to identify movements of the virtual robot that are commonly associated with unstable configurations, and to control the movement of the virtual robot to mimic these configurations.
5. A system according to claim 1, wherein the configuration assessment unit is operable to assess one or more of the configurations of the virtual robot a plurality of times under different environmental conditions.
6. A system according to claim 5, wherein the environmental conditions comprise a shape, orientation and/or hardness of a surface upon which the virtual robot is placed.
7. A system according to claim 1, wherein the information generating unit is operable to define a volume about the virtual robot that indicates a boundary, wherein it is determined that if a component intersects this boundary then the robot is in an unstable configuration.
8. A system according to claim 7, wherein the information generating unit is operable to define a separate volume for each component of the virtual robot.
9. The system according to claim 7, wherein the information generating unit is operable to define a separate volume for each of a predetermined set of initial configurations of the virtual robot.
10. A system according to claim 1, wherein the information generating unit is operable to identify groups of components that may be moved to counterbalance one another.
11. A system according to claim 1, wherein the information generating unit is operable to generate information relating to alternative motion so as to avoid unstable configurations.
12. A method for identifying stable configurations of a virtual robot, the method comprising the steps of:
controlling movement of the virtual robot;
identifying, for one or more configurations of the virtual robot generated by movement controlled by the control unit, whether each configuration is stable; and
generating information describing configurations identified as being unstable.
13. A method according to claim 12, comprising the step of defining a volume about the virtual robot that indicates a boundary, wherein it is determined that if a component intersects this boundary then the robot is in an unstable configuration.
14. Computer software which, when executed by a computer, causes the computer to carry out the method of claim 12 or 13.
15. A non-transitory machine-readable storage medium which stores computer software according to claim 14.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1808043.2A GB2573790B (en) | 2018-05-17 | 2018-05-17 | Robot development system |
Publications (3)
| Publication Number | Publication Date |
|---|---|
| GB201808043D0 GB201808043D0 (en) | 2018-07-04 |
| GB2573790A true GB2573790A (en) | 2019-11-20 |
| GB2573790B GB2573790B (en) | 2020-10-28 |
Family
ID=62723243
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB1808043.2A Active GB2573790B (en) | 2018-05-17 | 2018-05-17 | Robot development system |
Country Status (1)
| Country | Link |
|---|---|
| GB (1) | GB2573790B (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20150231786A1 (en) * | 2014-02-19 | 2015-08-20 | Toyota Jidosha Kabushiki Kaisha | Movement control method for mobile robot |
| US20160243699A1 (en) * | 2015-02-24 | 2016-08-25 | Disney Enterprises, Inc. | Method for developing and controlling a robot to have movements matching an animation character |