
WO2025049602A1 - A bipedal robot for dynamic and robust location in diverse environments - Google Patents


Info

Publication number
WO2025049602A1
Authority
WO
WIPO (PCT)
Prior art keywords
robot
assembly
actuator
leg
joint
Prior art date
Legal status
Pending
Application number
PCT/US2024/044221
Other languages
French (fr)
Inventor
Quan Nguyen
Junheng LI
Junchao Ma
Current Assignee
University of Southern California USC
Original Assignee
University of Southern California USC
Priority date
Filing date
Publication date
Application filed by University of Southern California USC filed Critical University of Southern California USC
Publication of WO2025049602A1 publication Critical patent/WO2025049602A1/en


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D57/00 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track
    • B62D57/02 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members
    • B62D57/032 Vehicles characterised by having other propulsion or other ground-engaging means than wheels or endless track, alone or in addition to wheels or endless track with ground-engaging propulsion means, e.g. walking members with alternately or sequentially lifted supporting base and legs; with alternately or sequentially lifted feet or skid

Definitions

  • Legged robots have also demonstrated the ability to jump onto an obstacle three or four times higher than their body height.
  • Commercial and sociological interests broadly motivate the study of bipedal robots.
  • the desired outcomes of bipedal robot applications range from assisting or replacing humans in various situations, including disaster response missions, warehousing, and rehabilitation. These applications require bipedal robots to be highly dynamic and capable of handling external disturbances and uneven terrain.
  • a bipedal robot with a leg design that allows highly dynamic movement.
  • a bipedal robot that concentrates multiple actuators near the thigh of the legs to allow for the use of a Model Predictive Control (MPC) based controller for efficient and fast operation of the legs of the robot.
  • One disclosed example is a bipedal robot having a torso and a first and a second leg coupled to the torso. Each leg has an abduction adduction actuator attached to the torso. Each leg has a hip actuator rotatable by the abduction adduction actuator. A thigh actuator is rotatable by the hip actuator. A thigh link assembly is rotatably coupled to the hip actuator. A knee actuator is coupled to the thigh link assembly in proximity with the thigh actuator. An ankle actuator is coupled to one end of the thigh link assembly. A knee joint is coupled to an opposite end of the thigh link assembly. A calf link assembly is rotatably coupled to the knee joint and knee actuator.
  • the example robot includes a sensor coupled to the controller, wherein the controller stores data sensed by the sensor.
  • the torso includes an enclosure with a power source and a payload compartment.
  • each of the actuators are identical motors.
  • the example robot includes an input to accept a command from an input device for the robot to traverse terrain.
  • the input device is one of a human input controller, an autonomous controller, or a semi-autonomous controller.
  • the thigh link assembly and calf link assembly include one of laser cut parts, computer numerical control (CNC) manufactured parts or 3D printed parts.
  • the example robot includes a belt and pulley coupling the knee actuator to the knee joint to rotate the calf link assembly.
  • the example robot includes a linkage assembly coupling the ankle actuator to the ankle joint.
  • the controller controls the dynamic movement of the legs based on a Model Predictive Control (MPC).
  • the MPC is based on a single rigid body dynamics model of the robot allowing linearization of the single rigid body dynamics model.
  • the MPC is applied to control the dynamic movement based on properties of the ground reaction forces and moments of inertia on the first and second legs.
  • the controller includes a multi-contact controller having an input of a contact schedule of contacts by the legs during the dynamic movement to provide stance control of the legs.
  • Another implementation is where the multi-contact controller optimizes ground reaction forces and moments of the first and second legs.
  • Another implementation is where the controller includes a Cartesian PD controller that provides swing control of the legs.
  • Another implementation is where the example robot includes a first and second arm assembly mounted to the torso, wherein the first and second arm assemblies are controlled by the controller.
  • Another implementation is where the first and second arm assembly each include a shoulder yaw actuator attached to the torso; a shoulder pitch actuator rotatable by the shoulder yaw actuator; and a shoulder roll actuator rotatable by the shoulder pitch actuator.
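The contact schedule input mentioned above can be illustrated with a short sketch. This is not code from the disclosure; it is a minimal hypothetical example assuming a fixed-frequency alternating walking gait sampled over a finite planning horizon, with all timing parameters invented for illustration.

```python
def contact_schedule(horizon, dt, gait_period=0.6, duty=0.6, phase_offset=0.5):
    """Alternating two-leg contact schedule over a planning horizon.

    Returns one (left_in_stance, right_in_stance) pair per horizon
    step. gait_period, duty, and phase_offset are illustrative values,
    not parameters from the disclosure.
    """
    schedule = []
    for k in range(horizon):
        t = (k * dt) / gait_period                  # normalized gait time
        left = (t % 1.0) < duty                     # left stance window
        right = ((t + phase_offset) % 1.0) < duty   # right offset by half period
        schedule.append((left, right))
    return schedule

# Ten steps of 60 ms each, covering one full gait cycle
sched = contact_schedule(horizon=10, dt=0.06)
```

A schedule like this would be fed to the stance controller so it knows, at each step of the horizon, which feet can exert ground reaction forces.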
  • Each of the arm assemblies have an upper arm assembly coupled to the shoulder pitch actuator.
  • An elbow actuator is coupled to one end of the upper arm assembly.
  • An elbow joint is coupled to the upper arm assembly.
  • a forearm assembly rotates around the elbow joint.
  • An elbow transmission couples the elbow joint to the elbow actuator.
  • Another disclosed example is a leg assembly for a robot.
  • the leg assembly includes a hip joint and a thigh link assembly rotatably coupled to the hip joint.
  • a knee joint is coupled to the thigh link assembly.
  • a calf link assembly is rotatably coupled to the knee joint.
  • An ankle joint is coupled to the calf link assembly.
  • a plurality of actuators are configured to control movement of the leg assembly. At least some of the actuators are positioned near the hip joint to concentrate mass of the leg assembly proximate to the hip joint.
  • a transmission system is configured to transmit force from at least one of the actuators to either the knee joint or the ankle joint of the leg assembly.
  • the plurality of actuators include an abduction adduction actuator coupled to the hip joint.
  • the abduction adduction actuator is attachable to a torso of the robot.
  • a hip actuator is coupled to the hip joint and the thigh link assembly. The hip actuator is rotatable by the abduction adduction actuator.
  • a thigh actuator is coupled to one end of the thigh link assembly and the hip actuator.
  • the thigh actuator is rotatable by the hip actuator.
  • the leg assembly includes a foot structure coupled to the ankle joint.
  • the transmission system includes a belt including one end rotated by a pulley.
  • the pulley is rotatably coupled to a knee actuator, and another end of the belt is rotatably coupled to the knee joint to transmit force to rotate the calf link assembly relative to the thigh link assembly.
  • in another implementation, the transmission system is a linkage assembly.
  • the linkage assembly transmits force between an ankle actuator and the ankle joint.
  • each of the plurality of actuators are identical motors.
  • thigh link assembly and calf link assembly include one of laser cut parts, computer numerical control (CNC) manufactured parts or 3D printed parts.
  • Another disclosed example is a control system for a robot having a limb with an upper assembly and a lower assembly, and a plurality of actuators attached to the upper assembly.
  • the control system includes a low level controller coupled to the plurality of actuators.
  • a multi-contact controller has inputs of a contact schedule of contacts by the limb during dynamic movement, reaction forces on the limb, and moments of inertia on the limb to provide stance control of the limb from a Model Predictive Control (MPC) based on a single rigid body dynamics model of the robot.
  • a Cartesian PD controller provides swing control of the limb from the MPC based on reaction forces from movement of the limb.
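A Cartesian PD swing controller of the kind described above can be sketched as follows. The control law (a virtual Cartesian force mapped to joint torques through the foot Jacobian transpose) is standard; the gains, function name, and toy Jacobian are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

def swing_pd_torque(J, p, p_des, v, v_des, kp=700.0, kd=10.0):
    """Cartesian PD swing-foot control mapped to joint torques.

    J: 3xN foot Jacobian; p, v: actual foot position/velocity;
    p_des, v_des: desired values. Gains kp, kd are illustrative.
    """
    f = kp * (p_des - p) + kd * (v_des - v)   # virtual Cartesian force
    return J.T @ f                            # joint torques: tau = J^T f

# Toy case: identity Jacobian, 1 cm position error along x
J = np.eye(3)
tau = swing_pd_torque(J,
                      p=np.array([0.0, 0.0, -0.3]),
                      p_des=np.array([0.01, 0.0, -0.3]),
                      v=np.zeros(3), v_des=np.zeros(3))
```

With kp = 700 and a 1 cm error, the virtual force (and, through the identity Jacobian, the torque) along x is 7 N.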
  • a further implementation of the example control system is where the limb is an arm, and where the upper assembly is an upper arm, and the lower assembly is a forearm rotatably coupled to the upper arm.
  • the plurality of actuators allows four degrees of movement of the arm.
  • the limb is a leg, and where the upper assembly is thigh link, and the lower assembly is a calf link rotatably coupled to the thigh link.
  • the plurality of actuators allows five degrees of movement of the leg.
  • the multi-contact controller optimizes ground reaction forces and moments of the leg.
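As an illustration of distributing ground reaction forces and moments over the stance legs, the following sketch solves the net force/moment balance in a least-squares sense. A full MPC formulation would additionally enforce friction-cone and torque limits; this minimal version, with invented foot positions and a 140 N supported weight (roughly the 14 kg robot described later), shows only the wrench-distribution idea.

```python
import numpy as np

def distribute_wrench(r_feet, F_des, M_des):
    """Distribute a desired net force/moment over stance feet.

    r_feet: list of 3-vectors from the CoM to each foot. Stacks a 3D
    force per foot into u and solves A u = b (force balance on top,
    moment balance r x f below) in a minimum-norm least-squares sense.
    """
    n = len(r_feet)
    A = np.zeros((6, 3 * n))
    for i, r in enumerate(r_feet):
        A[0:3, 3*i:3*i+3] = np.eye(3)          # sum of foot forces
        A[3:6, 3*i:3*i+3] = np.array([         # skew(r): r x f
            [0.0, -r[2], r[1]],
            [r[2], 0.0, -r[0]],
            [-r[1], r[0], 0.0]])
    b = np.concatenate([F_des, M_des])
    u, *_ = np.linalg.lstsq(A, b, rcond=None)
    return u.reshape(n, 3)

# Symmetric double stance supporting 140 N of weight
forces = distribute_wrench([np.array([0.0, 0.1, -0.5]),
                            np.array([0.0, -0.1, -0.5])],
                           F_des=np.array([0.0, 0.0, 140.0]),
                           M_des=np.zeros(3))
```

For the symmetric stance the minimum-norm solution splits the weight evenly, 70 N vertical per foot, with no internal force between the feet.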
  • FIG. 1 is a perspective view of an example bipedal robot with an example set of leg assemblies, according to an embodiment of the disclosure
  • FIG. 2A is a close-up perspective view of one of the leg assemblies of the example bipedal robot in FIG. 1, according to an embodiment of the disclosure
  • FIG.2B is an opposite side close-up perspective view of one of the leg assemblies of the example bipedal robot in FIG 1, according to an embodiment of the disclosure
  • FIG.2C is an exploded perspective view of the components of one of the leg assemblies of the example bipedal robot in FIG.1, according to an embodiment of the disclosure
  • FIG. 3A is a perspective view of one of the arms of the example robot in FIG. 1, according to an embodiment of the disclosure
  • FIG. 3B is an exploded perspective view of the components of one of the arms of the example robot in FIG.1, according to an embodiment of the disclosure
  • FIG. 4 shows a block diagram of an example MPC based robotic control architecture system for the example robot in FIG.1, according to an embodiment of the disclosure
  • FIG. 5A shows the different degrees of freedom of the limbs on the example robot in FIG.1, according to an embodiment of the disclosure
  • FIG. 5B shows a sequence of movement of the leg assembly of the robot in FIG. 1, according to an embodiment of the disclosure
  • FIG. 5C shows a sequence of striding of the leg assembly of the robot in FIG. 1, according to an embodiment of the disclosure
  • FIG.6A shows a set of stance diagrams for developing a force and moment based model of the example robot in FIG.1, according to an embodiment of the disclosure
  • FIG.6B shows a set of stance diagrams that allows developing an external force model of the example robot in FIG.1 with a payload, according to an embodiment of the disclosure
  • FIG. 7A shows a sequence of images of the example robot in FIG. 1 in lift stances, according to an embodiment of the disclosure;
  • FIG.7B is a graph showing torque of various joints of the example robot in FIG.1 in the lift stances in FIG.7A, according to an embodiment of the disclosure;
  • FIG.8A is a series of images showing the example robot in FIG.1 carrying a load while walking over wood slats, according to an embodiment of the disclosure;
  • FIG.8B is a series of images showing the example robot in FIG.1 walking with each arm carrying a load, according to an embodiment of the disclosure;
  • FIG. 8C is an image of the example robot in FIG. 1, according to an embodiment of the disclosure;
  • FIG. 9A is a series of images showing an example test robot traversing uneven terrain, according to an embodiment of the disclosure;
  • FIG. 9B is a series of images showing the example test robot traversing uneven random terrain, according to an embodiment of the disclosure;
  • FIG. 9C is an image showing the example robot traversing uneven terrain, according to an embodiment of the disclosure;
  • FIG. 10A is an image of the example robot performing dynamic turning in place, according to an embodiment of the disclosure
  • FIG.10B is a graph showing yaw rate tracking of the example robot performing dynamic turning in place, according to an embodiment of the disclosure
  • FIG. 11A is an image of the example robot standing while carrying a load, according to an embodiment of the disclosure
  • FIG. 11B is a box plot of the solve time of the example MPC based controller for different scenarios for the example robot standing while carrying a load, according to an embodiment of the disclosure
  • FIG. 12A is a graph of velocity tracking and height of the example robot while walking under the example MPC based control, according to an embodiment of the disclosure
  • FIG. 12B is a graph showing torque of various joints of the example robot in FIG. 1 while walking under the example MPC based control, according to an embodiment of the disclosure;
  • FIG.13A is a set of graphs showing foot force and moments for the example robot while walking under the example MPC based control, according to an embodiment of the disclosure;
  • FIG.13B is a set of graph showing leg torques of the example robot while walking under the example MPC based control, according to an embodiment of the disclosure;
  • FIG. 14 shows snapshots of a double-leg balancing experiment of the example robot, according to an embodiment of the disclosure;
  • FIG. 15 shows snapshots of a stepping in place experiment of the example robot, according to an embodiment of the disclosure; and
  • FIG. 16 shows snapshots of an experiment of the example robot traversing uneven terrain, according to an embodiment of the disclosure.
DETAILED DESCRIPTION
  • Unless defined otherwise, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. One skilled in the art will recognize many methods and materials similar or equivalent to those described herein, which could be used in the practice of the present invention.
  • the example bipedal robot is a small-scale and power-dense unit that has five degree-of-freedom legs and corresponding actuators.
  • the integration of advanced sensors and a vision camera enables precise state estimation for stable locomotion ability.
  • a robust force-and-moment-based Model Predictive Control (MPC) system is employed on the example robot and allows for real-time high-level planning and control, facilitating various dynamic gaits.
  • Current results show the capabilities of the example bipedal robot in traversing different terrain dynamically, robustness to disturbances, and outstanding balancing performance.
  • the example bipedal robot may be used for various applications, including hazardous environments, emergency responses, and industrial tasks, while offering a cost-effective alternative for legged locomotion research.
  • Each leg assembly of the example robot has five degrees of freedom. These degrees of freedom include actuated abduction/adduction (ab/ad), hip, thigh, knee, and calf joints.
  • the leg design is composed of both aluminum and carbon steel parts for the balance between overall weight and transmission rigidity. To minimize the leg mass and concentrate body mass to the hip area, all the actuators are strategically positioned towards the upper thigh and hip regions. To actuate the knees and ankles, bar linkages and belt-pulley transmissions are integrated into the leg assembly.
  • the overall leg design is modular and the robot leg may be replaced with a wheel-leg or other types of legs with slight design modifications.
  • the example leg assembly allows the example bipedal robot to have very power-dense legs while having reasonably low weight.
  • the example leg assembly facilitates the use of an example MPC based control system.
  • An example of the bipedal robot is 0.6 m in standing height and weighs 14 kg.
  • the example bipedal robot is modified from components from a Unitree A1 quadruped robot with A1 torque-actuated joint motors. Different manufacturing techniques such as CNC machining, laser cutting, or 3D printing may be used, as well as different materials such as Aluminum 6061, Aluminum 7075, ABS plastic, carbon fiber infused PETG, Nylon plastics, and the like, to maximize the rigidity of the robot while simultaneously minimizing its weight.
  • the example robot is designed around the use of laser cutting and commercially available parts to decrease production costs, with an emphasis on maintaining ease of servicing.
  • the cost of producing such bipedal hardware is comparable to the production cost of a small quadruped robot, making it affordable for individual hobbyists and research groups.
  • the example robot can be equipped with different visual sensors including a camera such as an Intel T265 camera, enabling precise state estimation and vision data.
  • the example robot is also capable of the integration of various sensors and devices for different application purposes.
  • the two-legged robot 100 includes a torso in the form of a trunk enclosure 110, two five degrees of freedom (DoF) leg assemblies 120 and 122, and two arm assemblies 124 and 126.
  • the top of the trunk enclosure 110 includes a sensor module 128.
  • the leg assemblies 120 and 122 are attached to the bottom end of the trunk enclosure 110.
  • the arm assemblies 124 and 126 are attached to the top of the trunk enclosure 110.
  • the trunk enclosure 110 encloses components such as a power supply, a control system, a transceiver, payload and sensor support components.
  • the sensor module 128 includes sensors such as a camera to provide vision data and thus enable precise state estimation of the robot 100.
  • FIG.2A shows a close-up side perspective view of the leg assembly 120.
  • the other leg assembly 122 is identical to the leg assembly 120.
  • FIG. 2B shows a close-up reverse side perspective view of the leg assembly 120.
  • FIG. 2C shows an exploded view of the components of the leg assembly 120.
  • Each of the leg assemblies 120 and 122 includes a hip joint 210, a hip connecting link 212, a thigh link assembly 214, a knee joint 216, a calf link assembly 218, and an ankle joint 220, with a foot structure 222.
  • the leg components are powered by different actuators, which may be identical electric motors.
  • five actuators provide five degrees of freedom.
  • a set of the actuators are positioned near the hip joint 210 to concentrate mass of the leg assembly 120 proximate the hip joint 210.
  • the actuators are controlled by a robot control system that may be housed in the trunk enclosure 110.
  • Each of the leg assemblies such as the leg assembly 120 includes an abduction/adduction (ab/ad) actuator 230, a hip actuator 232, a thigh actuator 234, an ankle actuator 236, and a knee actuator 238.
  • the example bipedal robot 100 has ten actuators for the two leg assemblies 120 and 122.
  • the ab/ad actuator 230 is mounted on the trunk enclosure 110 of the example robot 100 in FIG.1.
  • the output shaft of the ab/ad actuator 230 forms the hip joint 210.
  • the output shaft of the ab/ad actuator 230 is opposite to the end of the actuator 230 attached to the bottom of the trunk enclosure 110.
  • the output shaft of the ab/ad actuator 230 is connected to the hip connecting link 212 to rotate the hip connecting link 212.
  • the hip connecting link 212 includes an ab/ad plate 240 that is perpendicular to a lateral plate 242.
  • the output shaft of the ab/ad actuator 230 engages a hole 244 in the ab/ad plate 240 to allow the hip connecting link 212 to be rotated by the ab/ad actuator 230.
  • the hip actuator 232 is mounted on the lateral plate 242 of the hip connecting link 212.
  • the output shaft of the hip actuator 232 extends through a hole 246 in the lateral plate 242 and is connected to a thigh connecting plate 248 of the thigh assembly 214.
  • the thigh actuator 234 is mounted on a support ring 250 connected perpendicularly to the thigh connecting plate 248.
  • the output shaft of the thigh actuator 234 is connected to an outer thigh frame 252 of the thigh assembly 214.
  • the outer frame 252 is joined to a parallel inner thigh frame 254 by a series of spacing pins 256.
  • the lower ends of the thigh frames 252 and 254 hold a pin that forms the knee joint 216.
  • the thigh frames 252 and 254 are fabricated from laser cut aluminum and joined by standoffs.
  • the calf link assembly 218 and foot structure 222 are each assembled using a method similar to that of the thigh structure, using laser cut plates joined by standoffs.
  • the thigh assembly 214 can be rotated independently of the thigh connecting plate 248 by the hip actuator 232.
  • the ankle actuator 236 and the knee actuator 238 are mounted on the exterior surface of the inner thigh frame 254.
  • the inner thigh frame 254 has two through holes 258 and 260 to allow the respective output shafts of the ankle actuator 236 and knee actuator 238 to extend through to the interior surface of the inner thigh frame 254.
  • the first transmission mechanism for the knee actuator 238 is a timing belt 262 and a knee joint pulley 264.
  • the pulley 264 is mounted on the pin between the distal ends of the thigh frames 252 and 254 that forms the knee joint 216.
  • a knee actuator pulley 266 is mounted on the output shaft of the knee actuator 238.
  • the knee joint pulley 264 is rigidly connected to the top side of the calf link assembly 218.
  • the timing belt 262 is looped between the two pulleys 266 and 264 to allow power transmission from the knee actuator 238 to the calf link assembly 218 to rotate around the knee joint 216.
  • a tensioner 268 is mounted on a pin held between the thigh frames 252 and 254 that pushes the timing belt 262 inward, providing necessary tension.
  • the calf link assembly 218 includes an interior calf frame structure 270 and an exterior calf frame structure 272. One end of each of the frame structures 270 and 272 is attached to the knee joint pulley 264 and thus rotate with the knee joint pulley 264. The opposite ends of the frame structures 270 and 272 hold a pin that forms the ankle joint 220.
  • a second transmission mechanism is an ankle linkage assembly 280.
  • the ankle linkage assembly 280 includes an ankle link connector 282 that is coupled to the output shaft of the ankle actuator 236. Thus, the ankle actuator 236 rotates the ankle link connector 282.
  • One end of the ankle link connector 282 is rotatably connected to one end of an ankle driving link 284.
  • the other end of the ankle driving link 284 is rotatably coupled to one vertex of a coupler link 286.
  • the coupler link 286 includes two triangular plates 288 and 290 that are rotatably mounted on the knee joint 216 on another vertex to allow rotation around the knee joint 216.
  • a third vertex of the triangular plates 288 and 290 are rotatably coupled to one end of an ankle joint follower link 292.
  • the opposite end of ankle joint follower link 292 is rotatably coupled to the foot structure 222.
  • the ankle actuator 236 allows the foot structure 222 to rotate around the ankle joint 220 via the ankle linkage assembly 280.
  • the ankle linkage assembly 280 transmits the same output of the ankle actuator 236 to the ankle joint 220, forming a two-stage four-bar-linkage transmission system.
  • the example hybrid transmission mechanism with the timing belt 262 and pulley 264 assembly, and the ankle linkage assembly 280 harnesses the strengths of both linkage systems and timing belts, achieving transmission efficiency, gear reduction, and compactness.
  • the design integrates a two-stage linkage system forming two parallelograms, transmitting power from the ankle actuator 236 to the ankle joint 220 at a gear ratio of 1:1.
  • simulation results show an increased knee torque requirement to ensure dynamic capabilities.
  • the timing belt and pulley system both transmits and amplifies torque from the knee actuator 238 to the knee joint 216 while ensuring necessary joint speed requirements for dynamic motions.
  • the MPC based control system explained below in FIG. 4 allows the robot 100 to traverse uneven terrain with large obstacles.
  • the control system is designed for control of limbs each having an upper assembly and a lower assembly, where actuators for the limb are attached to upper assembly to concentrate mass of the limb proximate the joint attaching the upper assembly to the body of the robot.
  • one limb may be the leg assemblies 120 and 122 where the upper assembly is the thigh link assembly 214 that is rotatably coupled to the hip joint 210.
  • the calf link assembly 218, which is the lower assembly, may be rotated on the knee joint 216.
  • An opposite end of the calf link assembly 218 supports the ankle joint 220 and the attached foot structure 222.
  • the thigh link assembly 214 has an end that is attached to the thigh connecting plate 248 that is rotated by the hip actuator 232.
  • the ankle actuator 236 and thigh actuator 234 are on opposite sides of the end of the thigh link assembly 214.
  • the ab/ad actuator 230 rotates a plate holding the hip actuator 232 and one end of the thigh link assembly 214.
  • the hip actuator 232 rotates the thigh link assembly 214 relative to the plate on a first axis.
  • the knee actuator 238 is located on the thigh link assembly 214 next to the thigh actuator 234.
  • the thigh actuator 234 rotates the thigh link assembly 214 relative to the plate along a second axis.
  • the knee actuator 238 rotates the calf link assembly 218 around the knee joint 216 via the pulley and timing belt.
  • the ankle actuator 236 rotates the ankle joint 220 via the linkage assembly 280 that runs along the calf link assembly 218.
  • the power transmission system of the leg assemblies 120 and 122 of the example robot 100 locates the actuators of each leg assembly close to the trunk enclosure 110 to minimize inertia and concentrate mass at the hip area, consistent with the rigid body dynamics assumption in the example model predictive control (MPC) based controller.
  • the power transmission system also includes the timing belts and pulleys utilized for power transmission.
  • This leg assembly design provides the robot 100 with a stable and efficient power source for its movements.
  • the example bipedal robot 100 is designed to be a versatile and adaptable machine, capable of performing a wide variety of tasks in various environments, while maintaining stability, maneuverability, and energy efficiency. Laser cutting the parts of the leg assemblies of the robot 100 results in a generally lightweight robot, which reduces unwanted dynamic effects of the legs in the system and lowers motor torque requirements during balancing control and navigating high obstacles.
  • the actuators are positioned close to the torso of the example robot to minimize leg weight and inertia, and improve dynamic capabilities.
  • the timing belt based transmission system for the calf assembly enables gear reduction and improves joint torque at the knee joint, which is the most torque demanding joint.
  • because ankle joint torque requirements are low, a linkage is more space efficient than a timing belt while still being able to transmit power.
  • the hybrid transmission setup ensures sufficient torques for each joint for highly dynamic motions while keeping the leg size and weight small.
  • the gear ratio of the example knee transmission is 1.54, amplifying the torque at the knee joint 216 if the robot doesn’t have the optional arm assemblies 124 and 126.
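The effect of the 1.54:1 knee belt ratio can be checked with simple arithmetic. An ideal (lossless) belt drive multiplies torque and divides speed by the gear ratio; the 33.5 N·m actuator peak used below is the figure commonly quoted for the Unitree A1 joint motor and is an assumption here, not a value stated in this disclosure.

```python
def joint_torque(actuator_torque_nm, gear_ratio):
    """Torque at the joint after an ideal belt-pulley reduction."""
    return actuator_torque_nm * gear_ratio

def joint_speed(actuator_speed_rad_s, gear_ratio):
    """Joint speed is divided by the same ratio that amplifies torque."""
    return actuator_speed_rad_s / gear_ratio

# With the 1.54:1 knee ratio, a hypothetical 33.5 N*m actuator peak
# becomes about 51.6 N*m at the knee joint, at 1/1.54 of the speed.
knee_peak = joint_torque(33.5, 1.54)
```

This is the torque/speed trade that lets the belt stage meet the knee's higher torque demand while the 1:1 ankle linkage simply relays the actuator output.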
  • FIG.3A shows a perspective view of the arm assembly 124 in FIG.1.
  • FIG.3B shows an exploded view of the arm assembly 124.
  • the arm assembly 126 in FIG.1 is identical to the arm assembly 124.
  • the example arm assembly 124 has four degrees of freedom.
  • the arm assembly 124 uses the same design philosophy as the leg assembly 120. Thus, all of the actuators of the arm assembly 124 are mounted close to the torso of the robot 100 to minimize the weight on the arm link.
  • the example controller may also control limbs such as the arm assemblies 124 and 126 having an upper assembly and a lower assembly where the actuators are positioned on the upper assembly to concentrate mass of the arm assembly near the torso of the robot.
  • the arm assembly 124 constitutes another limb that includes a shoulder joint 310, an upper arm assembly 312 (upper assembly), an elbow joint 314, and a forearm link assembly 316 (lower assembly).
  • the arm assembly 124 includes a shoulder yaw actuator 320, a shoulder pitch actuator 322, a shoulder roll actuator 324, and an elbow actuator 326.
  • the shoulder yaw actuator 320 is attached to the top of the trunk enclosure 110 of the robot. As shown in FIGs. 3A-3B, the output shaft of the shoulder yaw actuator 320 is connected to a horizontal shoulder plate 330 that includes a mounting cradle 332.
  • the shoulder pitch actuator 322 is mounted on the mounting cradle 332 of the horizontal shoulder plate 330.
  • the output shaft of the shoulder pitch actuator 322 is then connected to a second shoulder plate 334.
  • the shoulder plate 334 includes a mounting cradle 336.
  • the shoulder roll actuator 324 is mounted on the mounting cradle 336.
  • the whole upper arm assembly 312 is then mounted on the output shaft of the shoulder roll actuator 324.
  • the upper arm assembly 312 includes inner and outer arm plates 340 and 342. Inside of the upper arm assembly 312, there are mounting points for the elbow actuator 326 to be rigidly connected to the upper arm assembly 312.
  • the forearm link assembly 316 includes an inner support 344 and an outer support 346. One end of both supports 344 and 346 are rotatably coupled to the elbow joint 314. The opposite end of the supports 344 and 346 are attached to an end effector 348.
  • An elbow actuator pulley 350 is connected to the output shaft of the elbow actuator 326.
  • the elbow actuator pulley 350 is mounted on the upper section of the forearm link assembly 316.
  • the elbow actuator pulley 350 drives one end of an elbow timing belt 352.
  • the other end of the elbow timing belt 352 drives an elbow joint pulley 354 that is mounted on the elbow joint 314.
  • the elbow pulley 354 is attached to the forearm link assembly 316.
  • a pair of tensioner pins 356 create tension in the elbow timing belt 352.
  • the elbow pulley belt transmission system of the pulleys 350 and 354 and the timing belt 352 is similar to that of the leg assembly 120 and thus transmits the torque from the elbow actuator 326 to the elbow joint 314 with a gear ratio of 1.45, amplifying the torque.
  • the parts in the leg assemblies and arm assemblies are fabricated from laser-cut aluminum 7075-T6. This aluminum is particularly attractive because it offers a high strength-to-weight ratio, rapid manufacturing, and low production cost.
  • the example bipedal robot 100 uses a force-and-moment-based model predictive control (MPC) for real-time high-level planning and control.
  • the MPC assumes the dynamics of the example biped robot to be rigid body dynamics since most of the mass is concentrated around the hips. This allows the dynamics model of the example robot to be linearized and enables force-based control inputs in the MPC.
  • the swing leg is controlled by a low-level Cartesian PD (proportional-derivative) controller and its capture points are based on heuristic policies.
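A Cartesian-space PD law of the kind described for the swing leg can be sketched as follows; the gains and foot positions are illustrative stand-ins, not values from the patent.

```python
import numpy as np

def swing_foot_pd(p, v, p_des, v_des, kp=500.0, kd=10.0):
    """Cartesian-space PD law: command a foot force that drives the
    swing foot toward the desired position and velocity."""
    p, v, p_des, v_des = map(np.asarray, (p, v, p_des, v_des))
    return kp * (p_des - p) + kd * (v_des - v)

# At the desired position and velocity the commanded force vanishes.
f = swing_foot_pd([0.1, 0.0, -0.4], [0.0, 0.0, 0.0],
                  [0.1, 0.0, -0.4], [0.0, 0.0, 0.0])
```

The same structure, multiplied through the leg Jacobian transpose, yields the joint torques used by the low-level control.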
  • With the design of the leg assemblies 120 and 122 concentrating the location of the actuators, the CoM of each leg stays close to the upper thigh and hip area. Compared to traditional humanoid robot legs, where the CoM lies near the lower thigh and knee area, the example leg assembly design induces much less change in the entire body moment of inertia and CoM when performing fast legged motions. Quantitatively speaking, the calf/lower leg assembly of the example robot 100 accounts for only 5% of the mass of the entire leg assembly, while a human's lower legs account for roughly 30% of the mass of the entire leg limbs.
  • the actuators 434 represent the actuators in the arm and leg assemblies in FIG. 1.
  • An Ethernet bus 428 allows communication between the low level control board 430 and the control module 414.
  • the low level control board 430 is coupled to the sensors 432 via a serial line 440.
  • the low level control board 430 is coupled to the actuators 434 via an RS485 bus and a controller area network (CAN) bus 442.
  • the robot states command x_cmd from the remote controller 410 includes the desired velocities in the transverse plane and the desired body yaw rate, which are then mapped to a reference trajectory x_ref ∈ R^(12×h) with h prediction horizons at each MPC time step for the on-board controller 426 to track.
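The command-to-reference mapping can be sketched by integrating the commanded velocities forward over the prediction horizon; the 12-state layout and the specific indices below are assumptions for illustration, not the patent's exact ordering.

```python
import numpy as np

def build_reference(x0, vx_cmd, vy_cmd, yaw_rate_cmd, z_des, h, dt):
    """Integrate commanded velocities forward to form a reference
    trajectory, one 12-state column per prediction step.
    Assumed state layout: [roll, pitch, yaw, x, y, z,
                           wx, wy, wz, vx, vy, vz]."""
    x_ref = np.zeros((12, h))
    yaw0, px0, py0 = x0[2], x0[3], x0[4]
    for k in range(h):
        t = (k + 1) * dt
        x_ref[2, k] = yaw0 + yaw_rate_cmd * t   # yaw
        x_ref[3, k] = px0 + vx_cmd * t          # x position
        x_ref[4, k] = py0 + vy_cmd * t          # y position
        x_ref[5, k] = z_des                     # body height
        x_ref[8, k] = yaw_rate_cmd              # yaw rate
        x_ref[9, k] = vx_cmd                    # x velocity
        x_ref[10, k] = vy_cmd                   # y velocity
    return x_ref
```

At each MPC step the controller tracks this rolling reference rather than a fixed goal pose.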
  • the contact schedule 412 summarizes time-based periodic contact sequences based on the desired gait period length (typically in the range of 0.3 to 0.7 seconds for the example robot 100) and the loco-manipulation task.
  • This gait sequence determines, at any time t, whether the leg is in stance phase (contact), or swing phase (no contact).
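A time-based periodic contact schedule of this kind can be sketched as a simple phase test; the gait period and duty factor below are illustrative values within the stated 0.3 to 0.7 second range.

```python
def in_stance(t, leg, gait_period=0.5, duty=0.5):
    """Time-based periodic contact schedule for a biped: each leg is in
    stance (contact) for a `duty` fraction of the gait period, with the
    two legs offset by half a period."""
    phase = (t / gait_period + 0.5 * leg) % 1.0
    return phase < duty
```

With a 50% duty factor the two legs alternate, so exactly one leg is in stance at any instant of a walking gait.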
  • for each leg m ∈ {0, 1}, the ground reaction forces and moments are then optimized by the MPC.
  • the Cartesian-space PD controller 422 commands the foot to swing to a desired location for the next step.
  • the arm assemblies 124 and 126 are primarily employed for object manipulation and handling.
  • Stance foot control inputs u = [F_stance; M_stance], swing foot forces F_swing, and arm forces F_arm are then mapped to corresponding joint torques τ in the low-level control board 430 by leg and arm Jacobian matrices. Subsequently, the joint torque commands are sent to the motor control boards of the actuators 434 for executing the motion.
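The Jacobian-transpose mapping from Cartesian forces to joint torques can be sketched as below; the 2-DoF planar leg and its link lengths are hypothetical stand-ins for the robot's actual 5-DoF leg kinematics.

```python
import numpy as np

def forces_to_torques(J, wrench):
    """Map a Cartesian wrench (stacked forces/moments) to joint
    torques through the contact Jacobian: tau = J^T * wrench."""
    return np.asarray(J).T @ np.asarray(wrench)

def planar_leg_jacobian(q1, q2, l1=0.22, l2=0.22):
    """Foot-position Jacobian of a hypothetical 2-DoF planar leg with
    link lengths l1, l2 (illustrative values)."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])
```

The same tau = Jᵀ·F pattern applies per limb, with the stance wrench, swing force, or arm force supplying the Cartesian input.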
  • a fused linear state estimator is used that receives readings of an onboard inertial measurement unit (IMU) and the controller 426 (the Intel T265 VIO).
  • the sensor fusion allows the example robot 100 to obtain accurate and high-frequency robot state feedback.
  • the robotic control system architecture 400 may be configured to operate autonomously, semi-autonomously, and/or using directions provided by user(s).
  • the robotic system 400 may be implemented in various forms, such as a biped robot like the robot 100, a wheel-legged robot, a quadruped robot, or some other arrangement.
  • the example robot 100 may further include mechanical components and/or electrical components. Nonetheless, the robot 100 is shown for illustrative purposes, and may include more or fewer components.
  • the various components of robot 100 may be connected in any manner, including wired or wireless connections. Further, in some examples, components of the robot 100 may be distributed among multiple physical entities rather than a single physical entity. Other example illustrations of robot 100 may exist as well.
  • the controller(s) 426 may operate as one or more general-purpose hardware processors or special purpose hardware processors (e.g., digital signal processors, application specific integrated circuits, etc.).
  • the controller(s) 426 may be configured to execute computer-readable program instructions, and manipulate data, both of which are stored in a data storage device.
  • the controller(s) 426 may also directly or indirectly interact with other components of the robotic system 400, such as sensor(s) 432, power source(s) 436, actuators 434, transceivers 438, mechanical components, and/or electrical components.
  • the transceiver 438 may be used to communicate data or receive command signals with an external device.
  • a data storage device on board the robot 100 may be one or more types of hardware memory.
  • the data storage may include or take the form of one or more computer- readable storage media that can be read or accessed by controller(s) 426.
  • the one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic, or another type of memory or storage, which can be integrated in whole or in part with controller(s) 426.
  • the data storage can be a single physical device. In other implementations, the data storage can be implemented using two or more physical devices, which may communicate with one another via wired or wireless communication.
  • the data storage may include the computer- readable program instructions and the data.
  • the data may be any type of data, such as configuration data, sensor data, and/or diagnostic data, among other possibilities.
  • the controller 426 may include one or more electrical circuits, units of digital logic, computer chips, and/or microprocessors that are configured (perhaps among other tasks) to interface between any combination of the mechanical components, the sensor(s) 432, the power source(s) 436, the electrical components, the control system 414, and/or a user of the robot 100.
  • the remote controller 410 may be a purpose-built embedded device for performing specific operations with one or more subsystems of the robotic device 100.
  • the remote controller 410 may monitor and physically change the operating conditions of the robotic system 400.
  • the controller 426 may serve as a link between portions of the robotic system 400, such as between mechanical components and/or electrical components.
  • the controller 426 may serve as an interface between the robotic system 400 and another computing device.
  • the controller 426 may serve as an interface between the robotic system 400 and a user.
  • the controller 426 may include various components for communicating with the robotic system 400, including a joystick, buttons, and/or ports, etc.
  • the example interfaces and communications noted above may be implemented via a wired or wireless connection, or both.
  • the controller 426 may perform other operations for the robotic system 400 as well.
  • the controller 426 may communicate with other systems of the robot 100 via wired or wireless connections, and may further be configured to communicate with one or more users of the example robot 100.
  • the controller 426 may receive an input (e.g., from a user or from another robot) through the transceiver 438 indicating an instruction to perform a particular gait in a particular direction, and at a particular speed.
  • a gait is a pattern of movement of the limbs of an animal, robot, or other mechanical structure.
  • the controller 426 may perform operations to cause the example robot 100 to move according to the requested gait.
  • the controller 426 may receive an input indicating an instruction to move to a particular geographical location.
  • the controller 426 may determine a direction, speed, and/or gait based on the environment through which the robotic system 400 is moving en route to the geographical location.
  • Operations of the control system in the control module 414 may be carried out by the controller(s) 426. Alternatively, these operations may be carried out by the remote controller 410, or a combination of the controller(s) 426 and the remote controller 410.
  • the control module 414 may partially or wholly reside on a device other than the robotic system 400, and therefore may at least in part control the robotic system 400 remotely.
  • Mechanical components represent hardware of the example robot architecture 400 that may enable the robot 100 to perform physical operations.
  • the robotic system 400 may include physical members such as wheeled legs, leg(s), arm(s), and/or wheel(s).
  • the physical members or other parts of robotic system 400 may further include actuators such as motors arranged to move the physical members in relation to one another.
  • the robotic system 400 may also include one or more structured bodies for housing the control module 414 and/or other components, and may further include other types of mechanical components.
  • the particular mechanical components used in a given robot may vary based on the design of the robot, and may also be based on the operations and/or tasks the robot may be configured to perform.
  • the mechanical components may include one or more removable components.
  • the robotic system 400 may be configured to add and/or remove such removable components, which may involve assistance from a user and/or another robot.
  • the robotic system 400 may be configured with removable arms, hands, feet, and/or legs, so that these appendages can be replaced or changed as needed or desired.
  • the robotic system 400 may include one or more removable and/or replaceable battery units or sensors.
  • the robotic system 400 may include sensor(s) 432 arranged to sense aspects of the robotic system 400.
  • the sensor(s) 432 may include one or more force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors, proximity sensors, motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras, among other possibilities.
  • the robotic system 400 may be configured to receive sensor data from sensors that are physically separated from the robot (e.g., sensors that are positioned on other robots or located within the environment in which the robot is operating).
  • the sensor(s) 432 may provide sensor data to the controller(s) 426 (perhaps by way of data) to allow for interaction of the robotic system 400 with its environment (e.g., surrounding terrain), as well as monitoring of the operation of the robotic system 400.
  • the sensor data may be used in evaluation of various factors for activation, movement, and deactivation of mechanical components and electrical components by the control module 414.
  • the sensor(s) 432 may capture data corresponding to the terrain of the environment or location of nearby objects, which may assist with environment recognition and navigation.
  • sensor(s) 432 may include RADAR (e.g., for long-range object detection, distance determination, and/or speed determination), LIDAR (e.g., for short-range object detection, distance determination, and/or speed determination), SONAR (e.g., for underwater object detection, distance determination, and/or speed determination), VICON® (e.g., for motion capture), one or more cameras (e.g., stereoscopic cameras for 3D vision), a global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment in which the robotic system 400 is operating.
  • the sensor(s) 432 may monitor the environment in real time, and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other aspects of the environment.
  • the robotic system 400 may include sensor(s) 432 configured to receive information indicative of the state of the robotic system 400, including sensor(s) 432 that may monitor the state of the various components of the robotic system 400.
  • the sensor(s) 432 may measure activity of systems of the robotic system 400 and receive information based on the operation of the various features of the robotic system 400, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic system 400.
  • the data provided by the sensor(s) 432 may enable the control module 414 to determine errors in operation as well as monitor overall operation of components of the robotic system 400.
  • the robotic system 400 may use force sensors to measure load on various components of the robotic system 400.
  • the robotic system 400 may include one or more force sensors on an arm or a leg to measure the load on the actuators that move one or more members of the arm or leg.
  • the robotic system 400 may use one or more position sensors to sense the position of the actuators of the robotic system and thus the joint angles of wheeled legs. For instance, such position sensors may sense states of extension, retraction, or rotation of the actuators on arms or legs.
  • the sensor(s) 432 may include one or more velocity and/or acceleration sensors.
  • the sensor(s) 432 may include an inertial measurement unit (IMU).
  • the IMU may sense velocity and acceleration in the world frame, with respect to the gravity vector. The velocity and acceleration sensed by the IMU may then be translated to that of the robotic system 400 based on the location of the IMU in the robotic system 400 and the kinematics of the robotic system 400.
  • the robotic system 400 may include other types of sensors not explicitly discussed herein. Additionally or alternatively, the robotic system may use particular sensors for purposes not enumerated herein.
  • the robotic system 400 may also include one or more power source(s) 436 configured to supply power to various components of the robotic system 400.
  • the robotic system 400 may include a hydraulic system, electrical system, batteries, and/or other types of power systems.
  • the robotic system 400 may include one or more batteries configured to provide charge to components of the robotic system 400.
  • Some of the mechanical components and/or electrical components may each connect to a different power source, may be powered by the same power source, or be powered by multiple power sources.
  • Any type of power source may be used to power the robotic system 400, such as electrical power or a gasoline engine.
  • the robotic system 400 may include a hydraulic system configured to provide power to the mechanical components using fluid power. Components of the robotic system 400 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example.
  • the hydraulic system may transfer hydraulic power by way of pressurized hydraulic fluid through tubes, flexible hoses, or other links between components of the robotic system 400.
  • the power source(s) 436 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples.
  • the electrical components may include various mechanisms capable of processing, transferring, and/or providing electrical charge or electric signals.
  • the electrical components may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic system 400.
  • the electrical components may interwork with the mechanical components to enable the robotic system 400 to perform various operations.
  • the electrical components may be configured to provide power from the power source(s) 436 to the various mechanical components, for example.
  • the robotic system 400 may include electric motors. Other examples of electrical components may exist as well.
  • the housing of the trunk enclosure 110 is connected to or houses appendages and components of the example robot 100 such as the arm assemblies 124 and 126 in FIG. 1.
  • the structure of the body may vary within examples and may further depend on particular operations that a given robot may have been designed to perform. For example, a robot developed to carry heavy loads may have a wide body that enables placement of the load. Similarly, a robot designed to reach high speeds may have a narrow, small body that does not have substantial weight.
  • the body and/or the other components may be developed using various types of materials, such as metals or plastics.
  • a robot may have a body with a different structure or made of various types of materials.
  • the body and/or the other components may include or carry the sensor(s) 432. These sensors 432 may be positioned in various locations on the example robot 100, such as on the body and/or on one or more of the appendages, among other examples.
  • the example robot 100 may carry a load, such as a type of cargo that is to be transported.
  • the load may also represent external batteries or other types of power sources (e.g., solar panels) that the example robot 100 may utilize.
  • Carrying the load represents one example use for which the example robot 100 may be configured, but the example robot 100 may be configured to perform other operations as well.
  • the example robot 100 may include various types of legs, arms, wheels, and so on.
  • the example robot 100 may be configured with one or more legs.
  • an implementation of the example robot 100 with one or more legs may additionally include wheels, treads, or some other form of locomotion.
  • An implementation of the robotic system with two legs may be referred to as a biped, and an implementation with four legs may be referred to as a quadruped. Implementations with six, eight, ten, or more legs are also possible.
  • FIG.5A shows the range of motion for the example robot 100 from different perspective views.
  • FIG. 5A shows a front view of the robot 100 with the arm assembly 124 having a range of 120 degrees via rotating the shoulder roll actuator 324 and the leg assembly 120 having a 60 degree range of motion via rotating the thigh actuator 234.
  • a side view shows the shoulder joint 310 of the arm assembly 124 having a 360 degree range of motion rotated by the shoulder pitch actuator 322.
  • the forearm link assembly 316 has a 300 degree range of motion around the elbow joint 314 as rotated by the elbow actuator 326.
  • the thigh assembly 214 has a 130 degree range of motion when rotated by the hip joint actuator 232.
  • the calf link assembly 218 has a 130 degree range of motion around the knee joint 216 when rotated by the knee actuator 238.
  • a top view 520 shows that the arm assembly 124 has a yaw range of 165 degrees rotated by the shoulder yaw actuator 320.
  • FIG. 5B shows different positions of the thigh assembly 214 and calf link assembly 218 when rotated by the respective actuators 234 and 238 during a swing operation by the leg assembly 120.
  • FIG. 5C shows different positions of the thigh assembly 214 and calf link assembly 218 when rotated by the respective actuators 234 and 238 during a stepping operation by the leg assembly 120.
  • a first position 510 shows the leg assembly 120 in a stance position.
  • the second position 512 shows the leg assembly 120 in a swing position that swings the calf link assembly 218, and a third position 514 shows the full swing position.
  • a fourth position 516 shows the calf link assembly 218 swinging back in position.
  • a final position 528 shows the leg assembly 120 advancing to a new stance position.
  • the SRBM of the example robot 100 for locomotion encompasses the body, shoulders, hips, and upper thighs, which are all treated as a single combined rigid body. This combined rigid body captures the main dynamic effect of the system.
  • the design of the example robot is centered around the assumption of rigid body dynamics, whereby the mass of the remaining arm and leg components constitutes only 15% of the total robot mass. As a result, inertia effects of these components have been disregarded in the dynamics model.
  • a first view 600 shows the position of the legs of the example robot 100. The location of the ground reaction force and moment is assumed to be a contact point on each foot, at the projection of the ankle joint location on the foot along the z-coordinate.
  • a second view 610 shows a robot model assumed as a single rigid body model neglecting leg dynamics. The view also shows the location of a center of mass 612 and arrows 614 representing inputs of ground reaction forces and moments at the contact points of the legs.
  • a third view 620 shows a force and moment model, showing arrows 622 representing a detailed definition of selection of forces and moments of inertia on the legs from the center of mass and the legs.
  • the SRBM is further evolved to consider the dynamic effect of the object on the humanoid robot, to form the example l-SRBM.
  • Loco-manipulation and dynamic handling of heavy loads represent significant challenges in humanoid hardware and control.
  • the dynamics of objects are not included in the humanoid dynamics model due to the variability in dynamics among different objects. Consequently, it becomes essential to capture the fundamental dynamics of the object to simplify and enable efficient and straightforward online utilization.
  • FIG.6B shows a series of stance diagrams for modification of the SRBM by accounting for a payload held by the arms of the example robot.
  • a first view 650 shows the robot 100 with a payload 640 carried by the arm assemblies 124 and 126.
  • a second view 660 shows a robot model with the payload 640 assumed as a single rigid body model neglecting leg dynamics. The view also shows the location of a center of mass 662 for the robot, a center of mass 664 for the payload, and arrows 666 representing inputs of ground reaction forces and moments at the contact points of the legs.
  • a third view 670 shows a force and moment model, showing arrows 672 representing a detailed definition of selection of forces and moments of inertia on the legs from the center of mass and the legs. Another arrow 674 represents force from the payload 640.
  • the humanoid SRBM was expanded to encompass dynamic object carrying and loco-manipulation by integrating the simplified object dynamics into the simplified dynamics of the example robot 100.
  • a comparison was made between two different formalisms of load dynamics in SRBM, namely the Combined Rigid Body Model and the External Force Model.
  • the Combined Rigid Body Model treats the new SRBM as the combined rigid body of the robot and the object and updates the system dynamics with respect to the new combined CoM and combined mass m_c.
  • the External Force Model treats the object dynamics as external gravitational forces/weights applied to the robot SRB CoM.
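The two load-dynamics formalisms compared above can be sketched side by side; the function names and masses are illustrative, not the patent's notation.

```python
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def combined_com(m_robot, p_robot, m_obj, p_obj):
    """Combined Rigid Body Model: treat robot + object as one rigid
    body and recompute the combined mass and CoM."""
    m_c = m_robot + m_obj
    p_c = (m_robot * np.asarray(p_robot)
           + m_obj * np.asarray(p_obj)) / m_c
    return m_c, p_c

def payload_weight_force(m_obj):
    """External Force Model: keep the robot SRB unchanged and apply
    the object's weight as an external force at the object's CoM."""
    return np.array([0.0, 0.0, -m_obj * G])
```

The Combined model shifts the CoM and inertia every time the load changes, while the External Force model leaves the robot model fixed and only adds a wrench, which is simpler to update online.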
  • Equation (3) represents the moment equilibrium of the rigid body dynamics.
  • GI is the rigid body moment of inertia (MoI) in the world frame, which is obtained from the body frame MoI BI and the body rotation matrix R as GI = R · BI · Rᵀ.
  • one distance vector runs from the robot CoM pc to the nth payload CoM, and another distance vector runs from the robot CoM to the mth contact point pf,m.
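The world-frame inertia computation is a similarity transform by the body rotation matrix; a minimal sketch:

```python
import numpy as np

def world_inertia(R, I_body):
    """Rotate the body-frame moment of inertia into the world frame:
    G_I = R * B_I * R^T."""
    R = np.asarray(R)
    return R @ np.asarray(I_body) @ R.T
```

For a diagonal body-frame inertia, a 90 degree yaw rotation simply swaps the x and y principal moments, which is a quick sanity check on the transform.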
  • the example Multi-contact Force-and-moment-based MPC control framework is developed by the selection of MPC states, assumptions to linearize the state-space dynamics, MPC formalism, and low-level control associated with MPC.
  • the example control allows the MPC to serve as the primary control scheme, fully harnessing the capabilities of the proposed l- SRBM and contact schedules for dynamic loco-manipulation.
  • the State-space Dynamics Linearization is detailed below.
  • binary contact variables for the two feet and for the load represent the time-based contact schedules (1 for contact, 0 for non-contact).
  • the robot state x was modified, and a new MPC state variable selection was introduced that appends a constant 1 to the state vector.
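A common reason for appending a constant-1 state in convex locomotion MPC (a standard trick, assumed here to be the purpose) is to fold the affine gravity term into the linear dynamics; a generic sketch:

```python
import numpy as np

def augment_dynamics(A, B, g_vec):
    """Append a constant-1 state so the affine term becomes part of
    the linear dynamics:
        x_{k+1} = A x_k + B u_k + g
    becomes
        x_aug_{k+1} = A_aug x_aug_k + B_aug u_k,  x_aug = [x; 1]."""
    n, m = A.shape[0], B.shape[1]
    A_aug = np.zeros((n + 1, n + 1))
    A_aug[:n, :n] = A
    A_aug[:n, n] = g_vec   # gravity enters through the constant state
    A_aug[n, n] = 1.0      # the constant state stays equal to 1
    B_aug = np.vstack([B, np.zeros((1, m))])
    return A_aug, B_aug
```

With this augmentation the dynamics constraint stays strictly linear, which keeps the MPC a convex QP.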
  • Equation (12) J1 represents the cost of driving states to the desired trajectory based on the user commands.
  • Equation (13a) represents the discrete dynamics constraints derived from equation (7-11).
  • Equation (13b) describes the contact point friction constraint, which follows an inscribed friction pyramid approximation, μ̃ = (√2/2)μ, for more conservative lateral foot forces, where μ is the contact friction coefficient of the ground.
  • Equations (13d-13f) represent the line-foot dynamics constraints, where R_f,m denotes the foot rotation matrix w.r.t. the world frame.
  • the representation of the foot contact on the example robot 100 is simplified to a line foot contact, which is also a common assumption on humanoids with 5-DoF legs.
  • the Contact Wrench Cone was modified to enforce: 1) the x-direction ground reaction moment is zero, shown in equation (13d); 2) prevention of toe/heel lift due to the y-direction ground reaction moment, shown in equation (13e); and 3) friction constraints for y-direction reaction forces at line-foot heel and toe locations, shown in equation (13f).
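The inscribed friction pyramid used in the contact constraints can be checked numerically; scaling the friction coefficient by √2/2 shrinks the pyramid so it fits entirely inside the friction cone, making the lateral force bounds conservative. The check below is a sketch of the constraint, not the patent's QP formulation.

```python
import numpy as np

def friction_pyramid_ok(F, mu):
    """Check a ground reaction force against the inscribed friction
    pyramid: |Fx| <= mu_in * Fz, |Fy| <= mu_in * Fz, Fz >= 0,
    with mu_in = (sqrt(2)/2) * mu."""
    mu_in = np.sqrt(2.0) / 2.0 * mu
    Fx, Fy, Fz = F
    return Fz >= 0 and abs(Fx) <= mu_in * Fz and abs(Fy) <= mu_in * Fz
```

In the MPC these appear as linear inequality constraints on the force decision variables rather than as a post-hoc check.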
  • the MPC-QP problem was then condensed for real-time computation. Although the extended prediction horizon embedded in the proposed MPC results in large matrix sizes that demand significant computational resources for online optimal control, the convex MPC formulation can be optimized efficiently as a Quadratic Programming (QP) problem.
  • a QP problem definition is:
      min_z  ½ zᵀHz + zᵀg        (14)
      s.t.   A_eq z = b_eq       (15)
             A_in z ≤ b_in      (16)
  • the problem (eqn. (12)) may be expressed with the dynamics constraints (eqn. (13a)) in this condensed QP form.
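The condensation step that shrinks the MPC into a dense QP can be sketched by rolling the linear dynamics out over the horizon, eliminating the predicted states as decision variables; the matrices below are a generic illustration, not the patent's exact formulation.

```python
import numpy as np

def condense(A, B, h):
    """Roll out x_{k+1} = A x_k + B u_k over h steps so that the
    stacked prediction X = Sx * x0 + Su * U; the MPC cost then
    becomes a dense QP in the input sequence U alone."""
    n, m = A.shape[0], B.shape[1]
    Sx = np.zeros((h * n, n))
    Su = np.zeros((h * n, h * m))
    Ak = np.eye(n)
    for k in range(h):
        Ak = A @ Ak                              # A^{k+1}
        Sx[k * n:(k + 1) * n, :] = Ak
        for j in range(k + 1):
            Su[k * n:(k + 1) * n, j * m:(j + 1) * m] = (
                np.linalg.matrix_power(A, k - j) @ B)
    return Sx, Su
```

Eliminating the state variables trades sparsity for a much smaller decision vector, which is what drives the solve-time reductions reported below.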
  • comparing the MPC solve times of the non-condensed and condensed formulations on the example robot hardware (a PC with an AMD Ryzen 5600X processor at 4.2 GHz running Ubuntu 20.04) shows a clear reduction in time.
  • using the qpOASES C++ solver, the non-condensed standing and walking solve times were 19.9 ms and 12.9 ms, while the condensed standing and walking solve times were 3.2 ms and 1.3 ms.
  • using quadprog in MATLAB, the non-condensed standing and walking solve times were 432.5 ms and 367.2 ms, while the condensed standing and walking solve times were 99.3 ms and 24.9 ms.
  • the swing force is acquired by a Cartesian-space PD control law driving the swing foot to a desired heuristic foot position and velocity (eqn. (25)), where Δt denotes the time duration of each foot swing and the scalar k_c represents the velocity feedback gain used to achieve the desired walking speed. While holding objects with the arms, the upper body joint commands are also generated by the Cartesian-space control and the contact (hand) Jacobian.
  • Numerical simulations were conducted to test the example MPC control architecture.
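The heuristic foot-placement policy can be sketched in the common Raibert-style form shown below; the gain value is illustrative and the patent's exact heuristic may differ.

```python
import numpy as np

def heuristic_foot_target(p_hip, v, v_cmd, dt_swing, k_c=0.03):
    """Raibert-style foot placement heuristic: place the foot half a
    stance duration ahead of the hip, plus a velocity-feedback
    correction toward the commanded walking speed."""
    p_hip, v, v_cmd = map(np.asarray, (p_hip, v, v_cmd))
    return p_hip + 0.5 * dt_swing * v + k_c * (v - v_cmd)
```

When the body already moves at the commanded speed, the correction term vanishes and the foot lands a neutral half-stance ahead of the hip.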
  • the example controllers were written in MATLAB scripts.
  • the qpOASES in the CasADi toolbox was used to solve the MPC problems online.
  • the MPC, state estimation, and low- level control scripts were written in C++.
  • the MPC problem was solved through qpOASES C++ interface.
  • a first image 710 shows the robot carrying a mass of 0.5 kg
  • a second image 712 shows the robot carrying a mass of 2.0 kg
  • a third image 714 shows the robot carrying a mass of 4.0 kg
  • a fourth image 716 shows the robot carrying a mass of 3.0 kg
  • a fifth image 718 shows the robot carrying a mass of 1.5 kg
  • a final image 720 shows the robot carrying a mass of 0.5 kg.
  • the images 710-720 show a sequence of overlaid simulation snapshots of the motion of the example robot carrying a payload with a time-varying mass.
  • Associated torque plots for the robot with the masses in FIG. 7A are shown in a graph 750 in FIG. 7B.
  • the graph 750 shows torques of the right leg of the example robot from the simulation.
  • a plot 752 shows the hip yaw torque
  • a plot 754 shows the hip roll torque
  • a plot 756 shows the thigh torque
  • a plot 758 shows the knee torque
  • a plot 760 shows the ankle torque.
  • a dashed line 770 shows the torque limit for the knee joint
  • a dashed line 772 shows a torque limit for the rest of the joints in the right leg.
  • a 6-axis IMU is integrated in the trunk enclosure 110 in FIG. 1, positioned near the center of mass (CoM) of the robot 100 for precise measurements.
  • an extended Kalman filter processes the IMU data and leg kinematics at a high refresh rate of 1 kHz.
  • an Intel Realsense T265 tracking camera is employed. This camera, through its onboard VIO algorithm, outputs the position and orientation of the body at a slower 200 Hz rate. The system ensures accurate and robust state estimation for the robot 100 by fusing the data from both sources.
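The two-rate fusion idea (1 kHz IMU/leg-kinematics prediction, 200 Hz VIO correction) can be illustrated with a deliberately minimal 1-D blend; the gain alpha is illustrative and this is a sketch of the concept, not the robot's actual extended Kalman filter.

```python
def fuse_position(p_est, v_imu, dt, p_vio=None, alpha=0.05):
    """Minimal 1-D sketch of fused state estimation: integrate the
    high-rate IMU velocity every step, and blend in the slower VIO
    position fix whenever one arrives."""
    p_pred = p_est + v_imu * dt              # 1 kHz prediction step
    if p_vio is not None:                    # 200 Hz correction step
        p_pred = (1.0 - alpha) * p_pred + alpha * p_vio
    return p_pred
```

In the actual system the prediction and correction cover the full body pose and velocity, and the blend gain is replaced by the Kalman gain computed from the filter covariances.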
  • the example robot 100 is equipped with an Intel Realsense D435i camera mounted in the sensor module 128.
  • This camera is designed to capture depth data and other physical environment information.
  • only minimal control-objective weight/gain tuning on the hardware was needed for the example robot to achieve the presented results.
  • FIGs. 8A-8C shows a series of loco-manipulation experiment snapshots.
  • FIG. 8A shows the example robot 100 walking over stacked wood slats while holding a 2.5 kg payload.
  • FIG.8B shows the example robot 100 walking while carrying a 0.9 kg payload at each arm assembly.
  • FIG.8C shows dynamic turning performed by the example robot 100 in place while holding a 2 kg payload under the control of the example MPC control system.
  • FIG. 12A is a graph 1200 showing velocity tracking and height of the example robot 100 while walking.
  • the graph 1200 includes a plot of x-direction velocity 1210 and a plot of the height 1212 of the example robot 100 over time while walking under a command represented by a dashed line 1214.
  • FIG.12B is a graph 1250 of joint torques of the example robot 100 during a walk at a velocity of 1 m/s.
  • the graph 1250 shows torques of the right leg of the example robot from locomotion.
  • a plot 1252 shows the hip yaw torque
  • a plot 1254 shows the hip roll torque
  • a plot 1256 shows the thigh torque
  • a plot 1258 shows the knee torque
  • a plot 1260 shows the ankle torque.
  • a dashed line 1270 shows the torque limit for the knee joint
  • a dashed line 1272 shows a torque limit for the rest of the joints in the right leg.
  • FIG.12B shows that there is noticeable torque headroom for more dynamic motions.
  • dynamic loco-manipulation may be performed by the example robot 100.
  • FIG. 8A shows snapshots of an example test robot without arms walking over uneven and unstable terrain with a 2.5 kg payload. The associated MPC solution and torque command plots are presented in FIG. 13A.
  • FIG.13A shows a graph 1300 that plots the three left foot forces (Fx, Fy, and Fz) shown in FIG. 6B represented by plots 1302, 1304, and 1306.
  • a graph 1310 plots the three left foot forces (Fx, Fy, and Fz) shown in FIG.6B represented by plots 1312, 1314, and 1316.
  • a graph 1320 plots the three left foot moments (Mx, My, and Mz) shown in FIG. 6B represented by plots 1322, 1324, and 1326.
  • a graph 1330 plots the three left foot moments (Mx, My, and Mz) shown in FIG.6B represented by plots 1332, 1334, and 1336.
  • the graphs in FIG.13A showcase the robustness of solutions in force constraint satisfaction and potential for further dynamic motions.
  • FIG. 13B shows two torque plots 1350 and 1360 for the test robot walking.
  • the plots 1350 and 1360 show torques of the right leg and left leg of the example robot during walking.
  • a plot 1362 shows the hip yaw torque
  • a plot 1364 shows the hip roll torque
  • a plot 1366 shows the thigh torque
  • a plot 1368 shows the knee torque
  • a plot 1370 shows the ankle torque.
  • a dashed line 1380 shows the torque limit for the knee joint
  • a dashed line 1382 shows a torque limit for the rest of the joints in the respective legs.
  • the example biped robot can perform in-place stepping in various terrain setups, including an indoor floor, outdoor hard concrete ground, and soft outdoor grass ground, as shown in the snapshots of FIG. 15. Experiments of walking forward and sideways on hard ground are also shown in FIG. 15. In handling terrain perturbations, the example biped robot is capable of walking over grass terrain as well as uneven surfaces, as shown in FIG. 16.
  • the terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
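The multi-rate state estimation described above (a 1 kHz extended Kalman filter over IMU and leg-kinematics data, corrected by 200 Hz camera pose output) can be illustrated with a minimal one-dimensional sketch. This is not the robot's actual estimator: the scalar filter structure, the noise variances, and the constant-velocity scenario are illustrative assumptions only, standing in for the full extended Kalman filter.

```python
# Illustrative sketch (NOT the robot's actual estimator): a 1-D Kalman-style
# filter that predicts at 1 kHz from IMU/leg-kinematics velocity and corrects
# every 5th step with a 200 Hz camera position measurement. All variances
# and the constant-velocity scenario are hypothetical.

def predict(x, p_var, v, dt, q_var):
    """Propagate the position estimate with velocity from IMU/leg kinematics."""
    return x + v * dt, p_var + q_var

def correct(x, p_var, z, r_var):
    """Fuse a camera position measurement z with noise variance r_var."""
    k = p_var / (p_var + r_var)          # scalar Kalman gain
    return x + k * (z - x), (1.0 - k) * p_var

def run(steps=1000, dt=0.001, v=1.0):
    """Simulate 'steps' filter cycles at 1 kHz with ideal measurements."""
    x, p_var = 0.0, 1.0
    for i in range(1, steps + 1):
        x, p_var = predict(x, p_var, v, dt, q_var=1e-6)   # every 1 ms
        if i % 5 == 0:                   # camera pose arrives at 200 Hz
            z = i * dt * v               # ideal measurement of true position
            x, p_var = correct(x, p_var, z, r_var=1e-4)
    return x
```

Under these assumptions, after one second of walking at 1 m/s the fused estimate converges to the true 1.0 m displacement; the same predict/correct split generalizes to the full-state filter running on the robot.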

Landscapes

  • Engineering & Computer Science (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Manipulator (AREA)

Abstract

A bipedal robot having five degree of freedom leg assemblies is disclosed. The example leg assemblies each have a hip joint; a thigh link assembly rotatably coupled to the hip joint; and a knee joint coupled to the thigh link assembly. A calf link assembly is rotatably coupled to the knee joint. An ankle joint is coupled to the calf link assembly. A set of actuators is configured to control movement of the leg assembly. At least some of the actuators are positioned near the hip joint to concentrate mass of the leg assembly proximate to the hip joint. A transmission system is configured to transmit force from at least one of the actuators to either the knee joint or the ankle joint of the leg assembly.

Description

Attorney Docket No.065715-000161WOPT (2024-016-02) A BIPEDAL ROBOT FOR DYNAMIC AND ROBUST LOCATION IN DIVERSE ENVIRONMENTS PRIORITY CLAIM [0001] The present application claims the benefit of and priority to U.S. Provisional Application No. 63/579,166 filed on August 28, 2023. The contents of that application are hereby incorporated by reference in their entirety. TECHNICAL FIELD [0002] The present application generally relates to a bipedal robot with a leg actuator and linkage assembly design that allow for dynamic movement in diverse environments. BACKGROUND [0003] All publications herein are incorporated by reference to the same extent as if each individual publication or patent application was specifically and individually indicated to be incorporated by reference. The following description includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art. [0004] The recent technological and theoretical developments in both robot design and controls have allowed the world to witness many successful and highly autonomous legged robots. With such hardware and software advancements, researchers in the robotics field are now facing a challenge to develop mobile legged robots that can conduct given tasks fully autonomously. Another challenge is providing a control framework that can perform robustly in terrains with uneven surfaces with obstacles. [0005] In many real-life applications, robots are required to complete long-range operations over complex terrains with high energy efficiency. Many bipedal and quadruped robots have demonstrated outstanding maneuverability and dynamic locomotion in unknown terrain over the last decade. These robots have proven to have great potential to be controlled autonomously. 
Legged robots rely on a gait sequence and proper foot placement to overcome obstacles and uneven surfaces. Due to their morphology, legged robots have a unique capability to navigate rough terrain. They can leverage different gait sequences by lifting and swinging certain legs while walking to place their feet in strategic positions and overcome high obstacles or uneven surfaces. Legged robots have also demonstrated the ability to jump onto an obstacle three or four times higher than their body height. [0006] Commercial and sociological interests widely promote the motivation for studying bipedal robots. The desired outcomes of bipedal robot applications range from assisting or replacing humans in various situations, including disaster response missions, warehousing, and rehabilitation. These applications require bipedal robots to be highly dynamic and capable of handling external disturbances and uneven terrain. [0007] Thus, there is a need for a bipedal robot with a leg design that allows highly dynamic movement. There is another need for a bipedal robot that concentrates multiple actuators near the thigh of the legs to allow for the use of a Model Predictive Control (MPC) based controller for efficient and fast operation of the legs of the robot. SUMMARY [0008] One disclosed example is a bipedal robot having a torso and a first and a second leg coupled to the torso. Each leg has an abduction adduction actuator attached to the torso. Each leg has a hip actuator rotatable by the abduction adduction actuator. A thigh actuator is rotatable by the hip actuator. A thigh link assembly is rotatably coupled to the hip actuator. A knee actuator is coupled to the thigh link assembly in proximity with the thigh actuator. An ankle actuator is coupled to one end of the thigh link assembly. A knee joint is coupled to an opposite end of the thigh link assembly.
A calf link assembly is rotatably coupled to the knee joint and knee actuator. The knee actuator rotates the calf link assembly relative to the thigh link assembly. An ankle joint is coupled to the calf link assembly and the ankle actuator. A controller is coupled to the actuators to control dynamic movement of the first and second legs to propel the robot. [0009] A further implementation of the example robot is where the legs each include a foot structure coupled to the ankle joint. Another implementation is where the legs each include a wheel coupled to the ankle joint. Another implementation is where the example robot includes a camera coupled to the controller, wherein the controller provides state estimation and processes vision data from the camera. Another implementation is where the example robot includes a sensor coupled to the controller, wherein the controller stores data sensed by the sensor. Another implementation is where the torso includes an enclosure with a power source and a payload compartment. Another implementation is where each of the actuators is an identical motor. Another implementation is where the example robot includes an input to accept a command from an input device for the robot to traverse terrain. Another implementation is where the input device is one of a human input controller, an autonomous controller, or a semi-autonomous controller. Another implementation is where the thigh link assembly and calf link assembly include one of laser cut parts, computer numerical control (CNC) manufactured parts or 3D printed parts. Another implementation is where the example robot includes a belt and pulley coupling the knee actuator to the knee joint to rotate the calf link assembly. Another implementation is where the example robot includes a linkage assembly coupling the ankle actuator to the ankle joint.
Another implementation is where the controller controls the dynamic movement of the legs based on a Model Predictive Control (MPC). Another implementation is where the MPC is based on a single rigid body dynamics model of the robot allowing linearization of the single rigid body dynamics model. Another implementation is where the MPC is applied to control the dynamic movement based on properties of the ground reaction forces and moments of inertia on the first and second legs. Another implementation is where the controller includes a multi-contact controller having an input of a contact schedule of contacts by the legs during the dynamic movement to provide stance control of the legs. Another implementation is where the multi-contact controller optimizes ground reaction forces and moments of the first and second legs. Another implementation is where the controller includes a Cartesian PD controller that provides swing control of the legs. Another implementation is where the example robot includes a first and second arm assembly mounted to the torso, wherein the first and second arm assemblies are controlled by the controller. Another implementation is where the first and second arm assembly each include a shoulder yaw actuator attached to the torso; a shoulder pitch actuator rotatable by the shoulder yaw actuator; and a shoulder roll actuator rotatable by the shoulder pitch actuator. Each of the arm assemblies has an upper arm assembly coupled to the shoulder pitch actuator. An elbow actuator is coupled to one end of the upper arm assembly. An elbow joint is coupled to the upper arm assembly. A forearm assembly rotates around the elbow joint. An elbow transmission couples the elbow joint to the elbow actuator. [0010] Another disclosed example is a leg assembly for a robot. The leg assembly includes a hip joint and a thigh link assembly rotatably coupled to the hip joint.
A knee joint is coupled to the thigh link assembly. A calf link assembly is rotatably coupled to the knee joint. An ankle joint is coupled to the calf link assembly. A plurality of actuators are configured to control movement of the leg assembly. At least some of the actuators are positioned near the hip joint to concentrate mass of the leg assembly proximate to the hip joint. A transmission system is configured to transmit force from at least one of the actuators to either the knee joint or the ankle joint of the leg assembly. [0011] A further implementation of the example leg assembly is where the plurality of actuators include an abduction adduction actuator coupled to the hip joint. The abduction adduction actuator is attachable to a torso of the robot. A hip actuator is coupled to the hip joint and the thigh link assembly. The hip actuator is rotatable by the abduction adduction actuator. A thigh actuator is coupled to one end of the thigh link assembly and the hip actuator. The thigh actuator is rotatable by the hip actuator. Another implementation is where the leg assembly includes a foot structure coupled to the ankle joint. Another implementation is where the transmission system includes a belt including one end rotated by a pulley. Another implementation is where the pulley is rotatably coupled to a knee actuator, and another end of the belt is rotatably coupled to the knee joint to transmit force to rotate the calf link assembly relative to the thigh link assembly. Another implementation is where the first transmission system is a linkage assembly. Another implementation is where the linkage assembly transmits force between an ankle actuator and the ankle joint. Another implementation is where each of the plurality of actuators are identical motors. Another implementation is where the thigh link assembly and calf link assembly include one of laser cut parts, computer numerical control (CNC) manufactured parts or 3D printed parts. 
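The belt-and-pulley and linkage transmissions summarized above obey simple ideal-drive relations, which the sketch below illustrates. The tooth counts are hypothetical (the disclosure does not specify pulley sizes); the torque and speed figures are merely representative actuator values used for arithmetic, and belt losses are ignored.

```python
# Hedged arithmetic sketch of an ideal (lossless) belt-and-pulley joint
# transmission. Tooth counts and actuator ratings below are ILLUSTRATIVE
# assumptions, not values taken from the disclosed robot.

def belt_transmission(motor_torque_nm, motor_speed_rads,
                      motor_pulley_teeth, joint_pulley_teeth):
    """Return (joint_torque, joint_speed) for an ideal belt drive.

    A larger joint pulley amplifies torque by the tooth ratio and reduces
    joint speed by the same ratio. A 1:1 stage (like a parallelogram
    linkage) leaves both torque and speed unchanged.
    """
    ratio = joint_pulley_teeth / motor_pulley_teeth
    return motor_torque_nm * ratio, motor_speed_rads / ratio

# Example: a hypothetical 18-tooth motor pulley driving a 36-tooth knee
# pulley doubles available knee torque while halving knee joint speed.
knee_torque, knee_speed = belt_transmission(33.5, 21.0, 18, 36)
```

This trade-off is why a belt stage suits the knee (which the simulations show needs extra torque) while the parallelogram ankle linkage can run at 1:1.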
[0012] Another disclosed example is a control system for a robot having a limb with an upper assembly and a lower assembly, and a plurality of actuators attached to the upper assembly. The control system includes a low level controller coupled to the plurality of actuators. A multi-contact controller has inputs of a contact schedule of contacts by the limb during dynamic movement, reaction forces on the limb, and moments of inertia on the limb to provide stance control of the limb from a Model Predictive Control (MPC) based on a single rigid body dynamics model of the robot. A Cartesian PD controller provides swing control of the limb from the MPC based on reaction forces from movement of the limb. [0013] A further implementation of the example control system is where the limb is an arm, and where the upper assembly is an upper arm, and the lower assembly is a forearm rotatably coupled to the upper arm. Another implementation is where the plurality of actuators allows four degrees of movement of the arm. Another implementation is where the limb is a leg, and where the upper assembly is a thigh link, and the lower assembly is a calf link rotatably coupled to the thigh link. Another implementation is where the plurality of actuators allows five degrees of movement of the leg. Another implementation is where the multi-contact controller optimizes ground reaction forces and moments of the leg. [0014] The above advantages and other advantages, and features of the present description will be readily apparent from the following Detailed Description when taken alone or in connection with the accompanying drawings. It should be understood that the summary above is provided to introduce in simplified form a selection of concepts that are further described in the detailed description.
It is not meant to identify key or essential features of the claimed subject matter, the scope of which is defined uniquely by the claims that follow the detailed description. Furthermore, the claimed subject matter is not limited to implementations that solve any disadvantages noted above or in any part of this disclosure. BRIEF DESCRIPTION OF DRAWINGS [0015] In order to describe the manner in which the above-recited disclosure and its advantages and features can be obtained, a more particular description of the principles described above will be rendered by reference to specific examples illustrated in the appended drawings. These drawings depict only example aspects of the disclosure, and are therefore not to be considered as limiting of its scope. These principles are described and explained with additional specificity and detail through the use of the following drawings: [0016] FIG. 1 is a perspective view of an example bipedal robot with an example set of leg assemblies, according to an embodiment of the disclosure; [0017] FIG. 2A is a close-up perspective view of one of the leg assemblies of the example bipedal robot in FIG. 1, according to an embodiment of the disclosure; [0018] FIG. 2B is an opposite side close-up perspective view of one of the leg assemblies of the example bipedal robot in FIG. 1, according to an embodiment of the disclosure; [0019] FIG. 2C is an exploded perspective view of the components of one of the leg assemblies of the example bipedal robot in FIG. 1, according to an embodiment of the disclosure; [0020] FIG. 3A is a perspective view of one of the arms of the example robot in FIG. 1, according to an embodiment of the disclosure; [0021] FIG. 3B is an exploded perspective view of the components of one of the arms of the example robot in FIG. 1, according to an embodiment of the disclosure; [0022] FIG.
4 shows a block diagram of an example MPC based robotic control architecture system for the example robot in FIG. 1, according to an embodiment of the disclosure; [0023] FIG. 5A shows the different degrees of freedom of the limbs on the example robot in FIG. 1, according to an embodiment of the disclosure; [0024] FIG. 5B shows a sequence of movement of the leg assembly of the robot in FIG. 1, according to an embodiment of the disclosure; [0025] FIG. 5C shows a sequence of striding of the leg assembly of the robot in FIG. 1, according to an embodiment of the disclosure; [0026] FIG. 6A shows a set of stance diagrams for developing a force and moment based model of the example robot in FIG. 1, according to an embodiment of the disclosure; [0027] FIG. 6B shows a set of stance diagrams that allows developing an external force model of the example robot in FIG. 1 with a payload, according to an embodiment of the disclosure; [0028] FIG. 7A shows a sequence of images of the example robot in FIG. 1 lifting a dynamic load in a walking sequence, according to an embodiment of the disclosure; [0029] FIG. 7B is a graph showing torque of various joints of the example robot in FIG. 1 in the lift stances in FIG. 7A, according to an embodiment of the disclosure; [0030] FIG. 8A is a series of images showing the example robot in FIG. 1 carrying a load while walking over wood slats, according to an embodiment of the disclosure; [0031] FIG. 8B is a series of images showing the example robot in FIG. 1 walking with each arm carrying a load, according to an embodiment of the disclosure; [0032] FIG. 8C is an image of the example robot in FIG. 1 performing a dynamic turn while holding a load, according to an embodiment of the disclosure; [0033] FIG. 9A is a series of images showing an example test robot traversing uneven terrain, according to an embodiment of the disclosure; [0034] FIG.
9B is a series of images showing the example test robot traversing uneven random terrain, according to an embodiment of the disclosure; [0035] FIG. 9C is an image showing the example robot traversing uneven terrain, according to an embodiment of the disclosure; [0036] FIG. 10A is an image of the example robot performing dynamic turning in place, according to an embodiment of the disclosure; [0037] FIG. 10B is a graph showing yaw rate tracking of the example robot performing dynamic turning in place, according to an embodiment of the disclosure; [0038] FIG. 11A is an image of the example robot standing while carrying a load, according to an embodiment of the disclosure; [0039] FIG. 11B is a box plot of the solve time of the example MPC based controller for different scenarios for the example robot standing while carrying a load, according to an embodiment of the disclosure; [0040] FIG. 12A is a graph of velocity tracking and height of the example robot while walking under the example MPC based control, according to an embodiment of the disclosure; [0041] FIG. 12B is a graph showing torque of various joints of the example robot in FIG. 1 while walking under the example MPC based control, according to an embodiment of the disclosure; [0042] FIG. 13A is a set of graphs showing foot force and moments for the example robot while walking under the example MPC based control, according to an embodiment of the disclosure; [0043] FIG. 13B is a set of graphs showing leg torques of the example robot while walking under the example MPC based control, according to an embodiment of the disclosure; [0044] FIG. 14 shows snapshots of a double-leg balancing experiment of the example robot, according to an embodiment of the disclosure; [0045] FIG. 15 shows snapshots of a stepping in place experiment of the example robot, according to an embodiment of the disclosure; and [0046] FIG.
16 shows snapshots of an experiment of the example robot traversing uneven terrain, according to an embodiment of the disclosure. DETAILED DESCRIPTION [0047] Unless defined otherwise, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. One skilled in the art will recognize many methods and materials similar or equivalent to those described herein, which could be used in the practice of the present invention. Indeed, the present invention is in no way limited to the methods and materials specifically described. [0048] In some embodiments, properties such as dimensions, shapes, relative positions, and so forth, used to describe and claim certain embodiments of the invention are to be understood as being modified by the term “about.” [0049] Various examples of the invention will now be described. The following description provides specific details for a thorough understanding and enabling description of these examples. One skilled in the relevant art will understand, however, that the invention may be practiced without many of these details. Likewise, one skilled in the relevant art will also understand that the invention can include many other obvious features not described in detail herein. Additionally, some well-known structures or functions may not be shown or described in detail below, so as to avoid unnecessarily obscuring the relevant description. [0050] The terminology used below is to be interpreted in its broadest reasonable manner, even though it is being used in conjunction with a detailed description of certain specific examples of the invention. Indeed, certain terms may even be emphasized below; however, any terminology intended to be interpreted in any restricted manner will be overtly and specifically defined as such in this Detailed Description section.
[0051] While this specification contains many specific implementation details, these should not be construed as limitations on the scope of any inventions or of what may be claimed, but rather as descriptions of features specific to particular implementations of particular inventions. Certain features that are described in this specification in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination. Moreover, although features may be described above as acting in certain combinations and even initially claimed as such, one or more features from a claimed combination can in some cases be excised from the combination, and the claimed combination may be directed to a subcombination or variation of a subcombination. [0052] Similarly, while operations may be depicted in the drawings in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Moreover, the separation of various system components in the implementations described above should not be understood as requiring such separation in all implementations, and it should be understood that the described program components and systems can generally be integrated together in a single software product or packaged into multiple software products. [0053] The present disclosure relates to an example bipedal robot allowing for dynamic and agile locomotion behaviors.
The example bipedal robot is a small-scale and power-dense unit that has five degree-of-freedom legs and corresponding actuators. The integration of advanced sensors and a vision camera enables precise state estimation for stable locomotion ability. Additionally, a robust force-and-moment-based Model Predictive Control (MPC) system is employed on the example robot and allows for real-time high-level planning and control, facilitating various dynamic gaits. Current results show the capabilities of the example bipedal robot in traversing different terrain dynamically, robustness to disturbances, and outstanding balancing performance. The example bipedal robot may be used for various applications, including hazardous environments, emergency responses, and industrial tasks, while offering a cost-effective alternative for legged locomotion research. [0054] Each leg assembly of the example robot has five degrees of freedom. These degrees of freedom include actuated abduction/adduction (ab/ad), hip, thigh, knee, and ankle joints. The leg design is composed of both aluminum and carbon steel parts for the balance between overall weight and transmission rigidity. To minimize the leg mass and concentrate body mass to the hip area, all the actuators are strategically positioned towards the upper thigh and hip regions. To actuate the knees and ankles, bar linkages and belt-pulley transmissions are integrated into the leg assembly. The overall leg design is modular and the robot leg may be replaced with a wheel-leg or other types of legs with slight design modifications. The example leg assembly allows the example bipedal robot to have very power-dense legs while having reasonably low weight. The example leg assembly facilitates the use of an example MPC based control system. [0055] An example of the bipedal robot is 0.6 m in standing height and weighs 14 kg.
The example bipedal robot is modified from components from a Unitree A1 quadruped robot with A1 torque-actuated joint motors. Different manufacturing techniques such as CNC machining, laser cutting, or 3D printing may be used as well as different materials such as Aluminum 6061, Aluminum 7075, Abs plastics, carbon fiber infused PETG, and Nylon plastics, and the like to maximize the rigidity of the robot while simultaneously minimizing its weight. This allows the example robot to perform very dynamic maneuvers. Potential applications include fast running, hopping, and jumping motions that are not feasible with known humanoid robots. [0056] The example robot is designed around the use of laser cutting and commercially available parts to decrease production costs, with an emphasis on maintaining ease of servicing. The cost of producing such a bipedal hardware is very comparable to the production cost of a small quadruped robot, which is very affordable towards either individual hobbyists or research groups. [0057] The example robot can be equipped with different visual sensors including a camera such as an Intel T265 camera, enabling precise state estimation and vision data. The example robot is also capable of the integration of various sensors and devices for different application purposes. [0058] FIG. 1 shows a back perspective view of an example two leg (bipedal) robot 100. The two leg robot 100 includes a torso that is a trunk enclosure 110, and two five degrees of freedom (DoF) leg assemblies 120 and 122, and two arm assemblies 124 and 126. The top of the trunk enclosure 110 includes a sensor module 128. The leg assemblies 120 and 122 are attached to the bottom end of the trunk enclosure 110. The arm assemblies 124 and 126 are attached to the top of the trunk enclosure 110. The trunk enclosure 110 encloses components such as a power supply, a control system, a transceiver, payload and sensor support components. 
In this example, the sensor module 128 includes sensors such as a camera to provide vision data and thus enable precise state estimation of the robot 100. [0059] FIG. 2A shows a close-up side perspective view of the leg assembly 120. The other leg assembly 122 is identical to the leg assembly 120. FIG. 2B shows a close-up reverse side perspective view of the leg assembly 120. FIG. 2C shows an exploded view of the components of the leg assembly 120. Each of the leg assemblies 120 and 122 includes a hip joint 210, a hip connecting link 212, a thigh link assembly 214, a knee joint 216, a calf link assembly 218, and an ankle joint 220, with a foot structure 222. The leg components are powered by different actuators, which may be identical electric motors. In this example, five actuators provide five degrees of freedom. A set of the actuators are positioned near the hip joint 210 to concentrate mass of the leg assembly 120 proximate the hip joint 210. The actuators are controlled by a robot control system that may be housed in the trunk enclosure 110. Each of the leg assemblies such as the leg assembly 120 includes an abduction/adduction (ab/ad) actuator 230, a hip actuator 232, a thigh actuator 234, an ankle actuator 236, and a knee actuator 238. In total, the example bipedal robot 100 has ten actuators for the two leg assemblies 120 and 122. [0060] The ab/ad actuator 230 is mounted on the trunk enclosure 110 of the example robot 100 in FIG. 1. The output shaft of the ab/ad actuator 230 forms the hip joint 210. The output shaft of the ab/ad actuator 230 is opposite to the end of the actuator 230 attached to the bottom of the trunk enclosure 110. The output shaft of the ab/ad actuator 230 is connected to the hip connecting link 212 to rotate the hip connecting link 212. The hip connecting link 212 includes an ab/ad plate 240 that is perpendicular to a lateral plate 242.
The output shaft of the ab/ad actuator 230 engages a hole 244 in the ab/ad plate 240 to allow the hip connecting link 212 to be rotated by the ab/ad actuator 230. [0061] The hip actuator 232 is mounted on the lateral plate 242 of the hip connecting link 212. The output shaft of the hip actuator 232 extends through a hole 246 in the lateral plate 242 and is connected to a thigh connecting plate 248 of the thigh assembly 214. The thigh actuator 234 is mounted on a support ring 250 connected perpendicularly to the thigh connecting plate 248. The output shaft of the thigh actuator 234 is connected to an outer thigh frame 252 of the thigh assembly 214. The outer frame 252 is joined to a parallel inner thigh frame 254 by a series of spacing pins 256. The lower ends of the thigh frames 252 and 254 hold a pin that forms the knee joint 216. In this example, the thigh frames 252 and 254 are fabricated from laser cut aluminum and joined by standoffs. The calf link assembly 218 and foot structure 222 are each assembled using a similar method to the thigh structure, using laser cut plates joined by standoffs. [0062] The thigh assembly 214 can be rotated independently of the thigh connecting plate 248 by the hip actuator 232. The ankle actuator 236 and the knee actuator 238 are mounted on the exterior surface of the inner thigh frame 254. The inner thigh frame 254 has two through holes 258 and 260 to allow the respective output shafts of the ankle actuator 236 and knee actuator 238 to extend through to the interior surface of the inner thigh frame 254. [0063] Since both the ankle and knee actuators 236 and 238 are positioned away from the joints that they power, two transmission mechanisms are implemented to transmit power from the actuators 236 and 238 to the respective ankle joint 220 and knee joint 216.
The first transmission mechanism for the knee actuator 238 is a timing belt 262 and a knee joint pulley 264. The pulley 264 is mounted on the pin between the distal ends of the thigh frames 252 and 254 that forms the knee joint 216. For this transmission mechanism, a knee actuator pulley 266 is mounted on the output shaft of the knee actuator 238. The knee joint pulley 264 is rigidly connected to the top side of the calf link assembly 218. The timing belt 262 is looped between the two pulleys 266 and 264 to allow power transmission from the knee actuator 238 to the calf link assembly 218 to rotate around the knee joint 216. To achieve proper tension on the timing belt 262, a tensioner 268 is mounted on a pin held between the thigh frames 252 and 254 and pushes the timing belt 262 inward, providing the necessary tension.

[0064] The calf link assembly 218 includes an interior calf frame structure 270 and an exterior calf frame structure 272. One end of each of the frame structures 270 and 272 is attached to the knee joint pulley 264 and thus rotates with the knee joint pulley 264. The opposite ends of the frame structures 270 and 272 hold a pin that forms the ankle joint 220.

[0065] A second transmission mechanism is an ankle linkage assembly 280. The ankle linkage assembly 280 includes an ankle link connector 282 that is coupled to the output shaft of the ankle actuator 236. Thus, the ankle actuator 236 rotates the ankle link connector 282. One end of the ankle link connector 282 is rotatably connected to one end of an ankle driving link 284. The other end of the ankle driving link 284 is rotatably coupled to one vertex of a coupler link 286. The coupler link 286 includes two triangular plates 288 and 290 that are rotatably mounted on the knee joint 216 at another vertex to allow rotation around the knee joint 216. A third vertex of the triangular plates 288 and 290 is rotatably coupled to one end of an ankle joint follower link 292.
The opposite end of the ankle joint follower link 292 is rotatably coupled to the foot structure 222. The ankle actuator 236 allows the foot structure 222 to rotate around the ankle joint 220 via the ankle linkage assembly 280. Thus, the ankle linkage assembly 280 transmits the output of the ankle actuator 236 unchanged to the ankle joint 220, forming a two-stage four-bar-linkage transmission system.

[0066] The example hybrid transmission mechanism with the timing belt 262 and pulley 264 assembly, and the ankle linkage assembly 280, harnesses the strengths of both linkage systems and timing belts, achieving transmission efficiency, gear reduction, and compactness. Specifically, the design integrates a two-stage linkage system forming two parallelograms, transmitting power from the ankle actuator 236 to the ankle joint 220 at a gear ratio of 1:1. According to simulation results, an increased knee torque is required to ensure dynamic capabilities. The timing belt and pulley system both transmits and amplifies torque from the knee actuator 238 to the knee joint 216 while meeting the joint speed requirements necessary for dynamic motions.

[0067] The MPC based control system explained below in FIG. 4 allows the robot 100 to traverse uneven terrain with large obstacles. The control system is designed for control of limbs each having an upper assembly and a lower assembly, where the actuators for the limb are attached to the upper assembly to concentrate the mass of the limb proximate the joint attaching the upper assembly to the body of the robot. Thus, one limb may be one of the leg assemblies 120 and 122, where the upper assembly is the thigh link assembly 214 that is rotatably coupled to the hip joint 210. One end of the calf link assembly 218, which is the lower assembly, may be rotated on the knee joint 216.
An opposite end of the calf link assembly 218 supports the ankle joint 220 and the attached foot structure 222. The thigh link assembly 214 has an end that is attached to the thigh connecting plate 248 that is rotated by the hip actuator 232. The ankle actuator 236 and thigh actuator 234 are on opposite sides of the end of the thigh link assembly 214.

[0068] The ab/ad actuator 230 rotates a plate holding the hip actuator 232 and one end of the thigh link assembly 214. The hip actuator 232 rotates the thigh link assembly 214 relative to the plate on a first axis. The knee actuator 238 is located on the thigh link assembly 214 next to the thigh actuator 234. The thigh actuator 234 rotates the thigh link assembly 214 relative to the plate along a second axis. The knee actuator 238 rotates the calf link assembly 218 around the knee joint 216 via the pulley and timing belt. The ankle actuator 236 rotates the ankle joint 220 via the linkage assembly 280 that runs along the calf link assembly 218.

[0069] The power transmission system of the leg assemblies 120 and 122 of the example robot 100 has the actuators of each leg assembly 120 and 122 located close to the trunk enclosure 110 to minimize inertia and concentrate mass in the hip area, consistent with the rigid body dynamics assumption in the example model predictive control (MPC) based controller. The power transmission system also includes the timing belts and pulleys utilized for power transmission. This leg assembly design provides the robot 100 with a stable and efficient power source for its movements.

[0070] Overall, the example bipedal robot 100 is designed to be a versatile and adaptable machine, capable of performing a wide variety of tasks in various environments, while maintaining stability, maneuverability, and energy efficiency.
Laser cutting the parts of the leg assemblies of the robot 100 results in a generally lightweight robot, which reduces unwanted dynamic effects of the legs in the system and lowers motor torque requirements during balancing control and navigating high obstacles.

[0071] There are several benefits of the leg assembly design in FIGs. 2A-2C. First, the actuators are positioned close to the torso of the example robot to minimize leg weight and inertia and improve dynamic capabilities. Second, the timing belt based transmission system for the calf assembly enables gear reduction and improves joint torque at the knee joint, which is the most torque demanding joint. Third, since ankle joint torque requirements are low, a linkage is more space efficient than a timing belt while still being able to transmit power. Finally, the hybrid transmission setup ensures sufficient torque for each joint for highly dynamic motions while keeping the leg size and weight small. The gear ratio of the example knee transmission is 1.54, amplifying the torque at the knee joint 216 if the robot does not have the optional arm assemblies 124 and 126.

[0072] Another alternative is replacing the laser cut thigh, calf, and foot components with computer numerical control (CNC) parts, increasing the structural rigidity of the whole leg assembly. A higher gear ratio of 2 may be used to accommodate the added weight of the two arm assemblies 124 and 126.

[0073] FIG. 3A shows a perspective view of the arm assembly 124 in FIG. 1. FIG. 3B shows an exploded view of the arm assembly 124. The arm assembly 126 in FIG. 1 is identical to the arm assembly 124. The example arm assembly 124 has four degrees of freedom. The arm assembly 124 uses the same design philosophy as the leg assembly 120. Thus, all of the actuators of the arm assembly 124 are mounted close to the torso of the robot 100 to minimize the weight on the arm links.
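For illustration, the torque and speed scaling through a belt-and-pulley reduction such as the knee transmission described above can be sketched numerically. Only the 1.54 knee ratio and the 1:1 ankle ratio come from the description; the motor torque value below is a hypothetical placeholder, not a specification of the robot 100:

```python
# Torque and speed scaling through a belt-and-pulley reduction.
# KNEE_RATIO (1.54) and ANKLE_RATIO (1:1) come from the description;
# MOTOR_TORQUE_NM is an assumed placeholder value for illustration.

def joint_torque(motor_torque, gear_ratio):
    """Output torque at the joint, ignoring belt losses."""
    return motor_torque * gear_ratio

def joint_speed(motor_speed, gear_ratio):
    """Joint speed drops by the same ratio that torque rises."""
    return motor_speed / gear_ratio

MOTOR_TORQUE_NM = 33.5   # hypothetical actuator rating
KNEE_RATIO = 1.54        # belt transmission ratio from the description
ANKLE_RATIO = 1.0        # two-stage parallelogram linkage is 1:1

print(round(joint_torque(MOTOR_TORQUE_NM, KNEE_RATIO), 2))   # 51.59
print(round(joint_torque(MOTOR_TORQUE_NM, ANKLE_RATIO), 2))  # 33.5
```

This shows the design trade the paragraph describes: the belt stage amplifies knee torque at the cost of joint speed, while the ankle linkage passes the actuator output through unchanged.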
The example controller may also control limbs such as the arm assemblies 124 and 126 having an upper assembly and a lower assembly, where the actuators are positioned on the upper assembly to concentrate the mass of the arm assembly near the torso of the robot. In this example, the arm assembly 124 constitutes another limb that includes a shoulder joint 310, an upper arm assembly 312 (upper assembly), an elbow joint 314, and a forearm link assembly 316 (lower assembly). The arm assembly 124 includes a shoulder yaw actuator 320, a shoulder pitch actuator 322, a shoulder roll actuator 324, and an elbow actuator 326.

[0074] As shown in FIG. 1, the shoulder yaw actuator 320 is attached to the top of the trunk enclosure 110 of the robot. As shown in FIGs. 3A-3B, the output shaft of the shoulder yaw actuator 320 is connected to a horizontal shoulder plate 330 that includes a mounting cradle 332. The shoulder pitch actuator 322 is mounted on the mounting cradle 332 of the horizontal shoulder plate 330. The output shaft of the shoulder pitch actuator 322 is then connected to a second shoulder plate 334. The shoulder plate 334 includes a mounting cradle 336. The shoulder roll actuator 324 is mounted on the mounting cradle 336. The whole upper arm assembly 312 is then mounted on the output shaft of the shoulder roll actuator 324.

[0075] The upper arm assembly 312 includes inner and outer arm plates 340 and 342. Inside of the upper arm assembly 312, there are mounting points for the elbow actuator 326 to be rigidly connected to the upper arm assembly 312. The forearm link assembly 316 includes an inner support 344 and an outer support 346. One end of each of the supports 344 and 346 is rotatably coupled to the elbow joint 314. The opposite ends of the supports 344 and 346 are attached to an end effector 348.
[0076] An elbow actuator pulley 350 is connected to the output shaft of the elbow actuator 326. The elbow actuator pulley 350 is mounted on the upper section of the forearm link assembly 316. The elbow actuator pulley 350 drives one end of an elbow timing belt 352. The other end of the elbow timing belt 352 drives an elbow joint pulley 354 that is mounted on the elbow joint 314. The elbow pulley 354 is attached to the forearm link assembly 316. A pair of tensioner pins 356 create tension in the elbow timing belt 352. The elbow pulley belt transmission system of the pulleys 350 and 354 and the timing belt 352 is similar to that of the leg assembly 120 and thus transmits the torque from the elbow actuator 326 to the elbow joint 314 with a gear ratio of 1.45, amplifying the torque.

[0077] In this example, the parts in the leg assemblies and arm assemblies are fabricated from laser-cut Aluminum 7075-T6. This aluminum is particularly attractive because it offers a high strength-to-weight ratio, rapid manufacturing, and low production cost. The combination of using lighter lower-limb materials and arranging actuators close to the torso plays a vital role in minimizing leg weight and inertia. This shifts the dominant dynamics of the system to the upper body, in the form of the torso of the robot such as the trunk enclosure 110. The example leg assembly design enables the use of state-of-the-art control algorithms that exploit simple dynamics to make real-time decisions that achieve stability. Thus, both simple and complex controllers can be implemented on this platform, providing users the flexibility to explore a wide range of controllers.

[0078] The example bipedal robot 100 uses a force-and-moment-based model predictive control (MPC) for real-time high-level planning and control.
The MPC models the dynamics of the example biped robot as rigid body dynamics since most of the mass is concentrated around the hips. This allows linearizing the dynamics model of the example robot and allows the use of a force-based control input in the MPC. The swing leg is controlled by a low-level Cartesian PD (proportional with derivative action) controller, and its capture points are based on heuristic policies.

[0079] In traditional humanoid robot systems, the legs are heavy and bulky throughout the entire assembly because the actuator placements are scattered across the entire leg. Such known design choices, in dynamic motions, induce a significant change of moment of inertia of the entire body as well as sudden shifts of center of mass (CoM) locations due to heavy legs, especially around the low calf area.

[0080] The example biped robot 100 allows the deployment of a “single-rigid-body-model-based MPC (SRBM MPC)” controller. This example MPC controller treats the entire robot as a single-rigid-body model, with only the properties of mass and moment of inertia. On the real example robot, if the entire body moment of inertia and center of mass (CoM) location change within a reasonable range during motions, the robot can be closely approximated as a single-rigid-body model in the dynamics modelling process of the MPC control design.

[0081] The advantage of utilizing SRBM MPC over other types of MPCs is that the entire dynamics model may be linearized, leaving a clean and simple quadratic programming problem that can be solved in real time 20-50 times faster than an MPC with a nonlinear, complex dynamics model. The high-frequency control associated with this reduction in solve time can immensely boost the control performance of the example robot, especially in very dynamic motions.

[0082] With the design of the leg assemblies 120 and 122 concentrating the location of the actuators, the CoM of each leg stays close to the upper thigh and hip area.
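The linearize-then-solve-a-QP structure described in paragraph [0081] can be sketched in a minimal form. The sketch below keeps only the vertical CoM dynamics (a double integrator driven by ground reaction force plus gravity), stacks them over the prediction horizon, and solves the resulting unconstrained condensed QP in closed form. The mass, time step, horizon length, and weights are assumed illustrative values, not parameters of the robot 100, and the real controller additionally handles rotation, moments, and friction-cone constraints:

```python
import numpy as np

# Condensed-QP sketch of an SRBM-style MPC, vertical CoM axis only.
# All numeric values below are illustrative assumptions.
m, g, dt, h = 12.0, -9.81, 0.03, 10        # mass, gravity, step, horizon
A = np.array([[1.0, dt], [0.0, 1.0]])      # state: [CoM height, velocity]
B = np.array([[0.0], [dt / m]])            # input: vertical ground force
d = np.array([0.0, g * dt])                # gravity as a constant drift

# Stack dynamics over the horizon: X = Sx @ x0 + Su @ U + Sd.
Sx = np.vstack([np.linalg.matrix_power(A, k + 1) for k in range(h)])
Su = np.zeros((2 * h, h))
Sd = np.zeros(2 * h)
for k in range(h):
    drift = np.zeros(2)
    for j in range(k + 1):
        Ap = np.linalg.matrix_power(A, k - j)
        Su[2 * k:2 * k + 2, j] = (Ap @ B).ravel()
        drift += Ap @ d
    Sd[2 * k:2 * k + 2] = drift

x0 = np.array([0.55, 0.0])                 # current height and velocity
xref = np.tile([0.60, 0.0], h)             # hold a 0.60 m CoM height
Q = np.diag(np.tile([1e5, 1.0], h))        # weight height tracking heavily
R = 1e-6 * np.eye(h)                       # small effort penalty

# The unconstrained QP min |X - xref|_Q^2 + |U|_R^2 has a closed form.
H = Su.T @ Q @ Su + R
q = Su.T @ Q @ (xref - Sx @ x0 - Sd)
U = np.linalg.solve(H, q)                  # optimal force sequence
X = Sx @ x0 + Su @ U + Sd
print(round(X[-2], 3))                     # terminal height tracks 0.60
```

Because everything stays linear, the problem reduces to one matrix solve per control step, which is the source of the real-time speedup the paragraph attributes to the SRBM formulation.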
Compared to traditional humanoid robot legs, where the CoM lies near the lower thigh and knee area, the example leg assembly design induces much less change in the entire body moment of inertia and CoM when performing fast legged motions. Quantitatively speaking, the calf/lower leg assembly of the example robot 100 accounts for only 5% of the mass of the entire leg assembly, while a human lower leg weighs approximately 30% of the entire leg. This compact design can significantly reduce unwanted inertia change during leg swing. With the example leg assembly 120 in FIG. 1, when applying the SRBM MPC to the example robot 100, the assumed SRBM model in the MPC controller is very close to the actual robot dynamics, minimizing the gap between assumptions and actual robot dynamics. This results in much better robot control performance in dynamic motion from pairing the SRBM MPC with the example robot with the leg assemblies 120 and 122.

[0083] A block diagram of an example robotic system and control architecture 400 that may be used in connection with the implementations described herein of the example robot 100 is shown in FIG. 4. The robotic system and control architecture 400 includes a remote controller 410, a contact schedule 412, a control module 414, and a hardware system 416. In this example, the remote controller 410 may be any suitable controller, such as a Logitech F710 wireless gamepad, to send user commands to the controllers in the control module 414 via a WiFi connection.

[0084] The control module 414 includes a multi-contact MPC controller 420, a Cartesian PD controller 422, a low level controller 424, and an onboard controller 426. The onboard controller 426 executes the user commands in real-time. In this example, the onboard controller 426 is an onboard NUC 13 Pro mini-PC with Intel i5-1340P processors.
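The inertia benefit of the 5% versus 30% lower-leg mass fractions can be illustrated with a crude two-point-mass model of a swinging leg. Only the two mass fractions come from the description; the total leg mass and segment lengths below are assumed for illustration:

```python
# Crude two-point-mass swing-inertia model: the thigh mass sits at
# mid-thigh and the calf/foot mass at mid-calf, measured from the hip.
# Total mass and lengths are assumed; only the 5% vs. ~30% calf-mass
# fractions come from the description.

def leg_inertia_about_hip(total_mass, calf_fraction,
                          thigh_len=0.22, calf_len=0.22):
    m_calf = total_mass * calf_fraction
    m_thigh = total_mass - m_calf
    r_thigh = thigh_len / 2                # thigh CoM below the hip
    r_calf = thigh_len + calf_len / 2      # calf CoM below the hip
    return m_thigh * r_thigh**2 + m_calf * r_calf**2

robot_like = leg_inertia_about_hip(2.0, 0.05)   # 5% calf mass
human_like = leg_inertia_about_hip(2.0, 0.30)   # ~30% calf mass
print(round(human_like / robot_like, 2))        # ~2.4x swing inertia
```

Even in this simplified model, moving mass from the calf to the thigh cuts the swing inertia about the hip by more than half, which is the effect the paragraph credits for keeping the SRBM assumption close to the real dynamics.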
[0085] The hardware system 416 includes a low level control board 430, sensors 432, actuators 434, a power management system 436, and a transceiver 438. In this example, the actuators 434 represent the actuators in the arm and leg assemblies in FIG. 1. An Ethernet bus 428 allows communication between the low level control board 430 and the control module 414. The low level control board 430 is coupled to the sensors 432 via a serial line 440. The low level control board 430 is coupled to the actuators 434 via an RS485 bus and a controller area network (CAN) bus 442.

[0086] The computed torque commands generated by the controllers 420 and 422 are subsequently sent to the custom low level control board 430 via Ethernet communication over the Ethernet bus 428. Information on the motor states and the robot states is retrieved from motor feedback and state estimation. In detail, the robot state command xcmd from the remote controller 410 includes the desired velocities ẋcmd and ẏcmd in the transverse plane and the desired body yaw rate ψcmd, which are then mapped to a reference trajectory xref ∈ ℝ12×h with h prediction horizons at each MPC time step for the onboard controller 426 to track. The reference trajectory at each horizon consists of the robot CoM position, pc ∈ ℝ3, Euler angles Θ = [ϕ, θ, ψ], CoM velocity, ṗc ∈ ℝ3, and angular velocity, ω ∈ ℝ3.

[0087] The contact schedule 412 is leveraged to synchronize the multi-contact events within the MPC, including foot contact and object contact. In this example, the contact schedule 412 is directed toward transporting a load and includes three phases over the MPC horizon time: a standing time period 450, a standing with load period 452, and a stepping with load period 454. A first set of bars 460 represents the contacts of the left foot over the periods 450, 452, and 454.
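The mapping from the user commands to the reference trajectory xref ∈ ℝ12×h described in paragraph [0086] can be sketched by integrating the commanded planar velocities and yaw rate over the horizon. The 12-entry state ordering, time step, and command values below are illustrative assumptions:

```python
import numpy as np

# Sketch: expand remote-controller commands into a reference trajectory
# xref of shape (12, h). State ordering [pc(3), Euler(3), vel(3), omega(3)],
# dt, and the command values are assumptions for illustration.

def build_reference(pc, theta, vx_cmd, vy_cmd, yaw_rate_cmd, dt, h):
    """Integrate commanded body-frame velocities and yaw rate over the
    MPC horizon; each column is one predicted reference state."""
    xref = np.zeros((12, h))
    pos, yaw = np.array(pc, float), float(theta[2])
    for k in range(h):
        # Rotate the commanded body-frame velocity into the world frame.
        vel = np.array([vx_cmd * np.cos(yaw) - vy_cmd * np.sin(yaw),
                        vx_cmd * np.sin(yaw) + vy_cmd * np.cos(yaw),
                        0.0])
        pos = pos + vel * dt
        yaw = yaw + yaw_rate_cmd * dt
        xref[:, k] = np.concatenate(
            [pos, [theta[0], theta[1], yaw], vel, [0.0, 0.0, yaw_rate_cmd]])
    return xref

xref = build_reference(pc=[0.0, 0.0, 0.55], theta=[0.0, 0.0, 0.0],
                       vx_cmd=0.5, vy_cmd=0.0, yaw_rate_cmd=0.0,
                       dt=0.03, h=10)
print(xref.shape)             # (12, 10)
print(round(xref[0, -1], 3))  # 0.15 m of commanded forward progress
```

Each MPC step would rebuild this reference from the latest joystick command and the current state estimate before solving for the ground reaction forces.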
A second set of bars 462 represents the contacts of the right foot over the periods 450, 452, and 454. A third bar 464 represents contact with the load object over the periods 450, 452, and 454. In this example, the contacts of the left foot and the right foot are periodic, reflecting the movement of the legs, while the contact with the load object is constant as the load object is carried during the movement. A prediction window 470 is provided between the periods 450 and 452.

[0088] With this information, the MPC can foresee future contact changes and react accordingly. The contact schedule 412 of the bars 460, 462, and 464 can be defined via multiple channels, including user input, vision feedback, and task planning. The contact schedule 412 summarizes time-based periodic contact sequences based on the desired gait period length (typically in the range of 0.3 to 0.7 seconds for the example robot 100) and the loco-manipulation task.

[0089] This gait sequence determines, at any time t, whether the leg is in stance phase (contact) or swing phase (no contact). When leg m is in stance phase, where m = 0, 1, the ground reaction forces and moments for leg m are then optimized by the MPC. When the leg is in swing phase, the Cartesian-space PD controller 422 commands the foot to swing to a desired location for the next step.

[0090] In the context of loco-manipulation tasks, the arm assemblies 124 and 126 are primarily employed for object manipulation and handling. Consequently, they are also controlled using the Cartesian PD controller 422. Stance foot control inputs u = [Fstance; Mstance], swing foot forces Fswing, and arm forces Farm are then mapped to corresponding joint torques τ in the low-level control board 430 by leg and arm Jacobian matrices. Subsequently, the joint torque commands are sent to the motor control boards of the actuators 434 for executing the motion.
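The swing-leg pipeline in paragraphs [0089]-[0090], a Cartesian PD law producing a foot force that the leg Jacobian transpose maps to joint torques, can be sketched as follows. The gains, foot positions, and 3x3 Jacobian entries are illustrative placeholders, not the actual kinematics or tuning of the robot 100:

```python
import numpy as np

# Cartesian PD force on the swing foot, then tau = J^T F.
# Gains, positions, and the Jacobian are placeholder assumptions.

def cartesian_pd_force(p_des, p, v_des, v, kp, kd):
    """Task-space PD law: F = Kp (p_des - p) + Kd (v_des - v)."""
    return kp * (p_des - p) + kd * (v_des - v)

def joint_torques(jacobian, force):
    """Static mapping from foot force to joint torques: tau = J^T F."""
    return jacobian.T @ force

p_des = np.array([0.10, 0.00, -0.45])   # desired swing-foot position (m)
p     = np.array([0.05, 0.01, -0.50])   # measured foot position (m)
v_des = np.zeros(3)
v     = np.array([0.20, 0.00, 0.10])    # measured foot velocity (m/s)

F = cartesian_pd_force(p_des, p, v_des, v, kp=300.0, kd=10.0)

J = np.array([[0.05, 0.00, 0.30],       # placeholder 3x3 foot Jacobian
              [0.00, 0.25, 0.00],
              [0.40, 0.00, 0.05]])
tau = joint_torques(J, F)
print(F)     # PD force pulling the foot toward p_des
print(tau)   # corresponding joint torque command
```

The same transpose mapping serves the stance forces optimized by the MPC and the arm forces, which is why a single low-level torque interface on the control board 430 suffices for all limbs.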
A fused linear state estimator is used that receives readings from an onboard inertial measurement unit (IMU) and an Intel T265 visual-inertial odometry (VIO) sensor. The sensor fusion allows the example robot 100 to obtain accurate and high-frequency robot state feedback.

[0091] Referring to FIG. 4, the robotic control system architecture 400 may be configured to operate autonomously, semi-autonomously, and/or using directions provided by user(s). The robotic system 400 may be implemented in various forms, such as a biped robot like the robot 100, a wheel-legged robot, a quadruped robot, or some other arrangement.

[0092] The example robot 100 may further include mechanical components and/or electrical components. Nonetheless, the robot 100 is shown for illustrative purposes and may include more or fewer components. The various components of the robot 100 may be connected in any manner, including wired or wireless connections. Further, in some examples, components of the robot 100 may be distributed among multiple physical entities rather than a single physical entity. Other example illustrations of the robot 100 may exist as well.

[0093] The controller(s) 426 may operate as one or more general-purpose hardware processors or special purpose hardware processors (e.g., digital signal processors, application specific integrated circuits, etc.). The controller(s) 426 may be configured to execute computer-readable program instructions and manipulate data, both of which are stored in a data storage device. The controller(s) 426 may also directly or indirectly interact with other components of the robotic system 400, such as the sensor(s) 432, power source(s) 436, actuators 434, transceivers 438, mechanical components, and/or electrical components. The transceiver 438 may be used to communicate data or receive command signals with an external device.
[0094] A data storage device on board the robot 100 may be one or more types of hardware memory. For example, the data storage may include or take the form of one or more computer-readable storage media that can be read or accessed by the controller(s) 426. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic, or another type of memory or storage, which can be integrated in whole or in part with the controller(s) 426. In some implementations, the data storage can be a single physical device. In other implementations, the data storage can be implemented using two or more physical devices, which may communicate with one another via wired or wireless communication. As noted previously, the data storage may include the computer-readable program instructions and the data. The data may be any type of data, such as configuration data, sensor data, and/or diagnostic data, among other possibilities.

[0095] The controller 426 may include one or more electrical circuits, units of digital logic, computer chips, and/or microprocessors that are configured to (perhaps among other tasks) interface between any combination of the mechanical components, the sensor(s) 432, the power source(s) 436, the electrical components, the control system 414, and/or a user of the robot 100. In some implementations, the remote controller 410 may be a purpose-built embedded device for performing specific operations with one or more subsystems of the robotic device 100.

[0096] The remote controller 410 may monitor and physically change the operating conditions of the robotic system 400. In doing so, the controller 426 may serve as a link between portions of the robotic system 400, such as between mechanical components and/or electrical components.
In some instances, the controller 426 may serve as an interface between the robotic system 400 and another computing device. Further, the controller 426 may serve as an interface between the robotic system 400 and a user. For instance, the controller 426 may include various components for communicating with the robotic system 400, including a joystick, buttons, and/or ports, etc. The example interfaces and communications noted above may be implemented via a wired or wireless connection, or both. The controller 426 may perform other operations for the robotic system 400 as well.

[0097] During operation, the controller 426 may communicate with other systems of the robot 100 via wired or wireless connections, and may further be configured to communicate with one or more users of the example robot 100. As one possible illustration, the controller 426 may receive an input (e.g., from a user or from another robot) through the transceiver 438 indicating an instruction to perform a particular gait in a particular direction, and at a particular speed. A gait is a pattern of movement of the limbs of an animal, robot, or other mechanical structure.

[0098] Based on this input, the controller 426 may perform operations to cause the example robot 100 to move according to the requested gait. As another illustration, the controller 426 may receive an input indicating an instruction to move to a particular geographical location. In response, the controller 426 (perhaps with the assistance of other components or systems) may determine a direction, speed, and/or gait based on the environment through which the robotic system 400 is moving en route to the geographical location.

[0099] Operations of the control system in the control module 414 may be carried out by the controller(s) 426. Alternatively, these operations may be carried out by the remote controller 410, or a combination of the controller(s) 426 and the remote controller 410.
In some implementations, the control module 414 may partially or wholly reside on a device other than the robotic system 400, and therefore may at least in part control the robotic system 400 remotely.

[00100] Mechanical components represent hardware of the example robot architecture 400 that may enable the robot 100 to perform physical operations. As a few examples, the robotic system 400 may include physical members such as wheeled legs, leg(s), arm(s), and/or wheel(s). The physical members or other parts of the robotic system 400 may further include actuators such as motors arranged to move the physical members in relation to one another. The robotic system 400 may also include one or more structured bodies for housing the control module 414 and/or other components, and may further include other types of mechanical components. The particular mechanical components used in a given robot may vary based on the design of the robot, and may also be based on the operations and/or tasks the robot may be configured to perform.

[00101] In some examples, the mechanical components may include one or more removable components. The robotic system 400 may be configured to add and/or remove such removable components, which may involve assistance from a user and/or another robot. For example, the robotic system 400 may be configured with removable arms, hands, feet, and/or legs, so that these appendages can be replaced or changed as needed or desired. In some implementations, the robotic system 400 may include one or more removable and/or replaceable battery units or sensors. Other types of removable components may be included within some implementations.

[00102] The robotic system 400 may include sensor(s) 432 arranged to sense aspects of the robotic system 400.
The sensor(s) 432 may include one or more force sensors, torque sensors, velocity sensors, acceleration sensors, position sensors, proximity sensors, motion sensors, location sensors, load sensors, temperature sensors, touch sensors, depth sensors, ultrasonic range sensors, infrared sensors, object sensors, and/or cameras, among other possibilities. Within some examples, the robotic system 400 may be configured to receive sensor data from sensors that are physically separated from the robot (e.g., sensors that are positioned on other robots or located within the environment in which the robot is operating).

[00103] The sensor(s) 432 may provide sensor data to the controller(s) 426 (perhaps by way of data storage) to allow for interaction of the robotic system 400 with its environment (e.g., surrounding terrain), as well as monitoring of the operation of the robotic system 400. The sensor data may be used in evaluation of various factors for activation, movement, and deactivation of mechanical components and electrical components by the control module 414. For example, the sensor(s) 432 may capture data corresponding to the terrain of the environment or the location of nearby objects, which may assist with environment recognition and navigation. In an example configuration, the sensor(s) 432 may include RADAR (e.g., for long-range object detection, distance determination, and/or speed determination), LIDAR (e.g., for short-range object detection, distance determination, and/or speed determination), SONAR (e.g., for underwater object detection, distance determination, and/or speed determination), VICON® (e.g., for motion capture), one or more cameras (e.g., stereoscopic cameras for 3D vision), a global positioning system (GPS) transceiver, and/or other sensors for capturing information of the environment in which the robotic system 400 is operating.
The sensor(s) 432 may monitor the environment in real time, and detect obstacles, elements of the terrain, weather conditions, temperature, and/or other aspects of the environment.

[00104] Further, the robotic system 400 may include sensor(s) 432 configured to receive information indicative of the state of the robotic system 400, including sensor(s) 432 that may monitor the state of the various components of the robotic system 400. The sensor(s) 432 may measure activity of systems of the robotic system 400 and receive information based on the operation of the various features of the robotic system 400, such as the operation of extendable legs, arms, or other mechanical and/or electrical features of the robotic system 400. The data provided by the sensor(s) 432 may enable the control module 414 to determine errors in operation as well as monitor overall operation of components of the robotic system 400.

[00105] As an example, the robotic system 400 may use force sensors to measure the load on various components of the robotic system 400. In some implementations, the robotic system 400 may include one or more force sensors on an arm or a leg to measure the load on the actuators that move one or more members of the arm or leg. As another example, the robotic system 400 may use one or more position sensors to sense the position of the actuators of the robotic system and thus the joint angles of wheeled legs. For instance, such position sensors may sense states of extension, retraction, or rotation of the actuators on arms or legs.

[00106] As another example, the sensor(s) 432 may include one or more velocity and/or acceleration sensors. For instance, the sensor(s) 432 may include an inertial measurement unit (IMU). The IMU may sense velocity and acceleration in the world frame, with respect to the gravity vector.
The velocity and acceleration sensed by the IMU may then be translated to that of the robotic system 400 based on the location of the IMU in the robotic system 400 and the kinematics of the robotic system 400.

[00107] The robotic system 400 may include other types of sensors not explicitly discussed herein. Additionally or alternatively, the robotic system may use particular sensors for purposes not enumerated herein.

[00108] The robotic system 400 may also include one or more power source(s) 436 configured to supply power to various components of the robotic system 400. Among other possible power systems, the robotic system 400 may include a hydraulic system, an electrical system, batteries, and/or other types of power systems. As an example illustration, the robotic system 400 may include one or more batteries configured to provide charge to components of the robotic system 400. Some of the mechanical components and/or electrical components may each connect to a different power source, may be powered by the same power source, or may be powered by multiple power sources.

[00109] Any type of power source may be used to power the robotic system 400, such as electrical power or a gasoline engine. Additionally or alternatively, the robotic system 400 may include a hydraulic system configured to provide power to the mechanical components using fluid power. Components of the robotic system 400 may operate based on hydraulic fluid being transmitted throughout the hydraulic system to various hydraulic motors and hydraulic cylinders, for example. The hydraulic system may transfer hydraulic power by way of pressurized hydraulic fluid through tubes, flexible hoses, or other links between components of the robotic system 400. The power source(s) 436 may charge using various types of charging, such as wired connections to an outside power source, wireless charging, combustion, or other examples.
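The translation of an IMU reading to another point on the robot, described at the start of this passage, follows the standard rigid-body velocity-transfer relation v_p = v_imu + ω × r, where r is the offset from the IMU to the point of interest. The offset and readings below are made-up values for illustration:

```python
import numpy as np

# Rigid-body velocity transfer from the IMU location to another point
# on the body (e.g., the CoM). The offset and readings are assumptions.

def velocity_at_point(v_imu, omega, r_offset):
    """v_p = v_imu + omega x r for a rigid body."""
    return v_imu + np.cross(omega, r_offset)

v_imu = np.array([0.50, 0.00, 0.00])    # m/s measured at the IMU
omega = np.array([0.00, 0.00, 1.00])    # rad/s body yaw rate
r     = np.array([0.00, 0.10, 0.00])    # IMU-to-point offset (m)

v_p = velocity_at_point(v_imu, omega, r)
print(v_p)   # velocity at the offset point: [0.4, 0.0, 0.0]
```

This is the kinematic correction the state estimator would apply before fusing the IMU stream with other measurements such as visual odometry.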
[00110] The electrical components may include various mechanisms capable of processing, transferring, and/or providing electrical charge or electric signals. Among possible examples, the electrical components may include electrical wires, circuitry, and/or wireless communication transmitters and receivers to enable operations of the robotic system 400. The electrical components may interwork with the mechanical components to enable the robotic system 400 to perform various operations. The electrical components may be configured to provide power from the power source(s) 436 to the various mechanical components, for example. Further, the robotic system 400 may include electric motors. Other examples of electrical components may exist as well.

[00111] The housing of the trunk enclosure 110 is connected to or houses appendages and components of the example robot 100 such as the arm assemblies 124 and 126 in FIG. 1. As such, the structure of the body may vary within examples and may further depend on particular operations that a given robot may have been designed to perform. For example, a robot developed to carry heavy loads may have a wide body that enables placement of the load. Similarly, a robot designed to reach high speeds may have a narrow, small body that does not have substantial weight. Further, the body and/or the other components may be developed using various types of materials, such as metals or plastics. Within other examples, a robot may have a body with a different structure or made of various types of materials.

[00112] The body and/or the other components may include or carry the sensor(s) 432. These sensors 432 may be positioned in various locations on the example robot 100, such as on the body and/or on one or more of the appendages, among other examples.

[00113] The example robot 100 may carry a load, such as a type of cargo that is to be transported.
The load may also represent external batteries or other types of power sources (e.g., solar panels) that the example robot 100 may utilize. Carrying the load represents one example use for which the example robot 100 may be configured, but the example robot 100 may be configured to perform other operations as well.

[00114] As noted above, the example robot 100 may include various types of legs, arms, wheels, and so on. In some examples, the example robot 100 may be configured with one or more legs. In some examples, an implementation of the example robot 100 with one or more legs may additionally include wheels, treads, or some other form of locomotion. An implementation of the robotic system with two legs may be referred to as a biped, and an implementation with four legs may be referred to as a quadruped. Implementations with six, eight, ten, or more legs are also possible.

[00115] FIG. 5A shows the range of motion for the example robot 100 from different perspective views. FIG. 5A shows a front view of the robot 100 with the arm assembly 124 having a range of 120 degrees via rotating the shoulder roll actuator 324 and the leg assembly 120 having a 60 degree range of motion via rotating the thigh actuator 234. A side view shows the shoulder joint 310 of the arm assembly 124 having a 360 degree range of motion rotated by the shoulder pitch actuator 322. The forearm link assembly 316 has a 300 degree range of motion around the elbow joint 314 as rotated by the elbow actuator 326. The thigh assembly 214 has a 130 degree range of motion rotated by the hip joint actuator 232. The calf link assembly 218 has a 130 degree range of motion around the knee joint 216 when rotated by the knee actuator 238. A top view 520 shows that the arm assembly 124 has a yaw range of 165 degrees rotated by the shoulder yaw actuator 320.

[00116] FIG.
5B shows different positions of the thigh assembly 214 and calf link assembly 218 when rotated by the respective actuators 234 and 238 during a swing operation by the leg assembly 120. FIG. 5C shows different positions of the thigh assembly 214 and calf link assembly 218 when rotated by the respective actuators 234 and 238 during a stepping operation by the leg assembly 120. A first position 510 shows the leg assembly 120 in a stance position. The second position 512 shows the leg assembly 120 in a swing position that swings the calf link assembly 218, and a third position 514 shows the full swing position. A fourth position 516 shows the calf link assembly 218 swinging back into position. A final position 528 shows the leg assembly 120 advancing to a new stance position.

[00117] The whole-body dynamics formulation of the example humanoid robot 100 may be presented in generalized coordinates. The joint-space generalized states q ∈ ℝ22 include the body position pc, the body orientation Θ, and the joint angles qj:

H(q)·q̈ + C(q, q̇) = τ + JT·λ, (1)

where H ∈ ℝ22×22 is the mass-inertia matrix and C ∈ ℝ22 is the joint-space bias force. τ represents the joint-space actuation. λ and J represent the external force applied to the system and the corresponding Jacobian matrix.

[00118] The multi-contact l-SRBM was built with the combination of simplified locomotion dynamics and object dynamics. Using reduced-order models for legged robots stems from the need for computational efficiency in online control and planning tasks, particularly in the context of predictive control. Control of the five degree of freedom leg assemblies, such as the leg assembly 120 in FIG. 1, only requires the 3-D ground reaction forces [FFx; FFy; FFz] and the 2-D ground reaction moments [FMy; FMz], all in the foot frame. In the MPC, to conveniently optimize for the control input in 3-D locomotion, the ground reaction forces and moments in the world frame for leg m are represented as Fm = [Fx; Fy; Fz] and Mm = [Mx; My; Mz].
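Paragraph [00118] maps the five foot-frame wrench components of each 5-DoF leg into the world-frame Fm and Mm used by the MPC. A minimal sketch of that change of frame (the function name and the yaw-only foot rotation are illustrative assumptions, not the disclosed implementation):

```python
import numpy as np

def foot_wrench_to_world(F_foot, M_foot_yz, yaw):
    """Map the 5-D foot-frame wrench ([Fx; Fy; Fz] forces, [My; Mz] moments)
    to a world-frame force F and moment M; the x-direction foot moment is
    zero under the line-foot contact assumption."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])  # foot-to-world rotation (yaw only)
    F_world = R @ F_foot
    M_world = R @ np.array([0.0, M_foot_yz[0], M_foot_yz[1]])
    return F_world, M_world

# A vertical 100 N reaction with a 2 N*m pitch moment, foot yawed 90 degrees:
F, M = foot_wrench_to_world(np.array([0.0, 0.0, 100.0]),
                            np.array([2.0, 0.0]), yaw=np.pi / 2)
```

A general foot orientation would use the full foot rotation matrix in place of the yaw-only rotation shown here.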
[00119] During the SRBM evolution, the SRBM of the example robot 100 for locomotion encompasses the body, shoulders, hips, and upper thighs, which are all treated as a single combined rigid body. This combined rigid body captures the main dynamic effect of the system. The design of the example robot is centered around the assumption of rigid body dynamics, whereby the mass of the remaining arm and leg components constitutes only 15% of the total robot mass. As a result, the inertia effects of these components have been disregarded in the dynamics model.

[00120] FIG. 6A shows a series of stance diagrams of the example robot 100 showing the development of the SRBM evolution for locomotion. A first view 600 shows the position of the legs of the example robot 100. It is assumed that the location of the ground reaction force and moment, a contact point on each foot, is at the projection of the ankle joint location onto the foot along the z-coordinate. A second view 610 shows a robot model assumed to be a single rigid body model neglecting leg dynamics, with the location of a center of mass 612 and arrows 614 representing inputs of ground reaction forces and moments at the contact points of the legs. A third view 620 shows a force and moment model, with arrows 622 representing a detailed definition of the selection of forces and moments of inertia on the legs from the center of mass and the legs. The complete hardware actuation of the contact point, including 3-D linear movements, roll, and yaw, is fully achieved through the application of 5-D forces and moments.

[00121] When applying payload(s) in a loco-manipulation task, the SRBM is further evolved to consider the dynamic effect of the object on the humanoid robot, to form the example l-SRBM. Loco-manipulation and dynamic handling of heavy loads represent significant challenges in humanoid hardware and control.
Typically, the dynamics of objects are not included in the humanoid dynamics model due to the variability in dynamics among different objects. Consequently, it becomes essential to capture the fundamental dynamics of the object to simplify and enable efficient and straightforward online utilization.

[00122] FIG. 6B shows a series of stance diagrams for modification of the SRBM by accounting for a payload held by the arms of the example robot. A first view 650 shows the robot 100 with a payload 640 carried by the arm assemblies 124 and 126. A second view 660 shows a robot model with the payload 640, assumed to be a single rigid body model neglecting leg dynamics, with the location of a center of mass 662 for the robot, a center of mass 664 for the payload, and arrows 666 representing inputs of ground reaction forces and moments at the contact points of the legs. A third view 670 shows a force and moment model, with arrows 672 representing a detailed definition of the selection of forces and moments of inertia on the legs from the center of mass and the legs. Another arrow 674 represents the force from the payload 640.

[00123] In the example approach, the humanoid SRBM was expanded to encompass dynamic object carrying and loco-manipulation by integrating the simplified object dynamics into the simplified dynamics of the example robot 100. Previously, a comparison was made between two different formalisms of load dynamics in the SRBM, namely the Combined Rigid Body Model and the External Force Model. The Combined Rigid Body Model treats the new SRBM as the combined rigid body of the robot and the object and updates the system dynamics with respect to the new combined CoM mc. The External Force Model treats the object dynamics as external gravitational forces/weights applied to the robot SRB CoM.
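Under the External Force Model described above, each payload enters the simplified dynamics only through its mass and its CoM offset from the robot CoM, leaving the robot's own rigid-body parameters untouched. A sketch of the resulting net accelerations (the function name, masses, and diagonal inertia are illustrative assumptions):

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity vector [0; 0; -g]

def l_srbm_accelerations(m_r, I_world, contact_forces, contact_moments,
                         r_contacts, payload_masses, r_payloads):
    """l-SRBM under the External Force Model: each payload contributes only
    its weight m_n * G to the force balance and r_n x (m_n * G) to the
    moment balance; the robot's rigid-body mass and inertia are kept
    unchanged when the load appears."""
    F_net = sum(contact_forces) + m_r * G + sum(m * G for m in payload_masses)
    tau_net = sum(np.cross(r, F) + M
                  for r, F, M in zip(r_contacts, contact_forces, contact_moments))
    tau_net = tau_net + sum(np.cross(r, m * G)
                            for m, r in zip(payload_masses, r_payloads))
    pdd_c = F_net / m_r                            # linear dynamics at the robot CoM
    omega_dot = np.linalg.solve(I_world, tau_net)  # small-angular-velocity approximation
    return pdd_c, omega_dot

# A 16 kg robot holding a 2 kg payload directly above its CoM, supported by a
# single vertical reaction force directly below the CoM: everything balances.
pdd, wd = l_srbm_accelerations(
    m_r=16.0, I_world=np.diag([0.5, 0.4, 0.3]),
    contact_forces=[np.array([0.0, 0.0, 18.0 * 9.81])],
    contact_moments=[np.zeros(3)],
    r_contacts=[np.array([0.0, 0.0, -0.55])],
    payload_masses=[2.0], r_payloads=[np.array([0.0, 0.0, 0.30])])
```

Because only the payload weight and lever arm appear, no rigid-body parameter jumps when a load is picked up, which is the stability advantage discussed above.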
[00124] Upon examination, it was observed that the Combined Rigid Body Model exhibited significant state changes when transitioning from a no-load phase to a loaded phase. When an object is loaded on the robot, the dynamics model undergoes significant changes in parameters such as rigid body mass, inertia, CoM position, and Euler angles. This introduces instability that is particularly pronounced when dealing with heavy loads.

[00125] An External Force Model, shown in FIGs. 6A-6B, is a formalism that relies only on the knowledge of the mass of the object and the CoM of the object relative to the CoM of the robot. This example approach aims to address the challenges associated with these state changes and enhance the stability of the robot 100 with heavy loads.

[00126] Consequently, the proposed l-SRBM with n external payloads can be represented as

mr·p̈c = Σm Fm + mr·g + Σn mn·g, (2)

d/dt(GI·ω) = Σm (rm × Fm + Mm) + Σn rn × mn·g, (3)

where the robot rigid body mass is denoted as mr, the nth payload mass is denoted as mn, and the gravity vector g is [0; 0; −g]. Equation (3) represents the moment equilibrium of the rigid body dynamics. GI is the rigid body moment of inertia (MoI) in the world frame, which is obtained from the body frame MoI BI and the body rotation matrix R as GI = R·BI·RT. rn represents the distance vector from the robot CoM pc to the nth payload CoM, and rm represents the distance vector from the robot CoM to the mth contact point pf,m.

[00127] The example Multi-contact Force-and-moment-based MPC control framework is developed by the selection of MPC states, assumptions to linearize the state-space dynamics, the MPC formalism, and the low-level control associated with the MPC. The example control allows the MPC to serve as the primary control scheme, fully harnessing the capabilities of the proposed l-SRBM and contact schedules for dynamic loco-manipulation.

[00128] First, the state-space dynamics linearization is detailed below. The robot state is defined as x = [pc; Θ; ṗc; ω], and the derivatives of the Euler angles Θ = [φ; θ; ψ] and the body angular velocity ω have the following relationship,

ωx = −θ̇·sin ψ + φ̇·cos θ·cos ψ,
ωy = θ̇·cos ψ + φ̇·cos θ·sin ψ,
ωz = ψ̇ − φ̇·sin θ. (4)

Equivalently,

ω = [cos θ·cos ψ, −sin ψ, 0; cos θ·sin ψ, cos ψ, 0; −sin θ, 0, 1]·[φ̇; θ̇; ψ̇] = SR·Θ̇. (5)

The left-hand side of equation (3) may be approximated by the following relation, with the assumption of small angular velocity during locomotion,

d/dt(GI·ω) = GI·ω̇ + ω × (GI·ω) ≈ GI·ω̇. (6)

With the above assumptions, the model in FIGs. 6A-6B can be written in the following state-space form,

d/dt[pc; Θ; ṗc; ω] = [03, 03, I3, 03; 03, 03, 03, SR−1; 03, 03, 03, 03; 03, 03, 03, 03]·[pc; Θ; ṗc; ω] + [03×3, 03×3, 03×3, 03×3; 03×3, 03×3, 03×3, 03×3; σ1·I3/mr, 03×3, σ2·I3/mr, 03×3; σ1·GI−1·[r1]×, σ1·GI−1, σ2·GI−1·[r2]×, σ2·GI−1]·u + Cc, (7)

with u = [F1; M1; F2; M2] and Cc = [0; 0; g + Σn σl·mn·g/mr; GI−1·Σn σl·(rn × mn·g)], where I3 denotes the 3×3 identity matrix, [r]× denotes the skew-symmetric cross-product matrix, and the matrix SR relating the Euler angle rates to the body angular velocity is not invertible at θ = ±90°. The example robot does not reach such a configuration with this MPC. σ1, σ2, and σl are binary values that represent the time-based foot contact and load schedules (1 for contact, 0 for non-contact). The incorporation of contact schedules allows the MPC to possess information about changes in contact modes within the prediction horizons, illustrated by the contact schedule block 412 in FIG. 4. This, in turn, facilitates improved optimization of the current control inputs, particularly with regard to potential under-actuation scenarios during contact mode changes.

[00129] With the nonlinearity Cc presented in equation (7), the robot state x was modified and a new MPC state variable selection x̄ was introduced to include the constant 1, x̄ = [pc; Θ; ṗc; ω; 1]. Consequently, the state-space dynamics equation was modified from equation (7), with the continuous matrices Ac and Bc rewritten as

d/dt x̄ = Āc(SR, r1, r2, rn)·x̄ + B̄c(r1, r2)·u, (8)

Āc = [Ac, Cc; 01×13], B̄c = [Bc; 01×12]. (9)

To utilize this linearized model, equation (8) was transformed into a discrete-time representation at the kth time step with MPC step dt:

x̄[k + 1] = Ak·x̄[k] + Bk·u[k], (10)

Ak = I13 + Āc·dt, Bk = B̄c·dt. (11)

[00130] Next, the MPC formulation was conducted by establishing the formal optimal control problem definition for the MPC that employs the linearized SRBM. The convex MPC problem definition with a finite horizon h is as follows:

min(x̄, u) Σk=0…h−1 ( ‖x̄[k + 1] − x̄d[k + 1]‖²Qk + ‖u[k]‖²Rk ), (12)

where the first term is denoted J1 and the second term is denoted J2. J1 represents the cost of driving the states to the desired trajectory based on the user commands. J2 represents the cost of minimizing the control input to achieve better energy efficiency for locomotion. Both of the costs are weighted by the diagonal matrices Qk and Rk at the kth time step.

[00131] The optimal control problem is subjected to several constraints. Equation (13a) represents the discrete dynamics constraints derived from equations (7)-(11). Equation (13b) describes the contact point friction constraint, which follows an inscribed friction pyramid approximation, µ′ = (√2/2)·µ, for more conservative lateral foot forces, where µ is the contact friction coefficient of the ground.

[00132] Equations (13d)-(13f) represent the line-foot dynamics constraints, where Rf,m denotes the foot rotation matrix w.r.t. the world frame. Given the inherent challenges associated with modeling flat humanoid foot contact for humanoid robots, the representation of the foot contact on the example robot 100 is simplified to a line foot contact, which is also a common assumption on humanoids with 5-DoF legs. The Contact Wrench Cone (CWC) was modified to enforce: 1) the x-direction ground reaction moment is zero, shown in equation (13d); 2) prevention of toe/heel lift due to the y-direction ground reaction moment, shown in equation (13e); and 3) friction constraints for y-direction reaction forces at the line-foot heel and toe locations, shown in equation (13f).

[00133] The MPC-QP problem was then condensed for real-time computation. The extended prediction horizon embedded in the proposed MPC results in large matrix sizes that demand significant computational resources for online optimal control problems; nevertheless, the convex MPC formulation was capable of being optimized to ensure efficient solutions as a Quadratic Programming (QP) optimization.
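The augmented-state trick and Euler discretization of equations (8)-(11) can be sketched as follows (the matrix layout is an illustrative reconstruction consistent with the 13-state, 12-input dimensions given above; payload terms are omitted for brevity):

```python
import numpy as np

def s_r(theta, psi):
    """Euler-rate map S_R from equation (5): omega = S_R @ Theta_dot."""
    return np.array([
        [np.cos(theta) * np.cos(psi), -np.sin(psi), 0.0],
        [np.cos(theta) * np.sin(psi),  np.cos(psi), 0.0],
        [-np.sin(theta),               0.0,         1.0],
    ])

def skew(v):
    """Matrix [v]x such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def discrete_srbm(m_r, I_world, r1, r2, theta, psi, sigma, dt, g=-9.81):
    """Discrete matrices Ak = I13 + Ac*dt and Bk = Bc*dt (equation (11)) for
    the augmented state x = [pc; Theta; pc_dot; omega; 1] and the input
    u = [F1; M1; F2; M2]; sigma holds the binary contact-schedule values."""
    Ac = np.zeros((13, 13))
    Ac[0:3, 6:9] = np.eye(3)                        # d(pc)/dt = pc_dot
    Ac[3:6, 9:12] = np.linalg.inv(s_r(theta, psi))  # Theta_dot = S_R^-1 @ omega
    Ac[8, 12] = g                                   # gravity via the constant-1 state
    Bc = np.zeros((13, 12))
    I_inv = np.linalg.inv(I_world)
    for m, (r, s) in enumerate(zip((r1, r2), sigma)):
        Bc[6:9, 6 * m:6 * m + 3] = s * np.eye(3) / m_r     # forces -> linear accel
        Bc[9:12, 6 * m:6 * m + 3] = s * (I_inv @ skew(r))  # r x F -> angular accel
        Bc[9:12, 6 * m + 3:6 * m + 6] = s * I_inv          # moments -> angular accel
    return np.eye(13) + Ac * dt, Bc * dt

# One Euler step of free fall (no scheduled contacts): only gravity acts.
Ak, Bk = discrete_srbm(16.0, np.diag([0.5, 0.4, 0.3]),
                       np.array([0.05, 0.1, -0.55]), np.array([0.05, -0.1, -0.55]),
                       theta=0.0, psi=0.0, sigma=(0, 0), dt=0.01)
x = np.zeros(13)
x[12] = 1.0   # the constant-1 state that carries the gravity term
x = Ak @ x
```

Setting a contact's schedule value to zero zeroes its columns of Bk, which is how the MPC "knows" about contact-mode changes inside the prediction horizon.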
[00134] In a standard non-condensed MPC-QP problem definition, the augmented state X = [x[1] … x[h]; u[0] … u[h − 1]] consists of both the states x and the decision variables u. A QP problem definition is:

min(X) ½·XT·H·X + XT·f, (14)

s.t. Aeq·X = beq, (15)

Ain·X ≤ bin. (16)

The problem (eqn. (12)) may be expressed with the dynamics constraints (eqn. (13a)) in terms of the above standard non-condensed QP form, where:

H = 2·blkdiag(Q1, …, Qh, R0, …, Rh−1), (17)

f = [−2·Q1·x̄d[1]; …; −2·Qh·x̄d[h]; 012h×1], (18)

and Aeq and beq encode the stacked equality constraints x̄[k + 1] − Ak·x̄[k] − Bk·u[k] = 0 over the horizon, with beq = [A0·x̄[0]; 0; …; 0]. (19)

[00135] However, due to the problem size, with Aeq ∈ ℝ13h×25h and beq ∈ ℝ13h, it is computationally expensive to solve for the solutions online. In addition, the decision variables needed for actuating the example humanoid robot are only the current step control inputs u[0]. Therefore, the resulting full-state calculation from the non-condensed approach can be unnecessary. Given that the state-space matrices Ad and Bd are time-varying and periodic, it is reasonable to apply a condensed and sparse QP formulation to solve the MPC problem. The MPC solve times of the non-condensed and condensed formulations on the example robot hardware, using a PC with an AMD Ryzen 5600X processor at 4.2 GHz on Ubuntu 20.04, show a clear reduction. Using qpOASES (C++), the non-condensed standing and walking solve times were 19.9 ms and 12.9 ms, while the condensed standing and walking solve times were 3.2 ms and 1.3 ms. Using quadprog (MATLAB), the non-condensed standing and walking solve times were 432.5 ms and 367.2 ms, while the condensed standing and walking solve times were 99.3 ms and 24.9 ms. Without any further exploitation of the optimal control problem structure, the condensed approach for solving the MPC-QP problem allows running the hardware MPC very efficiently online with up to a 1.5 s prediction horizon and hyper-sampling of up to 300 Hz.

[00136] The following formulas were applied to translate the H and f matrices from equations (17)-(18) to a condensed and sparse QP form that can be solved efficiently:

Hcs = 2·(BqpT·Q·Bqp + R), (20)

fcs = 2·BqpT·Q·(Aqp·x[0] − xqp), (21)

where xqp, Q, and R are all vertical concatenations of the corresponding matrices from MPC time step 0 (the current instance) to h − 1. And,

Aqp = [A0, A0·A1, …, A0·A1 ⋯ Ah−1]T, (22)

and Bqp is the corresponding lower-block-triangular input prediction matrix whose (i, j) block, for j < i, is the product of the state transition matrices between steps j + 1 and i applied to Bj. (23)

[00137] To keep the model assumptions consistent in the low-level force-to-torque mapping, the same assumptions for the SRBM were employed. In some known approaches, WBC-based low-level controllers are employed as high-frequency balancing controllers to track the MPC trajectory while managing whole-body dynamics, which may result in: (1) incompatible motion targets due to the different feasibility regions of models with varied resolutions; and (2) diluted MPC control action in the entire control framework. Therefore, the control input from the MPC was directly used and mapped to joint torques using the contact Jacobian, with the assumption of the SRBM and negligible leg dynamics. Hence, the mth leg joint torque commands τm are:

τm = [Jv; Jω]T·[Rf,mT·(Fm + Fswing,m); Rf,mT·Mm], (24)

where Jv ∈ ℝ3×5 and Jω ∈ ℝ3×5 are the linear and angular components of the mth leg contact Jacobian. The swing foot force Fswing,m and the stance foot force cannot both be nonzero at any instance, and each foot is either in the swing or the stance phase according to the gait scheduler. The swing force is acquired by a Cartesian-space PD control law driving the swing foot to the desired heuristic foot position pd and velocity ṗd, where the desired foot placement follows

pd = phip + (∆t/2)·ṗc + kc·(ṗc − ṗc,d), (25)

where ∆t denotes the time duration of each foot swing and the scalar kc represents the velocity feedback gain to achieve the desired walking speed. While holding objects with the arms, the upper body joint commands are also generated by the Cartesian-space control and the contact (hand) Jacobian.

[00138] Numerical simulations were conducted to test the example MPC control architecture. In a Simulink simulation, the example controllers were written in MATLAB scripts. For more efficient computation, qpOASES in the CasADi toolbox was used to solve the MPC problems online. In a ROS+Gazebo simulation, the MPC, state estimation, and low-level control scripts were written in C++. The MPC problem was solved through the qpOASES C++ interface.

[00139] To demonstrate the effectiveness of the example control scheme in loco-manipulation and the importance of the contact schedule in the MPC, the convenience of the simulation framework was leveraged for a time-varying load. The example robot can walk while carrying a time-varying load with mass in the range of [0.5, 4.0] kg, as shown in different walking positions in FIG. 7A. A first image 710 shows the robot carrying a mass of 0.5 kg, a second image 712 shows the robot carrying a mass of 2.0 kg, a third image 714 shows the robot carrying a mass of 4.0 kg, a fourth image 716 shows the robot carrying a mass of 3.0 kg, a fifth image 718 shows the robot carrying a mass of 1.5 kg, and a final image 720 shows the robot carrying a mass of 0.5 kg. The images 710-720 show a sequence of overlaid simulation snapshots of the motion of the example robot carrying a payload with a time-varying mass. The robot performs dynamic walking while holding this object at the same time.

[00140] Associated torque plots for the robot with the masses in FIG. 7A are shown in a graph 750 in FIG. 7B.
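The condensation of equations (20)-(22) eliminates the stacked states by expressing the whole predicted trajectory in terms of x[0] and the inputs alone. A sketch with generic matrices (the block ordering is an illustrative reconstruction; the product ordering for time-varying matrices may differ from the patent's convention):

```python
import numpy as np

def condense(A_list, B_list):
    """Build Aqp and Bqp so that the stacked prediction X = Aqp @ x0 + Bqp @ U
    reproduces x[k+1] = A_k x[k] + B_k u[k] over a horizon h = len(A_list)."""
    h = len(A_list)
    n, m = B_list[0].shape
    Aqp = np.zeros((h * n, n))
    Bqp = np.zeros((h * n, h * m))
    P = np.eye(n)
    for k in range(h):
        P = A_list[k] @ P                       # A_k @ ... @ A_0
        Aqp[k * n:(k + 1) * n, :] = P
        for j in range(k + 1):
            blk = B_list[j]                     # effect of u[j] on x[k+1]
            for l in range(j + 1, k + 1):
                blk = A_list[l] @ blk           # A_k @ ... @ A_{j+1} @ B_j
            Bqp[k * n:(k + 1) * n, j * m:(j + 1) * m] = blk
    return Aqp, Bqp

def condensed_qp(Aqp, Bqp, Q, R, x0, x_ref):
    """Condensed Hessian and gradient mirroring equations (20)-(21):
    Hcs = 2(Bqp^T Q Bqp + R), fcs = 2 Bqp^T Q (Aqp x0 - x_ref)."""
    Hcs = 2.0 * (Bqp.T @ Q @ Bqp + R)
    fcs = 2.0 * Bqp.T @ Q @ (Aqp @ x0 - x_ref)
    return Hcs, fcs

# A tiny 2-state, 1-input, 3-step example with random time-varying dynamics:
rng = np.random.default_rng(0)
A_list = [rng.standard_normal((2, 2)) for _ in range(3)]
B_list = [rng.standard_normal((2, 1)) for _ in range(3)]
Aqp, Bqp = condense(A_list, B_list)
```

The decision vector of the condensed QP is only the input sequence, which is why the matrices shrink from 25h columns to 12h for the robot's 13-state, 12-input model.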
The graph 750 shows the torques of the right leg of the example robot from the simulation. A plot 752 shows the hip yaw torque, a plot 754 shows the hip roll torque, a plot 756 shows the thigh torque, a plot 758 shows the knee torque, and a plot 760 shows the ankle torque. A dashed line 770 shows the torque limit for the knee joint, and a dashed line 772 shows the torque limit for the rest of the joints in the right leg.

[00141] Effective state estimation is crucial for achieving accurate and dynamic locomotion in robotics. In this example, a 6-axis IMU is integrated in the trunk enclosure 110 in FIG. 1, positioned near the center of mass (CoM) of the robot 100 for precise measurements. To estimate the CoM states, an extended Kalman filter processes the IMU data and leg kinematics at a high refresh rate of 1 kHz. To enhance the robustness of state estimation, an Intel Realsense T265 tracking camera is employed. This camera, through its onboard VIO algorithm, outputs the position and orientation of the body at a slower 200 Hz rate. The system ensures accurate and robust state estimation for the robot 100 by fusing the data from both sources. Additionally, for other applications, the example robot 100 is equipped with an Intel Realsense D435i camera mounted in the sensor module 128. This camera is designed to capture depth data and other physical environment information.

[00142] Control objective weight/gain tuning on the hardware for the example robot is very minimal to achieve the presented results. The generalized MPC weights used across the tasks for a test robot version without the arm assemblies are: Qk = diag[500, 500, 500, 150, 150, 150, 1, 1, 3, 1, 1, 1, 0], Rk = diag[1, 1, 1, 1, 1, 1, 5, 5, 5, 5, 5, 5] × 10−3. For a test robot with arm assemblies, such as the example robot 100, the generalized MPC weights are: Qk = diag[200, 200, 500, 150, 150, 50, 1, 1, 3, 1, 1, 1, 0], Rk = diag[1, 1, 1, 1, 1, 1, 5, 5, 5, 5, 5, 5] × 10−3.

[00143] FIGs. 8A-8C show a series of loco-manipulation experiment snapshots. FIG. 8A shows the example robot 100 walking over stacked wood slats while holding a 2.5 kg payload. FIG. 8B shows the example robot 100 walking while carrying a 0.9 kg payload at each arm assembly. FIG. 8C shows dynamic turning performed by the example robot 100 in place while holding a 2 kg payload under the control of the example MPC control system.

[00144] FIGs. 9A-9C show uneven terrain locomotion experiment snapshots of an example robot traversing a terrain with wood slats that simulate unstable terrain. FIG. 9A shows a test robot without the arms of the example robot 100 walking over stacked wood slats at a 0.6 m/s forward speed. FIG. 9B shows the test robot with arms walking over randomly distributed wood slats. FIG. 9C shows the example robot 100 walking omnidirectionally on uneven terrain.

[00145] The dynamic turning experiments were demonstrated on the example robot 100. FIG. 10A is an image of the example robot performing dynamic turning with the yaw rate tracking of the robot up to 3 rad/s. FIG. 10B is a graph 1000 of the yaw rate tracking of the example robot 100 over time. A trace 1010 shows the command. A trace 1020 shows the actual yaw rate plotted over time. FIG. 8C showcases the turning capability of the robot 100 under the example MPC while carrying a 2 kg payload.

[00146] To validate the effectiveness of loco-manipulation of the example MPC framework and showcase the load-carrying capability of the example robot 100, the load-carrying ability of the example robot 100 was demonstrated in both double-leg standing and stepping scenarios. FIG. 11A shows experiment snapshots of the example robot 100 standing while carrying an 8 kg load (50% of the robot mass).
[00147] To demonstrate the efficiency of the example MPC and the significance of the MPC condensation under different scenarios, including a loco-manipulation task, 10,000 data points were collected. FIG. 11B shows a box plot 1100 of the MPC solve time on hardware under a few different scenarios, including a plot for double contact loco-manipulation 1110, a plot for single contact loco-manipulation 1120, a plot for double contact locomotion 1130, and a plot for single contact locomotion 1140. The box plot 1100 shows the feasibility of hyper-sampling the MPC to 300 Hz.

[00148] The locomotion capability of the example robot 100 with the example control framework and SRBM is dynamic and multi-adaptive. Dynamic uneven terrain locomotion on the hardware of the example robot 100 was demonstrated in both biped and humanoid forms. FIG. 9A presents experiment snapshots of uneven terrain locomotion with different terrain setups. In addition, the example robot 100, with an augmented design and hardware for better dynamic capabilities, can walk at a speed of over 1 m/s with the example MPC-based control.

[00149] FIG. 12A is a graph 1200 showing the velocity tracking and height of the example robot 100 while walking. The graph 1200 includes a plot of the x-direction velocity 1210 and a plot of the height 1212 of the example robot 100 over time while walking under a command represented by a dashed line 1214. FIG. 12B is a graph 1250 of the joint torques of the example robot 100 during a walk at a velocity of 1 m/s. The graph 1250 shows the torques of the right leg of the example robot during locomotion. A plot 1252 shows the hip yaw torque, a plot 1254 shows the hip roll torque, a plot 1256 shows the thigh torque, a plot 1258 shows the knee torque, and a plot 1260 shows the ankle torque.
A dashed line 1270 shows the torque limit for the knee joint, and a dashed line 1272 shows the torque limit for the rest of the joints in the right leg. During the highest walking speed in this experiment, the example controller reasoned about the height and speed of the robot, allowing the robot to lower itself to extend its leg reach. FIG. 12B shows that there is noticeable torque headroom for more dynamic motions.

[00150] As a primary application of the example control framework, dynamic loco-manipulation may be performed by the example robot 100. FIG. 8A shows snapshots of an example test robot without arms walking over uneven and unstable terrain with a 2.5 kg payload. The associated MPC solution and torque command plots are presented in FIG. 13A. FIG. 13A shows a graph 1300 that plots the three left foot forces (Fx, Fy, and Fz) shown in FIG. 6B, represented by plots 1302, 1304, and 1306. A graph 1310 plots the three right foot forces (Fx, Fy, and Fz) shown in FIG. 6B, represented by plots 1312, 1314, and 1316. A graph 1320 plots the three left foot moments (Mx, My, and Mz) shown in FIG. 6B, represented by plots 1322, 1324, and 1326. A graph 1330 plots the three right foot moments (Mx, My, and Mz) shown in FIG. 6B, represented by plots 1332, 1334, and 1336. The graphs in FIG. 13A showcase the robustness of the solutions in force constraint satisfaction and the potential for further dynamic motions.

[00151] FIG. 13B shows two torque plots 1350 and 1360 for the test robot walking. The plots 1350 and 1360 show the torques of the right leg and left leg of the example robot during walking. In each of the plots 1350 and 1360, a plot 1362 shows the hip yaw torque, a plot 1364 shows the hip roll torque, a plot 1366 shows the thigh torque, a plot 1368 shows the knee torque, and a plot 1370 shows the ankle torque.
A dashed line 1380 shows the torque limit for the knee joint, and a dashed line 1382 shows the torque limit for the rest of the joints in the respective legs.

[00152] The proposed methods can be extended to practical applications such as controlling humanoid robots to efficiently transfer packages over different platforms. The example robot was tested transporting 2 kg totes semi-autonomously. During these tasks, the example robot exhibited multi-contact whole-body motions, which are well-suited for the proposed Multi-contact MPC.

[00153] In hardware experiments, a variety of dynamic capabilities with the example control framework were shown by the example robot 100. FIG. 14 shows a series of snapshots from testing of dynamic capabilities using the test robot without arms. For example, in a double-leg stance, the biped has a standing height range from 0.35 m to 0.6 m, and it is capable of handling disturbances, including constant pushing forces and impulses to its body. Due to the example MPC, the test robot can also react to terrain changes in real time. FIG. 14 shows a snapshot of the test robot balancing on a seesaw.

[00154] In locomotion demonstrations, the example biped robot can perform in-place stepping in various terrain setups, including an indoor floor, outdoor hard concrete ground, and outdoor soft grass ground, as shown in the snapshots of FIG. 15. Experiments included walking forward and sideways on hard ground, as shown in FIG. 15. In handling terrain perturbations, the example biped robot is capable of walking over grass terrain as well as uneven surfaces, as shown in FIG. 16.

[00155] The terminology used herein is for the purpose of describing particular embodiments only, and is not intended to be limiting of the invention. As used herein, the singular forms “a,” “an,” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise.
Furthermore, to the extent that the terms “including,” “includes,” “having,” “has,” “with,” or variants thereof, are used in either the detailed description and/or the claims, such terms are intended to be inclusive in a manner similar to the term “comprising.”

[00156] Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art. Furthermore, terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.

[00157] While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. Although the invention has been illustrated and described with respect to one or more implementations, equivalent alterations and modifications will occur or be known to others skilled in the art upon the reading and understanding of this specification and the annexed drawings. In addition, while a particular feature of the invention may have been disclosed with respect to only one of several implementations, such feature may be combined with one or more other features of the other implementations as may be desired and advantageous for any given or particular application. Thus, the breadth and scope of the present invention should not be limited by any of the above described embodiments. Rather, the scope of the invention should be defined in accordance with the following claims and their equivalents.

Claims

CLAIMS:

1. A bipedal robot comprising:
a torso;
a first and a second leg coupled to the torso, each leg having:
an abduction adduction actuator attached to the torso;
a hip actuator rotatable by the abduction adduction actuator;
a thigh actuator rotatable by the hip actuator;
a thigh link assembly rotatably coupled to the hip actuator;
a knee actuator coupled to the thigh link assembly in proximity to the thigh actuator;
an ankle actuator coupled to one end of the thigh link assembly;
a knee joint coupled to an opposite end of the thigh link assembly;
a calf link assembly rotatably coupled to the knee joint and knee actuator, the knee actuator rotating the calf link assembly relative to the thigh link assembly;
an ankle joint coupled to the calf link assembly and the ankle actuator; and
a controller coupled to the actuators to control dynamic movement of the first and second legs to propel the robot.

2. The robot of claim 1, wherein the legs each include a foot structure coupled to the ankle joint.

3. The robot of claim 1, wherein the legs each include a wheel coupled to the ankle joint.

4. The robot of claim 1, further comprising a camera coupled to the controller, wherein the controller provides state estimation and processes vision data from the camera.

5. The robot of claim 1, further comprising a sensor coupled to the controller, wherein the controller stores data sensed by the sensor.

6. The robot of claim 1, wherein the torso includes an enclosure with a power source and a payload compartment.

7. The robot of claim 1, wherein the actuators are identical motors.

8. The robot of claim 1, further comprising an input to accept a command from an input device for the robot to traverse terrain.

9. The robot of claim 8, wherein the input device is one of a human input controller, an autonomous controller, or a semi-autonomous controller.
10. The robot of claim 1, wherein the thigh link assembly and the calf link assembly include one of laser-cut parts, computer numerical control (CNC) manufactured parts, or 3D-printed parts.

11. The robot of claim 1, further comprising a belt and pulley coupling the knee actuator to the knee joint to rotate the calf link assembly.

12. The robot of claim 1, further comprising a linkage assembly coupling the ankle actuator to the ankle joint.

13. The robot of claim 1, wherein the controller controls the dynamic movement of the legs based on Model Predictive Control (MPC).

14. The robot of claim 13, wherein the MPC is based on a single rigid body dynamics model of the robot, allowing linearization of the single rigid body dynamics model.

15. The robot of claim 13, wherein the MPC is applied to control the dynamic movement based on properties of the ground reaction forces and moments of inertia on the first and second legs.

16. The robot of claim 13, wherein the controller includes a multi-contact controller having an input of a contact schedule of contacts by the legs during the dynamic movement to provide stance control of the legs.

17. The robot of claim 16, wherein the multi-contact controller optimizes ground reaction forces and moments of the first and second legs.

18. The robot of claim 16, wherein the controller includes a Cartesian PD controller that provides swing control of the legs.

19. The robot of claim 1, further comprising a first and a second arm assembly mounted to the torso, wherein the first and second arm assemblies are controlled by the controller.

20.
The robot of claim 19, wherein the first and second arm assemblies each include:
a shoulder yaw actuator attached to the torso;
a shoulder pitch actuator rotatable by the shoulder yaw actuator;
a shoulder roll actuator rotatable by the shoulder pitch actuator;
an upper arm assembly coupled to the shoulder pitch actuator;
an elbow actuator coupled to one end of the upper arm assembly;
an elbow joint coupled to the upper arm assembly;
a forearm assembly rotating around the elbow joint; and
an elbow transmission coupling the elbow joint to the elbow actuator.

21. A leg assembly for a robot, the leg assembly comprising:
a hip joint;
a thigh link assembly rotatably coupled to the hip joint;
a knee joint coupled to the thigh link assembly;
a calf link assembly rotatably coupled to the knee joint;
an ankle joint coupled to the calf link assembly;
a plurality of actuators configured to control movement of the leg assembly, wherein at least some of the actuators are positioned near the hip joint to concentrate the mass of the leg assembly proximate to the hip joint; and
a transmission system configured to transmit force from at least one of the actuators to either the knee joint or the ankle joint of the leg assembly.

22. The leg assembly of claim 21, wherein the plurality of actuators includes:
an abduction adduction actuator coupled to the hip joint, wherein the abduction adduction actuator is attachable to a torso of the robot;
a hip actuator coupled to the hip joint and the thigh link assembly, the hip actuator rotatable by the abduction adduction actuator; and
a thigh actuator coupled to one end of the thigh link assembly and the hip actuator, the thigh actuator rotatable by the hip actuator.

23. The leg assembly of claim 21, further comprising a foot structure coupled to the ankle joint.

24.
The leg assembly of claim 21, wherein the transmission system includes a belt having one end rotated by a pulley.

25. The leg assembly of claim 24, wherein the pulley is rotatably coupled to a knee actuator, and wherein another end of the belt is rotatably coupled to the knee joint to transmit force to rotate the calf link assembly relative to the thigh link assembly.

26. The leg assembly of claim 21, wherein the transmission system is a linkage assembly.

27. The leg assembly of claim 26, wherein the linkage assembly transmits force between an ankle actuator and the ankle joint.

28. The leg assembly of claim 21, wherein the plurality of actuators are identical motors.

29. The leg assembly of claim 21, wherein the thigh link assembly and the calf link assembly include one of laser-cut parts, computer numerical control (CNC) manufactured parts, or 3D-printed parts.

30. A control system for a robot having a limb with an upper assembly and a lower assembly, and a plurality of actuators attached to the upper assembly, the control system comprising:
a low-level controller coupled to the plurality of actuators;
a multi-contact controller having inputs of a contact schedule of contacts by the limb during dynamic movement, reaction forces on the limb, and moments of inertia on the limb, to provide stance control of the limb from a Model Predictive Control (MPC) based on a single rigid body dynamics model of the robot; and
a Cartesian PD controller that provides swing control of the limb from the MPC based on reaction forces from movement of the limb.

31. The control system of claim 30, wherein the limb is an arm, and wherein the upper assembly is an upper arm and the lower assembly is a forearm rotatably coupled to the upper arm.

32. The control system of claim 31, wherein the plurality of actuators allows four degrees of movement of the arm.

33.
The control system of claim 30, wherein the limb is a leg, and wherein the upper assembly is a thigh link and the lower assembly is a calf link rotatably coupled to the thigh link.

34. The control system of claim 33, wherein the plurality of actuators allows five degrees of movement of the leg.

35. The control system of claim 33, wherein the multi-contact controller optimizes ground reaction forces and moments of the leg.
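The claims above recite a Cartesian PD controller that provides swing control of a limb. The following is a minimal sketch of such a per-axis PD law; the gains, mass-free force formulation, and all names are illustrative assumptions introduced here, not the disclosed implementation.

```python
# Illustrative sketch (not the patented controller): a Cartesian PD law of
# the kind used for swing-foot control, computing a corrective force per
# axis (x, y, z). The gains kp and kd are assumed placeholder values.

def cartesian_pd_force(p_des, v_des, p, v, kp=400.0, kd=20.0):
    """Return a Cartesian force command for one swing foot.

    F = Kp * (p_des - p) + Kd * (v_des - v), applied independently per axis.
    p_des, v_des: desired foot position (m) and velocity (m/s)
    p, v:         measured foot position (m) and velocity (m/s)
    """
    return tuple(kp * (pd - pi) + kd * (vd - vi)
                 for pd, pi, vd, vi in zip(p_des, p, v_des, v))

# Example: the foot is 0.05 m below its target and momentarily at rest,
# so the law commands an upward corrective force along z only.
F = cartesian_pd_force(p_des=(0.2, 0.1, 0.0), v_des=(0.0, 0.0, 0.0),
                       p=(0.2, 0.1, -0.05), v=(0.0, 0.0, 0.0))
# F is approximately (0.0, 0.0, 20.0) N
```

The commanded force would then be mapped to joint torques through the leg Jacobian, while the multi-contact MPC handles the stance legs.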
PCT/US2024/044221 2023-08-28 2024-08-28 A bipedal robot for dynamic and robust location in diverse environments Pending WO2025049602A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363579166P 2023-08-28 2023-08-28
US63/579,166 2023-08-28

Publications (1)

Publication Number Publication Date
WO2025049602A1 2025-03-06

Family

ID=94820375

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2024/044221 Pending WO2025049602A1 (en) 2023-08-28 2024-08-28 A bipedal robot for dynamic and robust location in diverse environments

Country Status (1)

Country Link
WO (1) WO2025049602A1 (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080203955A1 (en) * 2002-08-28 2008-08-28 Honda Giken Kogyo Kabushiki Kaisha Legged Mobile Robot
US20150122559A1 (en) * 2012-05-31 2015-05-07 Thk Co., Ltd. Lower limb structure for legged robot, and legged robot
US20160236356A1 (en) * 2015-02-17 2016-08-18 Honda Motor Co., Ltd. Robot
US20160243699A1 (en) * 2015-02-24 2016-08-25 Disney Enterprises, Inc. Method for developing and controlling a robot to have movements matching an animation character
US20210162602A1 (en) * 2018-04-25 2021-06-03 Mitsubishi Electric Corporation Rotation connecting mechanism, robot, robot arm, and robot hand
CN115128960A (en) * 2022-08-30 2022-09-30 齐鲁工业大学 Method and system for controlling motion of biped robot based on deep reinforcement learning
WO2023008947A1 (en) * 2021-07-30 2023-02-02 국민대학교산학협력단 Moving unit switchable to leg and wheel modes, and hybrid robot comprising same
US11707852B1 (en) * 2022-11-04 2023-07-25 Agility Robotics, Inc. Grippers for robotic manipulation of objects and related technology

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230174177A1 (en) * 2020-05-04 2023-06-08 RI&D Pty Ltd A Vehicle
US12522307B2 (en) * 2020-05-04 2026-01-13 RI&D Pty Ltd Vehicle

Similar Documents

Publication Publication Date Title
He et al. Mechanism, actuation, perception, and control of highly dynamic multilegged robots: A review
US20250291353A1 (en) Design and control of wheel-legged robots navigating high obstacles
US12172719B2 (en) Whole body manipulation on a legged robot using dynamic balance
Hutter et al. Toward combining speed, efficiency, versatility, and robustness in an autonomous quadruped
Lee et al. Development of a quadruped robot system with torque-controllable modular actuator unit
Li et al. Dynamic loco-manipulation on hector: Humanoid for enhanced control and open-source research
Sentis et al. Implementation and stability analysis of prioritized whole-body compliant controllers on a wheeled humanoid robot in uneven terrains
Čížek et al. Design, construction, and rough-terrain locomotion control of novel hexapod walking robot with four degrees of freedom per leg
Liu et al. Legged robots—an overview
Mudalige et al. Hyperdog: An open-source quadruped robot platform based on ros2 and micro-ros
JP2025541512A (en) Balance control method, device, equipment and computer program for leg-wheeled robot
Hong et al. Development of a tele-operated rescue robot for a disaster response
Nicholson et al. Llama: Design and control of an omnidirectional human mission scale quadrupedal robot
WO2025049602A1 (en) A bipedal robot for dynamic and robust location in diverse environments
Turrisi et al. Pacc: A passive-arm approach for high-payload collaborative carrying with quadruped robots using model predictive control
Xia et al. The Duke Humanoid: Design and Control for Energy-Efficient Bipedal Locomotion Using Passive Dynamics
Xin et al. Variable autonomy of whole-body control for inspection and intervention in industrial environments using legged robots
Pan et al. Pegasus: a novel bio-inspired quadruped robot with underactuated wheeled-legged mechanism
Buettner et al. Nimble Limbs-Intelligent attachable legs to create walking robots from variously shaped objects
Léziart Locomotion control of a lightweight quadruped robot
Hegde Robotics: Vision and Control Techniques
Remy et al. Quadrupedal robots with stiff and compliant actuation
Xu et al. Design and experiments of a human-leg-inspired omnidirectional robotic leg
Biradar et al. Design of a Robot for carrying out research on hybrid robot's mobility: case of a mecanum wheel-legged robot
Madadi et al. Design and development of a biped robot

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24860944

Country of ref document: EP

Kind code of ref document: A1