WO2017052060A1 - Real-time device control system having a hierarchical architecture and real-time robot control system using the same - Google Patents
- Publication number: WO2017052060A1 (PCT/KR2016/008037)
- Authority: WIPO (PCT)
- Prior art keywords: layer, real time, device control, control system
- Legal status: Ceased
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
Definitions
- the present invention relates to a real time device control system and a real time robotic system. More specifically, the present invention relates to a real-time device control system having a hierarchical architecture capable of accurate real-time processing, easy development and debugging, and robust hardware, and a real-time robot control system using the same.
- Robots can be largely divided into hardware and software, and they are integrated to form a system.
- Components of the robot hardware include drivers and controllers for moving the robot joints, a battery and power controller, a communication module, sensors, the robot's exoskeleton, and electronic circuits. These different kinds of elements are combined according to the characteristics of each desired robot to form a robot hardware platform.
- The present invention is intended to solve the above problems: in a robot control system that requires real-time operation, several independent processes for controlling and processing the same hardware can coexist while the robot's operation remains stably controlled.
- The purpose of the present invention is to provide a real-time device control system having a hierarchical architecture capable of providing robustness and scalability, and a real-time robot control system using the same.
- A system comprising: a first layer including one or more devices to be controlled; a second layer, above the first layer, including a device control module that directly controls the devices; a third layer, above the second layer, including a shared memory connected to the device control module; a fourth layer, above the third layer, including one or more agents that perform independent processes using the shared memory; and a fifth layer, above the fourth layer, that controls the one or more agents according to user commands.
- The real-time robot control system for solving the above problems comprises: one or more controlled devices corresponding to the joints or sensors of the robot; and a control system connected to the one or more controlled devices to operate them. The control system includes a first layer including the one or more controlled devices; a second layer, above the first layer, including a device control module that directly controls the devices; a third layer, above the second layer, including a shared memory connected to the device control module; a fourth layer, above the third layer, including one or more agents that perform independent processes using the shared memory; and a fifth layer, above the fourth layer, that controls the one or more agents according to user commands, operating the one or more devices using communication between adjacent layers.
- the method for solving the above problems can be implemented with a program for executing the method on a computer and a recording medium on which the program is recorded.
- A plurality of agents having mutually independent processes and a shared memory storing the references generated by the operations of the plurality of agents are provided, and the hardware devices are controlled using those references.
- FIG. 1 is a conceptual diagram schematically showing an entire system according to an embodiment of the present invention.
- FIG. 2 is a flowchart illustrating a control method of a robot control system according to an exemplary embodiment of the present invention.
- FIGS. 3 and 4 are diagrams for describing the relationship between a shared memory and the system according to an exemplary embodiment of the present invention.
- FIG. 5 is a diagram for explaining data exchange between a device control module and an agent according to an embodiment of the present invention.
- FIG. 6 is a block diagram illustrating a device control module according to an embodiment of the present invention.
- FIG. 7 is a flowchart illustrating a control operation of the robot control system according to another exemplary embodiment.
- FIG. 8 is a diagram illustrating a hierarchical structure and an operating environment according to an exemplary embodiment of the present invention.
- Components expressed as means for performing the functions described in the detailed description are intended to include all ways of performing those functions, including, for example, combinations of circuit elements, or firmware/microcode combined with appropriate circuitry for executing that software.
- The invention, as defined by the claims, should be understood to encompass any means capable of providing the recited functionality, with the functionality provided by the various enumerated means combined in the manner required by the claims, as equivalent to what is understood from this specification.
- FIG. 1 is a conceptual diagram schematically showing an entire system according to an embodiment of the present invention.
- An entire system may include one or more devices 100, a device control module 200, a shared memory 300, one or more agents 400, and a user system 500.
- the device 100 may include one or more driving devices that finally perform the operation of the robot control system.
- the drive device may comprise a hardware device or a software device.
- The drive device may include, for example, at least one of a joint device that controls the drive of an articulated motor, a sensor device including a sensor board, or a simulator device.
- the device 100 may be controlled according to a control signal received from the device control module 200, and may output various data such as sensor data to the device control module 200.
- The term device 100 is not limited to hardware, but may be used as a concept including a software driver for driving an actual hardware device. Accordingly, each device 100 may be connected to the device control module 200 in both hardware and software.
- Each device 100 may form a communication network with the device control module 200.
- the communication network may form a system network using a controller area network (CAN) protocol for system stability.
- each device 100 may be connected to the device control module 200 through one or more CAN communication channels, and receive a message composed of CAN frame data according to a preset control period through the CAN communication channel.
- a message may be output to the device control module 200.
- the message may include a motor control reference, an encoder value, a controller state value, a pulse width modulation (PWM) command, a sensor value, or various other setting or output values.
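A minimal sketch of how such a fixed-size message could be packed into a classic 8-byte CAN data field; the field layout and the function names here are illustrative assumptions, not the actual frame format of this embodiment:

```python
import struct

def pack_joint_reference(position_ref: float, controller_state: int) -> bytes:
    """Pack a hypothetical joint reference into an 8-byte CAN data field.

    Assumed layout: 4-byte little-endian float position reference,
    1-byte controller state, 3 reserved bytes.
    """
    frame = struct.pack("<fBxxx", position_ref, controller_state)
    assert len(frame) == 8  # classic CAN frames carry at most 8 data bytes
    return frame

def unpack_joint_reference(frame: bytes):
    """Recover the reference and state from a received data field."""
    position_ref, controller_state = struct.unpack("<fBxxx", frame)
    return position_ref, controller_state
```

A frame like this would be sent once per preset control period on the CAN channel linking a device 100 to the device control module 200.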
- The device control module 200 obtains hardware control data for controlling the one or more devices 100 from the references generated by the plurality of agents 400 and stored in the shared memory, and transmits control signals according to those references to the one or more devices 100 selected from the hardware control data.
- the device control module 200 may always reside on an operating system for controlling the robot control system and may be executed in the background.
- The device control module 200 is the only component that communicates directly with the devices 100, doing so with reference to the shared memory 300; it may transmit control signals or receive sensor signals through the communication channel.
- the device control module 200 may transfer a reference for controlling the joint device 100 to the joint device 100 or receive necessary sensor information from the sensor device 100.
- the device control module 200 may include a real-time thread created on the operating system. The thread is synchronized with the motion generation operation cycle of the system to enable real time processing. In addition, the device control module 200 may further include a non-real time thread for processing data reading and conversion.
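The periodic behavior described above can be sketched as follows; this plain-Python loop stands in for an RTOS-timer-synchronized real-time thread, and the 5 ms period is an assumed value, not one taken from the disclosure:

```python
import threading
import time

CONTROL_PERIOD_S = 0.005  # assumed 5 ms motion-generation cycle

def realtime_loop(stop: threading.Event, on_cycle) -> None:
    """Periodic control thread: runs one control cycle per period.

    A plain-Python stand-in for the real-time thread synchronized to the
    system's motion-generation operation cycle.
    """
    next_deadline = time.monotonic()
    while not stop.is_set():
        on_cycle()                          # e.g. fetch references, drive joints
        next_deadline += CONTROL_PERIOD_S
        remaining = next_deadline - time.monotonic()
        if remaining > 0:                   # the idle remainder is where a
            time.sleep(remaining)           # non-real-time thread could run

# Usage sketch: count cycles for ~50 ms, then stop the thread.
stop = threading.Event()
cycles = []
worker = threading.Thread(target=realtime_loop,
                          args=(stop, lambda: cycles.append(1)))
worker.start()
time.sleep(0.05)
stop.set()
worker.join()
```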
- each agent 400 may be implemented as independent software modules having independent processes.
- the agents 400 may each process different motions and perform a process for outputting a reference corresponding thereto.
- Each agent 400 may be, for example, a motion agent, a controller agent, a communication agent, a walking agent, a damping agent, or one of various other agents.
- The agents 400 can create and operate their respective threads without sharing heap, data, or static memory. By writing the data each needs to share into the shared memory 300, they allow organic processing without mutual collision, which facilitates software development and processing.
- Each agent 400 may refer to the hardware abstraction data and user-defined data of the shared memory 300 according to its defined process, and store the reference data generated based thereon back into the shared memory 300.
- the user defined data may include shared data for sharing information between the agents 400 and various data for driving other user-definable systems.
- the hardware abstraction data may include abstracted reference, sensor data, motion owner variable, and command data to control the device 100.
- the device control module 200 may generate a control signal for each device 100 by using the hardware abstraction data and hardware information previously stored in the hardware database 250.
- the device control module 200 identifies the control target device 100 using the hardware abstraction data extracted from the shared memory 300, and generates a control signal for the control target devices 100.
- the control signal according to the reference may be output to the control target device 100.
- The processing cycle of each agent 400 needs to be shorter than the operation cycle in which the system processes motion information. Accordingly, the time for the agent 400 to generate a reference from the sensor data, for the device control module 200 to generate and output a control signal from that reference through the shared memory 300, and for the sensor data to be updated may all be included in the first operating period of the system. Thus, the entire series of operations can be processed within the first operating period.
- the user system 500 may provide a user interface for controlling and monitoring the agent 400 and the device control module 200.
- the user system 500 may include middleware for controlling the agent 400, and may provide various interfaces that may be connected to other external systems.
- FIG. 2 is a flowchart illustrating a control method of a robot control system according to an exemplary embodiment of the present invention.
- FIGS. 3 and 4 are diagrams for describing the relationship between a shared memory and the system according to an exemplary embodiment of the present invention.
- FIG. 5 is a diagram for explaining data exchange between a device control module and an agent according to an embodiment of the present invention.
- the device control module 200 obtains hardware abstraction data from a reference stored in the shared memory 300.
- the device control module 200 generates a control signal for hardware control from the hardware abstraction data (S107), and transmits the generated control signal to one or more devices 100 (S109).
- the device control module 200 receives sensor data from the devices 100 corresponding to the sensor (S111), and updates the received sensor data in the shared memory 300 (S113).
- the series of operation steps may be all processed within the first period corresponding to the real-time operation period of the robot control system, thereby ensuring real-time.
- each of the agents 400 and the device control module 200 may perform data exchange and transfer processing using the shared memory 300.
- The reference corresponding to each device 100 may be stored in the shared memory 300, and the device control module 200 may obtain the reference and use it to output a control signal.
- Such a plurality of agents 400 and the device control module 200 may configure a multi-agent system around the shared memory 300.
- Each part performing independent work may be developed separately by several developers, a structure that is advantageous in a robot-control-system development environment in which they collaborate.
- Developers will be able to send and receive the computational output of other agents 400 through the shared memory 300 while keeping a development space independent of other processes, in a concurrent development model.
- The hardware abstraction data may include sensor data, reference data, motion owner data, and command data, and the device control module 200 may access only the hardware abstraction data area of the shared memory 300.
- The device control module 200 accesses the hardware abstraction data area of the shared memory 300 to update sensor data received from the devices 100, or obtains updated reference data to generate control signals for the devices 100.
- The hardware abstraction data may have a format produced by abstracting the detailed data of robot device control, and the device control module 200 may convert it into actual hardware control signals and deliver them to the appropriate devices 100.
- The agent 400 developer or user can thus control the robot without a deep understanding of the hardware.
- the developer or the user may transfer the abstracted hardware input information as a reference through the shared memory 300, and the device control module 200 may generate a low level control signal for controlling the device 100 from the hardware abstraction data.
- the device control module 200 may manage hardware information required to generate the control signal using the hardware database 250 described above.
- the hardware information may include, for example, a list of devices 100, joint motor information (deceleration ratio, encoder pulse, driver channel number, etc.), a communication protocol, and the like.
- The device control module 200 may load the hardware database 250 to determine the hardware information of the driving target device 100, and thereby generate an optimal control signal for controlling it. In addition, even when the hardware changes or hardware of a new configuration is used, only the hardware database 250 needs to be modified, so the system is robust to hardware changes and scalable with respect to hardware.
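A sketch of how such a hardware database lookup could turn an abstract joint-angle reference into a hardware-level value; the database entries, field names, and degree-to-count conversion are illustrative assumptions:

```python
# Assumed per-joint hardware information, standing in for the hardware
# database 250: gear reduction ratio and encoder pulses per motor revolution.
HARDWARE_DB = {
    "J1": {"gear_ratio": 100, "encoder_ppr": 2048},
}

def reference_to_encoder_counts(joint: str, angle_deg: float) -> int:
    """Convert an abstract joint-angle reference (degrees) into motor
    encoder counts using per-joint hardware information.

    Swapping or changing hardware only requires editing HARDWARE_DB;
    the agents that produce angle references are unaffected.
    """
    info = HARDWARE_DB[joint]
    counts_per_joint_rev = info["gear_ratio"] * info["encoder_ppr"]
    return round(angle_deg / 360.0 * counts_per_joint_rev)
```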
- the hardware abstraction data may include reference data, sensor data, motion owner, and command data.
- the reference data may be updated according to the calculation result in each agent 400, and may include a target value in the current step for the device control module 200 to control each device 100.
- the reference data may include a joint motion reference and a joint controller reference.
- the sensor data may include measurement data that the device control module 200 receives from each device 100.
- the measurement data may include, for example, state information at a current step including an encoder value of the joint device and sensing data.
- the command data may include command information for controlling the device control module 200 and the agent 400 at a higher system level, and may include command target process information and parameter information.
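The four kinds of hardware abstraction data listed above could be laid out as in the following sketch; the field names are assumptions, and the 31-joint count comes from the J1 to J31 example in this description:

```python
from dataclasses import dataclass

NUM_JOINTS = 31  # the embodiment describes joint devices J1 to J31

@dataclass
class HardwareAbstractionData:
    """Illustrative layout of the shared-memory hardware abstraction area.

    Field names are assumptions for illustration, not the actual
    identifiers of the embodiment.
    """
    reference: list      # per-joint target values written by agents
    sensor: dict         # encoder values, sensor-board readings, etc.
    motion_owner: list   # per-joint id of the agent holding the control right
    command: dict        # higher-level command target and parameters

def make_shared_area() -> HardwareAbstractionData:
    """Initialize an empty shared area with one slot per joint."""
    return HardwareAbstractionData(
        reference=[0.0] * NUM_JOINTS,
        sensor={},
        motion_owner=[None] * NUM_JOINTS,
        command={},
    )
```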
- the shared memory 300 may include motion owner information.
- the hardware abstraction data area of the shared memory 300 may include a memory area 350 for each agent 400 that can update reference data for each agent 400.
- each agent 400 may update its calculated reference in its memory space area.
- Each agent 400 may calculate and update reference data corresponding to each device 100. For example, when a total of 31 joint devices 100 exist, from J1 to J31, the memory space area of each agent 400 may include a reference data area corresponding to each of the joint devices 100.
- the shared memory 300 may include a motion owner variable for each of the joint devices 100. Therefore, each motion owner variable space may include the same number of motion owner variables as the number of the joint devices 100.
- Each motion owner variable may indicate the one agent, among a plurality of preset agents 400, that has authority over the corresponding joint device 100. Accordingly, the device control module 200 may determine which agent 400 holds the control right for each joint device 100.
- control right for each joint device 100 may be transferred to another agent 400 or the device control module 200 according to the change of the motion owner variable.
- the device control module 200 may first identify the agent 400 having the control right of the specific joint device 100 from the motion owner variable.
- The device control module 200 may collect the reference data of the identified agents 400 and combine it to generate overall reference data covering all the joint devices 100.
- the device control module 200 may generate a control signal for each device 100 by using the entire reference data, and may appropriately transmit the signal.
- each joint of the robot can be controlled without collision in different agents 400.
- For example, if one agent 400 controls the lower-body joints through an algorithm for stabilizing lower-body posture while another agent 400 generates a specific task motion of the upper body, the combined results of the two agents 400 allow a whole-body task of the robot to be performed. This enables efficient control suited to the multi-agent characteristics of the robot.
- FIG. 6 is a block diagram illustrating a device control module according to an embodiment of the present invention.
- the device control module 200 includes a motion selector 210, a controller signal accumulator 220, a signal combiner 230, and an information handler 240.
- the reference data for the joint may include two or more reference signals for joint motion control and detailed control. Accordingly, the agent 400 corresponding to each joint device 100 may generate the two or more reference signals as reference data and store the same in the shared memory 300.
- the reference signal may be referred to as a motion reference and a controller reference.
- the motion reference may include reference data that provides a dominant value for each joint, and the controller reference may include detailed reference data that is added to or subtracted from the motion reference.
- the reference is not limited to the name.
- The motion reference outputs (M1 to Mm) and the controller reference outputs of the agents may be input to the device control module 200 from the shared memory 300.
- one motion reference may be selected for each joint device 100, but all the controller references may be accumulated and added.
- The motion selector 210 may select, based on the motion owner variable information, the motion reference data corresponding to each joint device 100 and output it to the signal combiner 230. Therefore, one item of motion reference data is selected for each joint device 100.
- controller signal accumulator 220 may accumulate each controller reference data and output the result value to the signal combiner 230 regardless of the motion owner variable.
- The signal combiner 230 may generate the final reference data for each joint device 100 by combining the selected motion reference data with the accumulated controller reference result, and output it to the appropriate target joint devices 100.
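The selector/accumulator/combiner pipeline can be summarized in a few lines; the names are illustrative, but the logic (one motion reference chosen per joint by its motion owner, all controller references summed regardless of ownership) follows the description above:

```python
def combine_references(motion_refs: dict, controller_refs: dict,
                       motion_owner: list) -> list:
    """Per joint: pick the single motion reference from the agent that
    owns the joint, then add the sum of ALL agents' controller references.

    motion_refs / controller_refs map agent name -> per-joint value list;
    motion_owner[j] names the agent owning joint j.
    """
    final = []
    for j, owner in enumerate(motion_owner):
        selected = motion_refs[owner][j]                        # motion selector
        correction = sum(r[j] for r in controller_refs.values())  # accumulator
        final.append(selected + correction)                     # signal combiner
    return final
```

With this split, a walking agent can own the dominant joint values while a damping agent merely adds small corrections on top.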
- the signal combiner 230 may identify the type of the reference and classify the processing space according to the reference type.
- the signal combiner 230 may include a type identifier and a spatial processor.
- The reference data may have types other than joint motion, such as task processing; the type identifier may identify whether a reference is of the task type or the joint type, and the spatial processor may provide processing in a different data space according to the type.
- Separating the motion reference from the controller reference enables functional separation in the process of generating robot motion. For example, when generating a bipedal walking motion, one agent 400 can generate the basic walking pattern as a motion reference, another agent 400 can implement a damping controller, and yet another agent 400 can implement a controller that suppresses vibration, each outputting to a controller reference. This makes design and development much easier.
- the information handler 240 may perform a function of synthesizing sensor data collected from the sensor device 100 or other measurement target devices and outputting them to the shared memory 300.
- FIG. 7 is a flowchart illustrating a control operation of the robot control system according to another exemplary embodiment.
- In general, when a problem occurs in a real experiment using a robot, the robot must be driven again from the beginning. For a mobile platform the initialization process is simple, but for an articulated system such as a humanoid, where initialization on the ground is difficult and the robot must instead be initialized in the air using a crane or similar equipment, the entire initialization process is very cumbersome and time-consuming.
- the device control module 200 can debug and test the robot again without the process of initializing such a robot.
- system initialization is first performed (S201), and a plurality of agents 400 having respective mutually independent processes operate (S202).
- When the user tests a motion algorithm through the agent 400 and a problem arises, the user can simply pass the motion owner to another agent 400 or to the device control module 200, and then modify the code of the suspended agent 400.
- the motion owner variable may be switched back to the original agent 400 (S209).
- The developer can then continue the experiment after taking back the motion owner. This accelerates development; from the user's point of view, it can further be used to have a separate dedicated agent continuously observe the robot's joint references to detect collisions and take over the motion owner when a collision occurs, allowing experiments to be run safely.
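A sketch of the motion-owner handoff that makes this debugging flow possible; the helper function and its naming are assumptions:

```python
def transfer_motion_owner(motion_owner: list, joints: list, new_owner):
    """Hand control of the given joints to another agent (or back to the
    device control module) by rewriting the per-joint owner variables.

    The suspended agent's code can then be edited without re-initializing
    the robot; the returned snapshot lets ownership be restored afterwards.
    """
    previous = {j: motion_owner[j] for j in joints}
    for j in joints:
        motion_owner[j] = new_owner
    return previous
```

For example, a safety agent that detects a collision could call this to seize the affected joints, and the original agent could later be restored from the returned snapshot.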
- FIG. 8 is a view for explaining the operating environment of the robot system according to an embodiment of the present invention.
- To be usable in general, robot software must be able to operate various types of robots rather than only a single robot platform; it must therefore be extensible and adaptable to changes in robot hardware, and the same software must be able to control not only the actual robot platform but also a robot simulator.
- The robot control system 1000 can be built so as to utilize functions of other useful robot middleware, such as the Robot Operating System (ROS) from the United States or the Open Platform for Robotic Services (OPRoS) from Korea. Accordingly, it can provide an environment in which various vision solutions provided by software on a cloud robot system, or functions for managing the robot's tasks, can easily be applied to the system of the present invention.
- the device control module 200 controlling the robot devices 100 operates accordingly to provide real-time control of the entire system.
- Other higher-level robot software may provide connections or decision criteria between motions, or operate several robots simultaneously.
- FIG. 9 is a diagram showing a hierarchical architecture design of a robot system according to an embodiment of the present invention.
- The robot control system 1000, in order to provide an environment in which a plurality of agents, or any agent, can be created and operated independently in accordance with an embodiment of the present invention, may organize the modules that process each kind of data into a layered structure.
- Each layered structure may be connected to the robot control system 1000, or may be implemented in software or hardware on a real-time operating system (RTOS) on which the robot control system 1000 is mounted or installed.
- the real-time operating system may provide global timer interrupts to the fourth layer and the second layer to synchronize operation cycles between layers.
- each agent of the fourth layer may be implemented as a process of the real-time operating system, and may access shared memory, obtain sensor data, and store reference data according to thread operations included in the process.
- The device control module of the second layer, synchronized thereto, stores the devices' sensor data in the shared memory according to its thread operation on the real-time operating system, generates device control signals according to the reference data and motion owner in the shared memory, and outputs them to the devices.
- The robot control system 1000, which can be implemented on a real-time operating system, includes: a first layer including one or more controlled devices (joints or sensors) included in a robot platform or a simulator; a second layer, above the first layer, including a device control module that directly controls the devices; a third layer, above the second layer, including a shared memory connected to the device control module; a fourth layer, above the third layer, including one or more agents performing independent processes using the shared memory; and a fifth layer, above the fourth layer, that controls the one or more agents according to user commands.
- each communication protocol may be preset so that the first to fifth layers can communicate only with adjacent layers.
- Each layer can access another layer only through the layer directly above or below it, and this controlled structure keeps the system stable and systematic.
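The adjacent-layer rule can be stated compactly; the short layer names here are shorthand for the five layers described above:

```python
# Shorthand names for the five layers, bottom to top.
LAYERS = ["device", "device_control", "shared_memory", "agent", "user"]

def can_communicate(a: str, b: str) -> bool:
    """Adjacent-layer rule: a layer may exchange data only with the
    layer directly above or directly below it."""
    return abs(LAYERS.index(a) - LAYERS.index(b)) == 1
```

Under this rule an agent reaches a device only indirectly, via the shared memory and the device control module.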
- each device may be included in the first layer.
- The devices may include the low-level robotic devices that are the actual subjects of control, for example driver controllers, sensor boards, or robot simulator devices.
- The device control module 200 may always reside in the background and execute in order to control the robot.
- the second layer may be the only layer that can directly control the devices of the robotic system.
- the device control module of the second layer may transfer the reference of the joint generated from the shared memory to the robot device, and conversely obtain the value of the sensor from the device.
- the second layer may be operated by a real time thread generated from a real time operating system (RTOS), and the thread of the second layer may have a period synchronized with a control period of motion generation. If the device is linked with the simulator, the thread of the second layer may operate in synchronization with the simulator's time.
- The second layer can also have non-real-time threads that read and interpret instructions; a non-real-time thread can receive and process other instructions in the time remaining after the real-time thread.
- Within this hierarchical architecture, the device control module may reside in the background of the system and transfer to the first layer the control signals for controlling the devices, generated from the references obtained from the shared memory.
- The third layer may be the shared memory layer, and may include a hardware abstraction data unit and a user-defined data unit.
- The hardware abstraction data unit may include the aforementioned hardware abstraction data.
- the type of hardware abstraction data may include sensor data, reference data, a motion owner, and command information.
- the device control module of the second layer may be connected only to the shared memory of the third layer.
- the user defined data unit may temporarily or permanently store agent shared data shared among a plurality of agent processes existing in the fourth layer and robot driving data according to user definition.
- The fourth layer is the layer that runs each agent process so that users can create their own robot motions and the like from external processes; since the agent processes execute independently of one another within this layer, like individual grapes in a bunch, it may be called the agent layer (AL).
- Each agent independently reads sensor data from the shared memory of the third layer, generates a motion, and updates the joint reference of the generated motion in the shared memory.
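One agent cycle as just described might look like the following sketch, with a plain dict standing in for the shared memory and all names assumed:

```python
def agent_cycle(shared: dict, my_name: str, generate_motion) -> None:
    """One synchronized cycle of an agent process: read sensor data from
    the shared memory, compute a motion, and write joint references back,
    but only for joints whose motion owner is this agent."""
    sensors = shared["sensor"]
    new_refs = generate_motion(sensors)           # agent-specific algorithm
    for j, value in new_refs.items():
        if shared["motion_owner"][j] == my_name:  # respect ownership
            shared["reference"][j] = value
```

Because the ownership check happens at write time, two agents computing references for overlapping joints never collide in the shared memory.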
- The agent processes of the fourth layer may set, in the motion owner, which agents have ownership of each joint's reference.
- Each agent can create a very short-period, fast real-time thread from the real-time operating system (RTOS), which is used to synchronize each agent's motion-generation thread with the real-time thread of the second-layer device control module described above.
- A motion-generating thread can be synchronized in real time with the device control module by this fast thread; it resumes operation upon synchronization and is suspended after one reference-operation loop. Repeating this operation controls the robot control system 1000.
- Not all agents directly generate the motion of the robot: there may be an agent that detects collisions and takes the motion owner from another agent to keep the robot safe, or agents that perform ancillary processing to help other agents.
- There may also be an agent (agent N in FIG. 9) implemented as a communication module (comm. module) to exchange information with the fifth layer and to control other agents.
- the fifth layer may include a user interface module that provides a control function corresponding to the agents and a monitoring function for the robot control system 1000.
- the fifth layer may include various processes to provide convenience for controlling the robot.
- the fifth layer may include a GUI (Graphic User Interface) for easily giving commands and monitoring, and a logging program for storing data.
- the fifth layer is the area accessible to external processes, and existing middleware such as ROS and OPRoS may provide one or more interface functions for controlling the agents.
- the robot control system 1000 may have a structure that can be extended without limit, thereby providing the structural possibility of controlling a hyper multi-agent system.
- the above-described systems and methods according to the present invention can be produced as a program for execution on a computer and stored in a computer-readable recording medium.
- examples of the computer-readable recording medium include ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, and optical data storage devices, and also include media implemented in the form of carrier waves (e.g., transmission over the Internet).
- the computer-readable recording medium can be distributed over network-coupled computer systems so that the computer-readable code is stored and executed in a distributed fashion.
- functional programs, codes, and code segments for implementing the method can be easily inferred by programmers in the art to which the present invention belongs.
Landscapes
- Engineering & Computer Science (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Manipulator (AREA)
- Numerical Control (AREA)
Abstract
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| EP16848786.6A EP3354416A4 (fr) | 2015-09-21 | 2016-07-22 | Système de commande de dispositif en temps réel ayant une architecture hiérarchique et système de commande de robot en temps réel utilisant celui-ci |
| US15/762,063 US10857672B2 (en) | 2015-09-21 | 2016-07-22 | Real-time device control system having hierarchical architecture and realtime robot control system using same |
| CN201680054891.6A CN108136578B (zh) | 2015-09-21 | 2016-07-22 | 具有分层架构的实时设备控制系统及利用其的实时机器人控制系统 |
| JP2018514449A JP6938473B2 (ja) | 2015-09-21 | 2016-07-22 | 階層的なアーキテクチャを有するリアルタイムデバイス制御システム及びこれを用いたリアルタイムロボット制御システム |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201562221215P | 2015-09-21 | 2015-09-21 | |
| US62/221,215 | 2015-09-21 | ||
| KR10-2016-0020776 | 2016-02-22 | ||
| KR1020160020776A KR102235168B1 (ko) | 2015-09-21 | 2016-02-22 | 계층적 아키텍처를 갖는 실시간 디바이스 제어 시스템 및 이를 이용한 실시간 로봇 제어 시스템 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2017052060A1 true WO2017052060A1 (fr) | 2017-03-30 |
Family
ID=58386230
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2016/008037 Ceased WO2017052060A1 (fr) | 2015-09-21 | 2016-07-22 | Système de commande de dispositif en temps réel ayant une architecture hiérarchique et système de commande de robot en temps réel utilisant celui-ci |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2017052060A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20050240412A1 (en) * | 2004-04-07 | 2005-10-27 | Masahiro Fujita | Robot behavior control system and method, and robot apparatus |
| KR20090041997A (ko) * | 2007-10-25 | 2009-04-29 | 강원대학교산학협력단 | 이종 인터페이스들로 구성된 모듈 기반의 지능형로봇시스템을 위한 계층 구조 및 이의 구조화 방법 |
| US20090254217A1 (en) * | 2008-04-02 | 2009-10-08 | Irobot Corporation | Robotics Systems |
| US20100280661A1 (en) * | 2009-04-30 | 2010-11-04 | Abdallah Muhammad E | Hierarchical robot control system and method for controlling select degrees of freedom of an object using multiple manipulators |
| KR101275020B1 (ko) * | 2011-11-18 | 2013-06-17 | 한국생산기술연구원 | 통신망을 이용한 로봇 개발 플랫폼의 집단협업공간 제공시스템 |
- 2016
  - 2016-07-22 WO PCT/KR2016/008037 patent/WO2017052060A1/fr not_active Ceased
Non-Patent Citations (1)
| Title |
|---|
| See also references of EP3354416A4 * |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2020004428A (ja) * | 2019-01-10 | 2020-01-09 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
| JP2020098608A (ja) * | 2019-01-10 | 2020-06-25 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
| JP7303061B2 (ja) | 2019-01-10 | 2023-07-04 | モベンシス株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
| JP7303132B2 (ja) | 2019-01-10 | 2023-07-04 | モベンシス株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
| JP2020077430A (ja) * | 2019-01-23 | 2020-05-21 | ソフトサーボシステムズ株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
| JP7303131B2 (ja) | 2019-01-23 | 2023-07-04 | モベンシス株式会社 | モーション制御プログラム、モーション制御方法及びモーション制御装置 |
| CN111580988A (zh) * | 2020-04-29 | 2020-08-25 | 广州虎牙科技有限公司 | 开放平台的实现方法、装置、存储介质和计算机设备 |
| CN111580988B (zh) * | 2020-04-29 | 2023-09-05 | 广州虎牙科技有限公司 | 开放平台的实现方法、装置、存储介质和计算机设备 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR102235168B1 (ko) | 계층적 아키텍처를 갖는 실시간 디바이스 제어 시스템 및 이를 이용한 실시간 로봇 제어 시스템 | |
| WO2017052061A1 (fr) | Système de commande de robot en temps réel connecté à un système d'exploitation à usage général et système de commande de dispositif en temps réel l'utilisant | |
| WO2017052060A1 (fr) | Système de commande de dispositif en temps réel ayant une architecture hiérarchique et système de commande de robot en temps réel utilisant celui-ci | |
| Hild et al. | Myon, a new humanoid | |
| Nesnas | The claraty project: coping with hardware and software heterogeneity | |
| JP2023038338A (ja) | ロボット制御装置、ロボット制御方法、端末装置、端末制御方法、及びロボット制御システム | |
| KR20080073414A (ko) | 지능형 로봇의 지능적 작업 관리를 위한 컴포넌트 기반의작업 관리 소프트웨어의 구조 | |
| WO2022097821A1 (fr) | Dispositif de robot de codage réalisant une fonction ido | |
| Vick et al. | Using OPC UA for distributed industrial robot control | |
| WO2015141984A1 (fr) | Dispositif d'assemblage de robot | |
| WO2017052059A1 (fr) | Système de commande en temps réel, dispositif de commande en temps réel et procédé de commande de système | |
| WO2024186007A1 (fr) | Dispositif et procédé de commande de robot | |
| Zielinski et al. | Applications of MRROC++ robot programming framework | |
| Peekema et al. | Open-source real-time robot operation and control system for highly dynamic, modular machines | |
| Domínguez-Brito et al. | Coolbot: A component model and software infrastructure for robotics | |
| Spirleanu et al. | An experimental framework for Multi-Agents using RTOS based robotic controllers | |
| CN114072739B (zh) | 控制系统、支持装置以及机器可读取存储介质 | |
| Cabrera-Gámez et al. | CoolBOT: A component-oriented programming framework for robotics | |
| WO2025079912A1 (fr) | Procédé de cartographie et de positionnement pour robot, et robot mobile l'utilisant | |
| US20240278425A1 (en) | Containerized plug-in system for robotics | |
| Zhou et al. | A real-time controller development framework for high degrees of freedom systems | |
| Fernandez-Perez et al. | Integrating systems in robotics | |
| Ruskowski et al. | Isaac Sim Integrated Digital Twin for Feasibility Checks in Skill-Based Engineering | |
| Dominguez-Brito et al. | Component software in robotics | |
| Spranger | Myon, a New Humanoid, Manfred Hild, Torsten Siedel, Christian Benckendorff, Christian Thiele, and |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16848786 Country of ref document: EP Kind code of ref document: A1 |
|
| ENP | Entry into the national phase |
Ref document number: 2018514449 Country of ref document: JP Kind code of ref document: A |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 15762063 Country of ref document: US |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2016848786 Country of ref document: EP |