WO2024085159A1 - Work robot adjustment method, sensing system, sensing method, mobile robot, motion modification system, motion modification method, work robot, work reproduction system, work reproduction method, work mastery system, work mastery method, and work reproduction robot - Google Patents
- Publication number
- WO2024085159A1 (PCT/JP2023/037595)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- robot
- sensor
- information
- sensing
- motion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1656—Programme controls characterised by programming, planning systems for manipulators
- B25J9/1664—Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J5/00—Manipulators mounted on wheels or on carriages
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0084—Programme-controlled manipulators comprising a plurality of manipulators
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1628—Programme controls characterised by the control loop
- B25J9/163—Programme controls characterised by the control loop learning, adaptive, model based, rule based expert control
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0631—Resource planning, allocation, distributing or scheduling for enterprises or organisations
- G06Q10/06316—Sequencing of tasks or work
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0633—Workflow analysis
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/04—Manufacturing
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/36—Nc in input of data, input key till input tape
- G05B2219/36442—Automatically teaching, teach by showing
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B2219/00—Program-control systems
- G05B2219/30—Nc systems
- G05B2219/40—Robotics, robotics mapping to robotics vision
- G05B2219/40116—Learn by operator observation, symbiosis, show, watch
Definitions
- The present disclosure relates to an adjustment method for a work robot that performs work in a work environment, a sensing system, a sensing method, a mobile robot, a motion modification system, a motion modification method, a work robot, a work reproduction system, a work reproduction method, a work mastery system, a work mastery method, and a work reproduction robot.
- Patent Document 1 discloses a support system that reduces the workload of workers by determining the work content of the worker based on various information stored in a database, including the shape of the work object, and displaying information on a display device based on information obtained from various sensors installed in the work robot.
- Patent Document 2 discloses a work estimation device that generates a work area image Ic from a cell camera 30 and an overall image Ia from a ceiling camera 80, and performs image analysis.
- Patent Document 3 discloses a robot control method in which a two-dimensional code expressing work information is read by a sensor, an image of a work object is captured by a sensor, operation information for causing a robot to perform a task is generated based on the captured image and the work information, and the robot is caused to perform a task based on the operation information.
- Patent Document 4 discloses a technology for providing a safety education system that effectively communicates the circumstances of an accident: image data of the accident scene, viewed from a viewpoint specified by input viewpoint information, is generated based on three-dimensional data of the accident scene, and the image data is displayed on a display unit.
- Patent Document 5 discloses a work proficiency system that, in order to identify work with a low level of proficiency, identifies the work content of non-standard work (work performed with a low level of proficiency) using non-standard work model information including the conditions of non-standard work, work procedure information, and images of the inside of the work site.
- The present disclosure therefore aims to provide a work robot adjustment method that reduces the time and cost required to analyze workers' movements and to program the robot.
- A worker's work may involve, for example, moving around the work site or making large movements.
- The sensors may be, for example, cameras.
- The worker's work may include a predetermined movement.
- The present disclosure therefore aims to provide a sensing system, a sensing method, and a mobile robot that can adequately sense a specific movement of a worker.
- The present disclosure therefore aims to provide a sensing system, a sensing method, and a work robot that can check the operation of a work robot and adjust it to an appropriate operation.
- The present disclosure therefore aims to provide a motion modification system, a motion modification method, and a work robot that enable a work robot using a learning model that has learned a worker's work to work efficiently.
- The present disclosure therefore aims to provide a task reproduction system, a task reproduction method, and a task reproduction robot that make it easier to determine the cause of an abnormality.
- The present disclosure therefore aims to provide a work mastery system, a work mastery method, and a work reproduction robot that can help workers become proficient at their work.
- The work robot adjustment method disclosed herein includes the steps of: moving a mobile robot having a sensor to an environment where a worker operates; recording the worker's movements using the sensor; learning the worker's movements based on the recording; causing the work robot to perform the same movements as the worker based on the learning; and adjusting the work robot's movements to match the worker's movements.
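The adjustment step above, in which the robot's executed motion is compared against the recorded worker motion and corrected, can be pictured as an iterative loop. The sketch below is illustrative only, not the patented method: `adjust_until_match`, `robot_step`, and the scalar trajectory representation are all hypothetical simplifications.

```python
def adjust_until_match(worker_traj, robot_step, tolerance=0.05, max_iters=100):
    """Iteratively nudge robot setpoints until the executed motion
    approximates the recorded worker trajectory (scalar positions here)."""
    targets = list(worker_traj)  # initial setpoints: the learned worker motion
    for _ in range(max_iters):
        actual = [robot_step(t) for t in targets]           # robot executes setpoints
        errors = [a - w for a, w in zip(actual, worker_traj)]
        if max(abs(e) for e in errors) <= tolerance:
            break                                           # close enough to the worker
        targets = [t - e for t, e in zip(targets, errors)]  # correct by observed deviation
    return targets
```

With a robot that, say, consistently overshoots each setpoint by a fixed offset (`robot_step = lambda t: t + 0.1`), a single correction pass removes the bias.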
- The sensing system comprises a first sensor for sensing a predetermined motion of a sensing target, a second sensor for sensing the predetermined motion of the sensing target from a position different from the first sensor, a first mobile robot including the first sensor and a first moving mechanism, and a management control device capable of communicating with the first sensor, the second sensor, and the first moving mechanism, the management control device having a determination unit that determines whether or not a predetermined part of the sensing target that moves during the predetermined motion is sensed based on first information acquired by the first sensor and second information acquired by the second sensor, and a control unit that operates the first moving mechanism so that the predetermined part is sensed when the determination unit determines that the predetermined part is not sensed.
- The sensing method uses a first sensor for sensing a predetermined motion of a sensing target, a second sensor for sensing the predetermined motion of the sensing target from a position different from the first sensor, a first mobile robot including the first sensor and a first moving mechanism, and a management control device capable of communicating with the first sensor, the second sensor, and the first moving mechanism; the management control device determines whether or not a predetermined part of the sensing target that moves during the predetermined motion is sensed based on first information acquired by the first sensor and second information acquired by the second sensor, and, when it is determined that the predetermined part is not sensed, operates the first moving mechanism so that the predetermined part is sensed.
- The sensing system includes a first sensor for sensing a predetermined motion of a sensing target, a second sensor for sensing the predetermined motion of the sensing target from a position different from the first sensor, a first mobile robot including the first sensor and a first moving mechanism, and a management control device capable of communicating with the first sensor, the second sensor, and the first moving mechanism, wherein the management control device includes a determination unit that determines whether a predetermined part of the sensing target that moves during the predetermined motion is sensed based on first information acquired by the first sensor and second information acquired by the second sensor, and determines whether the predetermined part sensed by the first sensor is the same as the predetermined part sensed by the second sensor, and a control unit that operates the first moving mechanism so that the predetermined part sensed by the first sensor is different from the predetermined part sensed by the second sensor when it is determined that the two are the same.
- The mobile robot includes a movement mechanism, a first sensor that senses a sensing target, a second sensor that senses the sensing target from a position different from that of the first sensor, a drive mechanism capable of moving the position of the second sensor, and an information processing unit that controls the first sensor, the second sensor, the movement mechanism, and the drive mechanism; the information processing unit includes a determination unit that determines whether or not a predetermined part of the sensing target that moves during a predetermined operation is sensed based on first information acquired by the first sensor and second information acquired by the second sensor, and a control unit that operates the movement mechanism or the drive mechanism so that the predetermined part is sensed when the determination unit determines that the predetermined part is not sensed.
- The mobile robot includes a moving mechanism, a first sensor that senses a sensing target, a second sensor that senses the sensing target from a position different from that of the first sensor, a drive mechanism capable of moving the position of the second sensor, and an information processing unit that controls the first sensor, the second sensor, the moving mechanism, and the drive mechanism.
- The information processing unit has a determination unit that determines whether a predetermined part of the sensing target that moves during a predetermined operation is sensed based on first information acquired by the first sensor and second information acquired by the second sensor, and determines whether the predetermined part sensed by the first sensor and the predetermined part sensed by the second sensor are the same, and a control unit that operates the moving mechanism or the drive mechanism so that the predetermined part sensed by the first sensor and the predetermined part sensed by the second sensor are different when it is determined that they are the same.
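The determination and control logic shared by the sensing systems and the mobile robot above can be illustrated with a toy coverage check. The function below is a hypothetical sketch: the part names, the command tuples, and the set-based "is it sensed?" test stand in for real image analysis.

```python
def plan_sensing(parts_needed, seen_by_first, seen_by_second):
    """Determination unit: is every required moving part sensed by at least
    one of the two sensors?  Control unit: if not, command the first mobile
    robot to move; if both sensors see exactly the same parts, command it
    to move so that the two viewpoints cover different parts."""
    seen_first, seen_second = set(seen_by_first), set(seen_by_second)
    missing = set(parts_needed) - (seen_first | seen_second)
    if missing:
        return ("move_to_sense", sorted(missing))            # restore coverage
    if seen_first == seen_second:
        return ("diversify_viewpoints", sorted(seen_first))  # redundant views
    return ("hold", [])                                      # coverage is adequate
```

For example, if both sensors can only see the hand while the elbow also moves during the motion, the sketch returns a move command naming the unseen part.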
- The sensing system comprises a first sensor for sensing a predetermined motion of a sensing target, a work robot that operates according to an operation instruction, a second sensor for sensing the robot motion of the work robot, and a management control device capable of communicating with the first sensor, the second sensor, and the work robot, the management control device having a learning unit that learns the predetermined motion by referring to first information acquired by the first sensor, a motion information generation unit that generates motion control information for providing the operation instruction to the work robot by referring to a learning result of the predetermined motion by the learning unit, and an adjustment unit that compares the first information with second information acquired by the second sensor and adjusts the motion control information so that the robot motion of the work robot approximates the predetermined motion.
- The sensing method uses a first sensor for sensing a predetermined motion of a sensing target, a work robot that operates according to an operation instruction, a second sensor for sensing the robot motion of the work robot, and a management control device capable of communicating with the first sensor, the second sensor, and the work robot; the management control device learns the predetermined motion by referring to first information acquired by the first sensor, generates motion control information for giving the operation instruction to the work robot by referring to a learning result of the predetermined motion, compares the first information with second information acquired by the second sensor, and adjusts the motion control information so that the robot motion of the work robot approximates the predetermined motion.
- The working robot is a working robot that operates in response to an operation instruction, and includes a first sensor for sensing a predetermined operation of a sensing target, a second sensor for sensing the robot operation of the working robot, and an information processing unit capable of communicating with the first sensor and the second sensor; the information processing unit includes a learning unit that learns the predetermined operation by referring to first information acquired by the first sensor, an operation information generating unit that generates operation control information for providing the operation instruction to the working robot by referring to a learning result of the predetermined operation by the learning unit, and an adjustment unit that compares the first information with second information acquired by the second sensor and adjusts the operation control information so that the robot operation of the working robot approximates the predetermined operation.
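One plausible concrete form of the "motion control information" generated from a learning result is a fixed-rate sequence of setpoints resampled from learned keyframes. The sketch below assumes scalar positions and linear interpolation; both the representation and the function name are illustrative assumptions, not the patent's.

```python
def generate_commands(keyframes, dt):
    """Resample learned (time, position) keyframes into fixed-rate setpoints
    by linear interpolation.  keyframes must be sorted by time."""
    cmds, i = [], 0
    t, end = keyframes[0][0], keyframes[-1][0]
    while t <= end + 1e-9:
        # advance to the keyframe segment containing time t
        while i + 1 < len(keyframes) and keyframes[i + 1][0] < t:
            i += 1
        t0, p0 = keyframes[i]
        t1, p1 = keyframes[min(i + 1, len(keyframes) - 1)]
        p = p0 if t1 == t0 else p0 + (p1 - p0) * (t - t0) / (t1 - t0)
        cmds.append((round(t, 6), round(p, 6)))
        t += dt
    return cmds
```

A later adjustment unit could then modify these setpoints in place, which is one reason to keep the control information in a simple sampled form.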
- The motion modification system comprises a work robot, a sensor, and a management control device capable of communicating with the work robot and the sensor.
- The management control device comprises a learning unit that learns a standard behavior model corresponding to a predetermined motion of the sensing target based on sensing information corresponding to the predetermined motion of the sensing target obtained using the sensor, a model generation unit that references the standard behavior model and generates a modified behavior model in which the execution time of each motion in the standard behavior model is set shorter than the time required for each motion when the standard behavior model was generated, and a control unit that operates the work robot by referring to the modified behavior model.
- The motion modification system further comprises a work robot, a plurality of sensors for sensing a plurality of different sensing targets, and a management control device capable of communicating with the work robot and the plurality of sensors, the management control device comprising: a learning unit that learns each of the predetermined behaviors of the plurality of sensing targets and a plurality of standard behavior models corresponding to each of those predetermined behaviors based on a plurality of pieces of sensing information corresponding to the predetermined behaviors of the plurality of sensing targets acquired using the plurality of sensors; a model generation unit that generates a modified behavior model integrating at least a portion of the predetermined behaviors of the plurality of sensing targets by referring to the plurality of standard behavior models; and a control unit that operates the work robot by referring to the modified behavior model.
- The motion modification method learns a standard movement model corresponding to a predetermined movement of a sensing target based on sensing information corresponding to the predetermined movement of the sensing target obtained using a sensor, references the standard movement model to generate a modified movement model in which the execution time of each movement in the standard movement model is set shorter than the time required for each movement when the standard movement model was generated, and operates the work robot by referring to the modified movement model.
- The working robot has a drive mechanism for operating the working robot, a learning unit that learns a standard motion model corresponding to a predetermined motion of a sensing target based on sensing information corresponding to the predetermined motion of the sensing target obtained using a sensor, a model generation unit that references the standard motion model and generates a modified motion model in which the execution time of each motion in the standard motion model is set shorter than the time required for each motion when the standard motion model was generated, and a control unit that controls the drive mechanism by referring to the modified motion model to operate the working robot.
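The model generation step above shortens each motion's execution time relative to what was observed during learning. A minimal sketch, assuming a model is simply an ordered list of `(step, duration)` pairs and that physical feasibility is captured by a per-step minimum time (both assumptions are mine, not the patent's):

```python
def compress_model(standard_model, factor, min_time=0.0):
    """Derive a modified behavior model whose per-step execution times are
    set shorter than the times observed when the standard model was learned.
    factor must lie in (0, 1); min_time guards against infeasible speeds."""
    if not 0.0 < factor < 1.0:
        raise ValueError("factor must shorten, not lengthen, each motion")
    return [(step, max(duration * factor, min_time))
            for step, duration in standard_model]
```

For example, halving every duration while never dropping a step below 0.15 s keeps the sequence intact but removes the slack a human worker needed.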
- The task reproduction system comprises a task reproduction robot, a sensor capable of sensing the motion of the task reproduction robot, and a management control device capable of communicating with the task reproduction robot and the sensor, the management control device comprising: a learning unit that learns a standard motion model corresponding to a predetermined motion of a sensing target based on first sensing information corresponding to the predetermined motion of the sensing target; a control unit that causes the task reproduction robot to perform a reproducing motion one or more times by referring to the standard motion model; an input unit that inputs accident or malfunction information; and a detection unit that detects the occurrence of the accident or malfunction based on second sensing information corresponding to the reproducing motion of the task reproduction robot obtained using the sensor.
- The task reproduction system comprises a task reproduction robot, a sensor capable of sensing the motion of the task reproduction robot, and a management control device capable of communicating with the task reproduction robot and the sensor, the management control device comprising: a learning unit that learns a standard motion model corresponding to a predetermined motion of a sensing target based on first sensing information corresponding to the predetermined motion of the sensing target; a control unit that causes the task reproduction robot to perform the reproducing motion one or more times by referring to the standard motion model; a storage unit that stores task manual information or process chart information of the sensing target; and a detection unit that detects the occurrence of a motion different from the task manual information or process chart information based on second sensing information corresponding to the reproducing motion of the task reproduction robot acquired using the sensor.
- The task reproduction method uses a task reproduction robot, a sensor capable of sensing the motion of the task reproduction robot, and a management control device capable of communicating with the task reproduction robot and the sensor; the management control device learns a standard motion model corresponding to a predetermined motion of a sensing target based on first sensing information corresponding to the predetermined motion of the sensing target, causes the task reproduction robot to perform a reproducing motion one or more times by referring to the standard motion model, inputs accident or malfunction information, and detects the occurrence of the accident or malfunction based on second sensing information corresponding to the reproducing motion of the task reproduction robot obtained using the sensor.
- The task reproduction method uses a task reproduction robot, a sensor capable of sensing the motion of the task reproduction robot, and a management control device capable of communicating with the task reproduction robot and the sensor; the management control device learns a standard motion model corresponding to a predetermined motion of the sensing target based on first sensing information corresponding to the predetermined motion of the sensing target, causes the task reproduction robot to perform the reproducing motion one or more times by referring to the standard motion model, stores task manual information or process chart information of the sensing target, and detects the occurrence of a motion different from the task manual information or process chart information based on second sensing information corresponding to the reproducing motion of the task reproduction robot obtained using the sensor.
- The task reproduction robot further comprises a sensor capable of sensing the motion of the task reproduction robot and an information processing device capable of communicating with the sensor, the information processing device comprising: a learning unit that learns a standard motion model corresponding to a predetermined motion of the sensing target based on first sensing information corresponding to the predetermined motion of the sensing target; a control unit that causes the task reproduction robot to perform the reproducing motion one or more times by referring to the standard motion model; an input unit that inputs accident or malfunction information; and a detection unit that detects the occurrence of the accident or malfunction based on second sensing information corresponding to the reproducing motion of the task reproduction robot acquired using the sensor.
- The work reproduction robot further comprises a sensor capable of sensing the operation of the work reproduction robot and an information processing device capable of communicating with the sensor, the information processing device comprising: a learning unit that learns a standard operation model corresponding to a predetermined operation of the sensing target based on first sensing information corresponding to the predetermined operation of the sensing target; a control unit that causes the work reproduction robot to perform the reproduction operation one or more times by referring to the standard operation model; a storage unit that stores work manual information or process chart information of the sensing target; and a detection unit that detects the occurrence of an operation different from the work manual information or the process chart information based on second sensing information corresponding to the reproduction operation of the work reproduction robot acquired using the sensor.
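The detection unit in the manual-based variants above compares the reproduced motion sequence against the work manual or process chart. As a toy illustration (the step names and the positional, step-by-step comparison are assumptions of this sketch), detection can be reduced to a sequence diff:

```python
def detect_deviations(observed, manual):
    """Return (index, observed_step, expected_step) for every position where
    the reproduced motion differs from the manual; an empty list means the
    reproduction followed the manual exactly."""
    deviations = []
    for i in range(max(len(observed), len(manual))):
        o = observed[i] if i < len(observed) else None  # None marks a missing step
        m = manual[i] if i < len(manual) else None      # or an extra one
        if o != m:
            deviations.append((i, o, m))
    return deviations
```

A non-empty result corresponds to "the occurrence of a motion different from the work manual information", and the recorded indices help localize the cause of an abnormality.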
- The work mastery system comprises a work reproduction robot, a sensor capable of sensing the movements of a new worker, and a management control device capable of communicating with the work reproduction robot and the sensor, the management control device comprising: a memory unit that stores a standard movement model learned based on first sensing information corresponding to a specific movement of an experienced worker; a control unit that causes the work reproduction robot to perform the reproducing movement by referring to the standard movement model; and a detection unit that detects the points where the movement of the new worker differs from the standard movement model based on second sensing information corresponding to the movement of the new worker obtained using the sensor.
- The work mastery method uses a work reproduction robot, a sensor capable of sensing the movements of a new worker, and a management control device capable of communicating with the work reproduction robot and the sensor; the management control device stores a standard movement model learned based on first sensing information corresponding to a specific movement of an experienced worker, causes the work reproduction robot to perform a reproducing movement by referring to the standard movement model, and detects differences in the movement of the new worker from the standard movement model based on second sensing information corresponding to the movement of the new worker obtained using the sensor.
- The work reproduction robot further comprises a sensor capable of sensing the movements of a new worker and an information processing device capable of communicating with the sensor; the information processing device comprises a memory unit that stores a standard movement model learned based on first sensing information corresponding to a specific movement of an experienced worker, a control unit that causes the work reproduction robot to perform a reproducing movement by referring to the standard movement model, and a detection unit that detects differences in the movement of the new worker from the standard movement model based on second sensing information corresponding to the movement of the new worker obtained using the sensor.
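The work mastery variants all hinge on the same comparison: the new worker's sensed motion against the standard model learned from an experienced worker. A minimal sketch, assuming both are sampled scalar position sequences of equal length and a fixed tolerance (all illustrative assumptions):

```python
def find_divergent_points(novice, expert, tolerance):
    """Indices at which the new worker's sensed positions deviate from the
    standard movement model by more than the tolerance -- the 'points where
    the movement of the new worker differs'."""
    return [i for i, (n, e) in enumerate(zip(novice, expert))
            if abs(n - e) > tolerance]
```

The detected indices could then be mapped back to timestamps or work steps and shown to the new worker as coaching feedback.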
- The present disclosure provides a work robot adjustment method that reduces the time and cost required to analyze workers' movements and to program the robot.
- The present disclosure also provides a sensing system, a sensing method, and a mobile robot that can adequately sense a specific motion of a worker.
- The present disclosure also provides a sensing system, a sensing method, and a work robot that can check the operation of a work robot and adjust it to an appropriate operation.
- The present disclosure also provides a motion modification system, a motion modification method, and a work robot that enable a work robot using a learning model that has learned a worker's work to work efficiently.
- The present disclosure also provides a task reproduction system, a task reproduction method, and a task reproduction robot that make it easier to determine the cause of an abnormality.
- The present disclosure also provides a work mastery system, a work mastery method, and a work reproduction robot that can help workers become proficient at their work.
- FIG. 1 is a diagram showing an example of a system configuration in a working robot adjustment method according to a first embodiment of the present disclosure.
- FIG. FIG. 2 is a diagram illustrating an example of the working robot shown in FIG. 1 .
- 2 is a diagram showing an example of a sensor mounting member shown in FIG. 1 .
- 1 is a block diagram showing an example of a functional configuration in a working robot adjustment method according to a first embodiment of the present disclosure.
- FIG. 4 is an example of a flowchart illustrating the processing of a working robot adjustment method according to a first embodiment of the present disclosure.
- 13 is a diagram showing an example of a modified example of the sensor mounting member.
- FIG. 11 is a diagram illustrating an example of a system configuration of a sensing system according to a second embodiment of the present disclosure.
- FIG. 6B is a diagram showing an example of the mobile robot shown in FIG. 6A.
- FIG. 11 is a diagram showing an example of a sensing system according to a second embodiment of the present disclosure when a mobile robot moves.
- FIG. 11 is a block diagram showing an example of the configuration and functions of a sensing system according to a second embodiment of the present disclosure.
- 11 is a block diagram showing an example of the functions of a management control device in a sensing system of a second embodiment of the present disclosure.
- FIG. 13 is an example of a flowchart illustrating processing of a sensing system according to a second embodiment of the present disclosure.
- FIG. 13 is a diagram illustrating an example of a system configuration of a sensing system according to a first modified example of the second embodiment of the present disclosure.
- FIG. 13B is a diagram showing an example of the mobile robot shown in FIG. 13A.
- FIG. 13 is a diagram illustrating an example of a system configuration of a sensing system according to a second modification of the second embodiment of the present disclosure.
- 15B is a diagram showing an example of the sensor mounting member shown in FIG. 15A.
- FIG. 13 is a diagram illustrating an example of the configuration and functions of a sensing system according to a second modification of the second embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of a system configuration of a sensing system according to a third embodiment of the present disclosure.
- FIG. 17B is a diagram showing an example of the humanoid robot shown in FIG. 17A.
- FIG. 18 is a block diagram showing an example of the configuration and functions of a sensing system according to a third embodiment of the present disclosure.
- FIGS. 19A to 19C are diagrams illustrating an example of each sensing period in a sensing system according to a third embodiment of the present disclosure.
- FIG. 21 is an example of a flowchart illustrating processing of a sensing system according to a third embodiment of the present disclosure.
- FIG. 22 is an example of a flowchart showing more detailed processing of the operation control information generating process shown in step S3103 of FIG. 21.
- FIG. 23 is an example of a flowchart showing more detailed processing of the action control information adjustment processing shown in step S3106 of FIG. 21.
- FIG. 24A is a diagram illustrating an example of a system configuration of a sensing system according to a first modified example of the third embodiment of the present disclosure.
- FIG. 24B is a diagram showing an example of the humanoid robot shown in FIG. 24A.
- FIG. 25 is a block diagram showing an example of the functions of a humanoid robot in a sensing system according to a first modified example of a third embodiment of the present disclosure.
- FIG. 26 is a diagram illustrating an example of a system configuration of a sensing system according to a second modification of the third embodiment of the present disclosure.
- FIG. 27A is a diagram illustrating an example of a system configuration of a sensing system according to a third modification of the third embodiment of the present disclosure.
- FIG. 27B is a diagram showing an example of the sensor mounting member shown in FIG. 27A.
- FIG. 29A is a diagram illustrating an example of a system configuration of a motion modification system according to a fourth embodiment of the present disclosure.
- FIG. 29B is a diagram showing an example of the humanoid robot shown in FIG. 29A.
- FIG. 30 is a diagram showing an example of a relationship between a standard motion model and a modified motion model in a motion modification system according to a fourth embodiment of the present disclosure.
- FIG. 31 is a block diagram showing an example of the configuration and functions of a motion modification system according to a fourth embodiment of the present disclosure.
- FIG. 32 is a block diagram showing an example of the functions of a management control device in a motion modification system according to a fourth embodiment of the present disclosure.
- FIG. 33 is an example of a flowchart showing the processing of a motion modification system according to a fourth embodiment of the present disclosure.
- FIG. 34 is an example of a flowchart showing more detailed processing of the modified motion model generation process shown in step S4103 of FIG. 33.
- FIGS. 35A and 35B are diagrams illustrating an example of a system configuration of a motion modification system according to a first modified example of a fourth embodiment of the present disclosure.
- FIG. 35C is a diagram showing an example of the humanoid robot shown in FIG. 35B.
- FIG. 36 is a block diagram showing an example of the functions of a humanoid robot in a motion modification system according to a first modified example of a fourth embodiment of the present disclosure.
- FIG. 37 is an example of a flowchart illustrating more detailed processing of the modified motion model generation process in the motion modification system according to a first modified example of the fourth embodiment of the present disclosure.
- FIG. 38 is a diagram showing an example of a relationship between a standard motion model and a modified motion model in a motion modification system according to a first modified example of a fourth embodiment of the present disclosure.
- FIG. 39A is a diagram illustrating an example of sensing when a worker reproduces an action in the task reproduction system according to the fifth embodiment of the present disclosure.
- FIG. 39B is a diagram showing an example of sensing when a task reproduction robot is made to reproduce a motion.
- FIG. 40 is a diagram showing an example of a humanoid robot in a task reproduction system according to a fifth embodiment of the present disclosure.
- FIG. 41 is a block diagram showing an example of the configuration and functions of a task reproduction system according to a fifth embodiment of the present disclosure.
- FIG. 42 is a block diagram showing an example of the functions of a management control device in a task reproduction system according to a fifth embodiment of the present disclosure.
- FIG. 43 is an example of a flowchart illustrating a process of a task reproduction system according to a fifth embodiment of the present disclosure.
- FIG. 44 is a flowchart showing a more detailed example of the worker motion learning/standard motion model generation process shown in step S5102 of FIG. 43.
- FIG. 45 is an example of a flowchart illustrating a process of the task reproduction system according to a first modified example of the fifth embodiment of the present disclosure.
- FIG. 46A is a diagram illustrating an example of a system configuration of a task reproduction system according to a second modified example of the fifth embodiment of the present disclosure.
- FIG. 46B is a diagram showing an example of the humanoid robot shown in FIG. 46A.
- FIG. 47 is a block diagram showing an example of the functions of a humanoid robot in a task reproduction system according to a second modified example of the fifth embodiment of the present disclosure.
- FIG. 48B is a diagram showing an example of the humanoid robot shown in FIG. 48A.
- FIG. 49 is a block diagram showing an example of the configuration and functions of a task training system of embodiment 6 according to the present disclosure.
- FIG. 50 is a block diagram showing an example of the functions of a management control device in a task training system of embodiment 6 according to the present disclosure.
- FIG. 51 is an example of a flowchart showing the processing of a task training system of embodiment 6 according to the present disclosure.
- FIG. 52 is an example of a flowchart showing more detailed processing of the worker motion learning and model generation processing shown in step S6102 of FIG. 51.
- FIG. 53 is an example of a flowchart showing more detailed processing of the motion detection processing shown in step S6105 of FIG. 51.
- FIG. 54B is a diagram showing an example of the humanoid robot shown in FIG. 54A.
- FIG. 1 is a diagram for explaining the working robot adjustment method. To avoid complicating FIG. 1, a reference numeral is shown for only one of the sensing areas of the sensors, namely the sensing area 330a of the mounting member sensor 33, which will be described later. Details of the working robot adjustment system 100 for carrying out the working robot adjustment method will be described later with reference to FIG. 3.
- The working robot adjustment method uses multiple humanoid robots 20a-d that function as mobile robots, and sensor mounting members 30a-d that are connected to the respective humanoid robots 20a-d.
- Each humanoid robot 20a-d moves to the vicinity of a worker 400 working on the work line 201 in the workplace 200 upon receiving a command from a management control device 60 (see FIG. 3) described below, or upon receiving a command from an information processing device 25 (see FIG. 3) provided inside the humanoid robot 20a-d.
- Each sensor mounting member 30a-d is connected to each humanoid robot 20a-d, and therefore moves in accordance with the movement of the humanoid robot 20a-d.
- Here, automatic learning refers to automatically generating a trained model and making judgments and analyses using that trained model.
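As a purely illustrative sketch (the disclosure does not specify any particular algorithm, and all function names below are hypothetical), automatic learning in this sense can be thought of as building a model from recorded sensor frames and then judging new observations against it:

```python
# Hypothetical sketch: "automatic learning" = build a model from recorded
# sensor frames, then use that model to judge new observations.
from statistics import mean

def learn_model(frames):
    """Create a trivial 'learned model': the mean of each sensor channel."""
    n_channels = len(frames[0])
    return [mean(f[i] for f in frames) for i in range(n_channels)]

def judge(model, frame, tolerance=1.0):
    """Judge whether a new frame is consistent with the learned model."""
    return all(abs(v - m) <= tolerance for v, m in zip(frame, model))

# Recorded sensor frames (e.g., a distance and a joint angle per time step).
recorded = [[1.0, 0.5], [1.2, 0.7], [0.8, 0.6]]
model = learn_model(recorded)
print(judge(model, [1.1, 0.6]))  # True: consistent with the recorded motion
```

A real system would of course use far richer models, but the two phases — model generation, then judgment with the generated model — match the definition above.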
- the humanoid robot 20 functioning as a working robot is made to perform the same movements as those of the worker 400 upon receiving a command from the management control device 60 or an instruction from the information processing device 25 provided inside the humanoid robots 20a-d.
- the process of making the humanoid robot 20 perform the same movements as those of the worker 400 is also performed by automatic learning.
- the humanoid robot 20 performs the automatically learned task, for example, about 300 times, and the process is repeated until the humanoid robot 20 can execute the task with the same actions, route, speed, etc. as the automatically learned task.
- Next, the speed of the automatically learned task is increased, for example, to 20 times the original speed, and the process is repeated until the task can be executed with the same actions, route, speed, etc. as the automatically learned task.
- the learning results are reflected in multiple work robots (humanoid robots 20), and the above process is repeated until the movements of the multiple work robots are adjusted so that they are consistent from start to stop. This makes it possible to reduce the time and cost required for analyzing the movements of workers and programming them.
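The rehearsal loop described above — repeat the task on the order of 300 times until the executed motion matches the learned motion, then rerun at a higher playback speed — can be sketched as a toy model. This is illustrative only, not the disclosed control software:

```python
# Hypothetical sketch of the tuning loop: repeat the learned task until the
# robot's executed motion matches the learned motion. After convergence the
# same loop would be rerun at a higher playback speed (e.g., 20x).
def matches(executed, learned, tol=0.05):
    return all(abs(e - l) <= tol for e, l in zip(executed, learned))

def rehearse(robot_step, learned, max_trials=300):
    """Run up to ~300 trials; return the trial count needed to match."""
    for trial in range(1, max_trials + 1):
        if matches(robot_step(trial), learned):
            return trial
    return None  # never converged within the trial budget

# Toy robot whose trajectory error halves on every trial.
learned = [0.2, 0.4, 0.6]
robot = lambda t: [v + 0.5 ** t for v in learned]
trials_needed = rehearse(robot, learned)
print(trials_needed)  # 5: error 1/2**5 = 0.03125 first drops below tol
```

The 300-trial cap mirrors the "about 300 times" figure in the text; a faster-converging robot simply exits the loop earlier.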
- FIG. 2 is a diagram showing an example of the work robot and sensor mounting members shown in FIG. 1.
- the configurations of the humanoid robots 20a-d and the sensor mounting members 30a-d will be described with reference to FIG. 2.
- the humanoid robot 20 includes a robot body 21, a robot movement mechanism 22, a robot sensor 23, a robot imaging device 24 included in the robot sensor 23, an information processing device 25 (see FIG. 3), and an arm 26.
- the humanoid robot 20 can move using a robot movement mechanism 22 provided below the robot body 21, and moves to the vicinity of the work line 201 in the workplace 200 upon receiving commands from outside the humanoid robot 20, such as a management control device, or by referring to a program recorded in an information processing device 25.
- the robot body 21 is provided with a robot movement mechanism 22 below, an arm 26 above, and a robot sensor 23 above the arm 26.
- An information processing device 25 is also provided inside the robot body 21.
- the robot movement mechanism 22 may be of any configuration, for example it may be equipped with a rotating body driven by a motor, or it may have legs shaped to resemble human legs.
- the robot sensor 23 is provided above the humanoid robot 20, preferably at the top of the robot body 21, in other words, near the head of the humanoid robot, and detects the worker 400.
- the robot sensor 23 also sequentially acquires information indicating at least the distance and angle between the arm 26 and an object around the humanoid robot 20 on which the humanoid robot 20 is working.
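As a simple illustration of the kind of distance-and-angle information mentioned above (the coordinates and function are hypothetical; the actual sensor processing is not disclosed):

```python
import math

def distance_and_angle(arm_tip, target):
    """Distance and planar angle (degrees) from the arm tip to a work
    object, both given as (x, y) coordinates in the workplace frame."""
    dx = target[0] - arm_tip[0]
    dy = target[1] - arm_tip[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

d, a = distance_and_angle((0.0, 0.0), (3.0, 4.0))
print(round(d, 2), round(a, 1))  # 5.0 53.1
```

Acquiring such pairs sequentially, frame by frame, yields the time series of relative positions used later for learning.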
- Examples of the robot sensor 23 include the highest performance camera, a thermo camera, a high-pixel, telephoto, ultra-wide-angle, 360-degree, high-performance camera, radar, solid-state LiDAR, LiDAR, a multi-color laser coaxial displacement meter, vision recognition, or various other sensor groups. These are also examples of the robot imaging device 24.
- Other examples of the robot sensor 23 include a vibration meter, a hardness meter, and sensors for faint sounds, ultrasound, vibration, infrared light, ultraviolet light, electromagnetic waves, temperature, and humidity, as well as spot AI weather forecasting, high-precision multi-channel GPS, low-altitude satellite information, and long-tail incident AI data.
- Examples of sensor information acquired from the robot sensor 23 include images, distance, vibration, heat, smell, color, sound, ultrasound, ultraviolet light, infrared light, etc., and preferably the image and distance information is acquired by the robot imaging device 24.
- the robot sensor 23 (robot imaging device 24) performs this detection every nanosecond, for example.
- The sensor information is used, for example, for motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movement and actions of the worker 400 in the workplace 200, and analysis of cornering, speed, etc.
- the arm 26 is attached to the top of the robot body 21 so that it can rotate freely.
- a gripping part (not shown) for grasping an object is attached to the tip of the arm 26.
- the sensor mounting member 30 is connected to the humanoid robot 20 via the gripping part.
- the humanoid robot 20 may further include a sensor in the center of the robot body 21, for example in the torso of the humanoid robot.
- the sensor is located at a different height than the robot sensor 23 located near the top of the robot body. The different height allows the sensor to detect the movements of the worker 400 from different angles.
- the sensor mounting member 30 includes a mounting member main body 31, a mounting member movement mechanism 32, a mounting member sensor 33, and a mounting member imaging device 34.
- the sensor mounting member 30 can be moved by the mounting member movement mechanism 32 provided below the mounting member main body 31.
- the mounting member body 31 is, for example, a rod- or cane-shaped member, and its material is not particularly limited.
- the length of the mounting member body 31 is longer than the height (back height) of the humanoid robot 20.
- the mounting member body 31 is provided with a mounting member movement mechanism 32 below, preferably at the lower end, and a mounting member sensor 33 above, preferably at the upper end, of the mounting member body 31.
- the mounting member moving mechanism 32 is configured with a rotating body such as a caster, and assists the sensor mounting member 30 in moving in accordance with the movement of the humanoid robot 20. Note that in this embodiment, it is not assumed that the sensor mounting member 30 moves autonomously, but a mounting member control unit (not shown) that issues commands to the mounting member moving mechanism 32 may be provided, and the mounting member moving mechanism 32 may be moved based on a signal from the mounting member control unit.
- the mounting member sensor 33 is provided above the mounting member main body 31 and detects the worker 400.
- the mounting member sensor 33 also sequentially acquires information that at least indicates the distance and angle between the arm 26 and an object on which the humanoid robot 20 is working, which is located in the vicinity of the humanoid robot 20.
- An example of the mounting member sensor 33 is similar to the robot sensor 23, and an example of the mounting member imaging device 34 is similar to the robot imaging device 24.
- an example of the acquired sensor information is similar to the robot sensor 23, and an example of the detection timing of the sensor information is also similar to the robot sensor 23.
- the mounting member imaging device 34 is included in the mounting member sensor 33.
- The mounting member sensor 33, which includes the mounting member imaging device 34, is positioned higher than the height (back height) of the humanoid robot 20. This allows the mounting member sensor 33 to detect the movements of the worker from a higher position than the robot sensor 23.
- the mounting member sensor 33 is provided on the mounting member body 31 so that its sensing area 330 is oriented in a direction that detects the movements of the worker 400.
- the robot sensor 23 is also provided on the robot body 21 so that its sensing area (not shown) is oriented in a direction that detects the movements of the worker 400.
- the four humanoid robots 20a-d and the four sensor mounting members 30a-d are each positioned to detect the movements of the worker 400 from different positions, heights, and/or directions.
- the number of robots for recording the movements of the worker 400 is not limited to four, and may be one to three, or five or more.
- the sensor mounting member 30d is provided with an extension member 35d and an additional mounting member sensor 33d2 to sense the hand of the worker 400.
- the extension member 35d is a rod-shaped member and is arranged to extend horizontally from the vicinity of the mounting member sensor 33d1.
- the mounting member sensor 33d2 is provided at the tip of the extension member 35d, and the mounting member sensor 33d2 senses the worker 400 from above.
- the mounting member sensor 33d2 makes it easier to detect the movements of the worker 400.
- In this case, the multiple humanoid robots have eight sensors in total and the multiple sensor mounting members have five sensors in total, making it possible to detect the movements of the worker 400 with a total of 13 sensors.
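The totals above follow from simple arithmetic: each of the four humanoid robots carries two sensors (the robot sensor 23 near the head and the torso sensor), and the four sensor mounting members carry one sensor each plus the additional sensor 33d2:

```python
robots = 4
sensors_per_robot = 2               # head sensor 23 plus a torso sensor
mounting_members = 4
extra_member_sensors = 1            # additional sensor 33d2 on member 30d

robot_sensors = robots * sensors_per_robot                 # 4 * 2 = 8
member_sensors = mounting_members + extra_member_sensors   # 4 + 1 = 5
print(robot_sensors, member_sensors, robot_sensors + member_sensors)  # 8 5 13
```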
- FIG. 3 is a block diagram showing an example of the functional configuration of the work robot adjustment system 100.
- the work robot adjustment system 100 is composed of a humanoid robot 20, a sensor mounting member 30, and a management control device 60.
- The humanoid robot 20 is connected to the robot transmission/reception unit 68 of the management control device 60 and to the sensor mounting member 30 via wireless or wired communication, and receives commands from the management control device 60 and acquires detection results from the sensor mounting member 30.
- the sensor mounting member 30 may be configured to be able to communicate with the management control device 60. Also, multiple humanoid robots 20, rather than just one, may be connected to the management control device 60.
- the humanoid robot 20 includes a robot sensor 23, a robot imaging device 24 included in the robot sensor 23, and an information processing device 25.
- the information processing device 25 includes a CPU (Central Processing Unit) 1212, a RAM (Random Access Memory) 1214, and a graphics controller 1216, which are interconnected by a host controller 1210.
- the information processing device 25 also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
- the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
- the storage device 1224 may be a hard disk drive, a solid state drive, etc.
- the information processing device 25 also includes input/output units such as a ROM (Read Only Memory) 1230 and a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
- the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
- the graphics controller 1216 acquires image data generated by the CPU 1212 into a frame buffer or the like provided in the RAM 1214 or into itself, and causes the image data to be displayed on the display device 1218.
- the communication interface 1222 communicates with other electronic devices via a network.
- the storage device 1224 stores programs and data used by the CPU 1212 in the information processing device 25.
- the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
- the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
- ROM 1230 stores therein a boot program or the like to be executed by the information processing device 25 upon activation, and/or a program that depends on the hardware of the information processing device 25.
- the input/output chip 1240 may also connect various input/output units to the input/output controller 1220 via a USB port, a parallel port, a serial port, a keyboard port, a mouse port, etc.
- the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
- the programs are read from the computer-readable storage medium, installed in the storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by the CPU 1212.
- the information processing described in these programs is read by the information processing device 25, and brings about cooperation between the programs and the various types of hardware resources described above.
- An apparatus or method may be configured by realizing the operation or processing of information in accordance with the use of the information processing device 25.
- the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
- the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214, the storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
- the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
- CPU 1212 may perform various types of processing on data read from RAM 1214, including various types of operations, information processing, conditional decisions, conditional branches, unconditional branches, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequences of the programs, and writes back the results to RAM 1214.
- CPU 1212 may also search for information in files, databases, etc. in the recording medium.
- the above-described programs or software modules may be stored in a computer-readable storage medium on the information processing device 25 or in the vicinity of the information processing device 25.
- a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the programs to the information processing device 25 via the network.
- The blocks in the flowcharts and diagrams in this embodiment may represent stages of a process in which an operation is performed or "parts" of a device responsible for performing the operation. Particular stages and "parts" may be implemented by dedicated circuitry, programmable circuitry provided with computer-readable instructions stored on a computer-readable storage medium, and/or a processor provided with computer-readable instructions stored on a computer-readable storage medium.
- the dedicated circuitry may include digital and/or analog hardware circuitry and may include integrated circuits (ICs) and/or discrete circuits.
- the programmable circuitry may include reconfigurable hardware circuitry including AND, OR, XOR, NAND, NOR, and other logical operations, flip-flops, registers, and memory elements, such as, for example, field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), and the like.
- a computer-readable storage medium may include any tangible device capable of storing instructions that are executed by a suitable device, such that a computer-readable storage medium having instructions stored thereon comprises an article of manufacture that includes instructions that can be executed to create means for performing the operations specified in the flowchart or block diagram.
- Examples of computer-readable storage media may include electronic storage media, magnetic storage media, optical storage media, electromagnetic storage media, semiconductor storage media, and the like.
- Computer-readable storage media may include floppy disks, diskettes, hard disks, random access memories (RAMs), read-only memories (ROMs), erasable programmable read-only memories (EPROMs or flash memories), electrically erasable programmable read-only memories (EEPROMs), static random access memories (SRAMs), compact disk read-only memories (CD-ROMs), digital versatile disks (DVDs), Blu-ray disks, memory sticks, integrated circuit cards, and the like.
- the computer readable instructions may include either assembler instructions, instruction set architecture (ISA) instructions, machine instructions, machine-dependent instructions, microcode, firmware instructions, state setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk (registered trademark), JAVA (registered trademark), C++, etc., and conventional procedural programming languages such as the "C" programming language or similar programming languages.
- The computer-readable instructions may be provided to a processor of a general-purpose computer, special-purpose computer, or other programmable data processing apparatus, or to programmable circuitry, either locally or over a local area network (LAN) or a wide area network (WAN) such as the Internet, so that the processor or the programmable circuitry executes the computer-readable instructions to create means for performing the operations specified in the flowcharts or block diagrams.
- processors include computer processors, processing units, microprocessors, digital signal processors, controllers, microcontrollers, etc.
- the sensor mounting member 30 is connected to the arm 26 of the humanoid robot 20 and moves in accordance with the humanoid robot 20.
- the mounting member sensor 33 (mounting member imaging device 34) detects information about the object and transmits the information to the information processing device 25 via the mounting member communication unit (not shown).
- The management control device 60 is a control device that issues commands to the humanoid robot 20 to realize the working robot adjustment method.
- the management control device 60 may also acquire sensor information stored in the storage device 1224.
- the management control device 60 is composed of a CPU 60A, RAM 60B, ROM 60C, an input/output unit (I/O) 60D, a bus 60E such as a data bus or control bus that connects these, and a robot transmission/reception unit 68.
- a recording medium 62 is connected to the I/O 60D.
- The robot transmission/reception unit 68 is connected to the I/O 60D and transmits and receives operation control information, including work information, to and from the control system of the humanoid robot 20.
- FIG. 4 is an example of a flowchart showing the process of the working robot adjustment method of this embodiment.
- the information processing device 25 instructs the humanoid robot 20, which functions as a mobile robot, to move to the work site (work environment) 200 (step S101).
- the movement is caused by the operation of the robot movement mechanism 22 of the humanoid robot 20.
- the sensor mounting member 30 is connected to the humanoid robot 20, it moves in conjunction with the movement of the humanoid robot 20.
- The robot sensor 23 (robot imaging device 24) and the mounting member sensor 33 (mounting member imaging device 34) are arranged so that their sensing areas (imaging areas) detect the worker 400 from different directions.
- the arrangement of such humanoid robots 20 and sensor mounting members 30 is performed, for example, by recording a floor plan of the workplace 200 in advance in the storage device 1224 and/or recording medium 62, and associating the positions of each humanoid robot 20, etc. with the recorded floor plan.
- the arrangement of the humanoid robots 20, etc. is based on positions optimized through machine learning.
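One conceivable way to optimize such placements (purely illustrative; the disclosure does not specify the learning method) is to choose, from candidate positions on the recorded floor plan, those that view the worker from maximally different directions:

```python
import math

def pick_positions(candidates, worker, k=4):
    """Greedily pick k positions whose viewing angles toward the worker
    are as spread out as possible (angle differences are not wrapped at
    +/-pi; adequate for a small demo)."""
    def angle(p):
        return math.atan2(p[1] - worker[1], p[0] - worker[0])
    chosen = [candidates[0]]
    while len(chosen) < k:
        best = max(
            (c for c in candidates if c not in chosen),
            key=lambda c: min(abs(angle(c) - angle(s)) for s in chosen),
        )
        chosen.append(best)
    return chosen

worker = (0.0, 0.0)
candidates = [(1, 0), (0, 1), (-1, 0), (0, -1), (1, 1)]
print(pick_positions(candidates, worker))  # [(1, 0), (-1, 0), (0, 1), (0, -1)]
```

The greedy criterion maximizes the smallest angular separation among the chosen viewpoints, which is one simple stand-in for "positions optimized through machine learning."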
- the movements of the worker 400 on the work line 201 are detected by the multiple sensors 23a-d, 33a-d (multiple image capture devices 24a-d, 34a-d) (step S102).
- the information processing device 25 of each humanoid robot 20a-d acquires sensor information detected by the various sensors.
- the acquired sensor information is stored in the storage device 1224.
- The information processing device 25 learns the movements of the worker 400 based on the sensor information stored, in other words recorded, in the storage device 1224 (step S103). In this learning, motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movement and actions of the worker 400 in the workplace 200, cornering, speed, etc. are analyzed, and the optimal movements of the humanoid robot 20 are learned by automatic learning.
- the information processing device 25 or the management control device 60 instructs the humanoid robot 20, which functions as a working robot, to perform the same movements as the worker 400 based on the automatic learning of step S103 (step S104).
- The humanoid robot 20 is made to repeatedly perform the movements based on the automatic learning of step S103 so that the movements of the robot's arm 26 and/or robot movement mechanism 22 become the same as the work (movements) of the worker 400 obtained by that automatic learning.
- This movement is made to be performed several hundred times, for example, about 300 times, and this process is repeated until the humanoid robot performs the same movements, route (movement), speed, etc. as the work obtained by the automatic learning.
- the speed of the automatically learned task (movement) is increased, for example to 20 times faster, and the humanoid robot 20 is made to perform the movement based on the automatic learning in step S103. Again, the movement is repeated several hundred times, for example about 300 times, until the humanoid robot 20 can perform the same movement, route (movement), and speed as the task obtained by automatic learning.
- the information processing device 25 or the management control device 60 performs a process of adjusting the movement of the humanoid robot (working robot) so that it coincides with the movement of the worker 400 (step S105). Specifically, the information processing device 25 or the management control device 60 instructs the multiple work robots, in this embodiment the multiple humanoid robots 20, to perform the same movement as the worker 400 based on the result obtained in step S104. This process is repeated until the movements of the multiple humanoid robots 20 are synchronized. When the movements of the multiple humanoid robots 20 are synchronized, the series of steps in the working robot adjustment method is completed.
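Steps S101 to S105 can be summarized in a simplified, hypothetical sketch; the robot class and its convergence behavior below are toy stand-ins for the actual system:

```python
# Hypothetical, simplified outline of steps S101-S105 of FIG. 4.
class ToyRobot:
    def __init__(self):
        self.pose = 0.0
    def sense(self, worker_motion):
        return list(worker_motion)            # S102: record worker frames
    def replay(self, model):
        # Each rehearsal halves the remaining error (toy convergence).
        self.pose += (model[-1] - self.pose) / 2
        return self.pose

def adjust(robots, worker_motion, tol=0.01):
    # S101: the mobile robots are assumed to have already moved into place.
    frames = [r.sense(worker_motion) for r in robots]          # S102
    model = [sum(c) / len(c) for c in zip(*frames)]            # S103: learn
    for _ in range(300):                                       # S104: rehearse
        poses = [r.replay(model) for r in robots]
        err = max(abs(p - model[-1]) for p in poses)
        spread = max(poses) - min(poses)
        if err <= tol and spread <= tol:                       # S105: all robots
            return True                                        #   move in sync
    return False

print(adjust([ToyRobot(), ToyRobot(), ToyRobot()], [0.2, 0.4, 0.6]))  # True
```

The final condition checks both accuracy against the learned motion (err) and consistency across the multiple robots (spread), mirroring the synchronization requirement of step S105.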
- the humanoid robot 20 which is a mobile robot, is equipped with sensors, and the humanoid robot 20 automatically moves to an appropriate position according to a pre-stored program or machine learning results. Therefore, compared to the case where fixed sensors are placed in the workplace 200, there is no need to rearrange sensors or increase the number of sensors depending on the workplace, and the sensing environment for the worker 400 can be prepared in a short period of time and at low cost.
- the humanoid robot 20 is also connected to a sensor mounting member 30 on which a sensor (imaging device) is arranged. Therefore, when the humanoid robot 20 moves, multiple sensors can be moved simultaneously.
- the working robot adjustment method also employs a configuration in which a mobile robot (humanoid robot 20) equipped with a sensor senses the work (movements) of the worker 400. This allows the mobile robot to move in accordance with the movement of the worker 400, making it possible to perform sensing at a position suitable for the movement of the worker 400.
- the sensor mounting member 30 (imaging device 40 for mounting member) is placed at a position higher than the height (back height) of the humanoid robot. Therefore, the movements of the worker 400 can be sensed from a higher, bird's-eye position.
- the working robot adjustment method employs a configuration in which multiple humanoid robots 20a-d and multiple sensor mounting members 30a-d are each placed in different positions, and the movements of the worker 400 are sensed by these sensors. This makes it possible to sense one task performed by the worker 400 from different positions, and allows a large amount of data required for automatic learning to be acquired at once. As a result, it is possible to reduce the time and cost required for analyzing the movements of the worker 400 and programming them.
- the movements of the worker 400 are automatically learned based on sensor information, and in the automatic learning, motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movements and motions of the worker 400 in the workplace 200, cornering, speed, etc. are analyzed. This makes it possible to analyze the movements of the worker 400 from multiple perspectives at once, reducing the time and cost required for analyzing and programming the movements of the worker 400.
- the working robot adjustment method also employs a configuration in which the working robot (humanoid robot 20) is operated based on the learning, and the process is repeated until the automatically learned work and the work robot's movement have the same movement, route (movement), speed, etc. This reduces the time required to check whether the work robot's movement accurately reproduces the work of the worker 400.
- the same robot (humanoid robot 20) is used as both the mobile robot and the working robot. This eliminates the need to manufacture the robot in separate processes, reducing the cost and time required to adjust the working robot.
- the working robot adjustment method also employs a configuration that adjusts the movements of the working robot (humanoid robot 20) so that they match those of the worker. This ensures that the movements of the working robot accurately match the work of the worker 400.
- the working robot adjustment method also employs a configuration in which the results of automatic learning are applied to multiple working robots (humanoid robots 20) and the process is repeated until the movements of the multiple working robots become identical. This further ensures that the movements of the working robots accurately match the work of the worker 400.
- FIG. 5 shows an example of a modified sensor mounting member.
- sensor mounting member 30' is provided with three or more (eight in FIG. 5) sensors 33a'-33g' for the sensor mounting member.
- the sensor mounting member 30' comprises a mounting member main body 30', a mounting member moving mechanism 32', a plurality of sensor mounting member sensors 33a'-33g' (a plurality of imaging devices), and a plurality of extension members 35a', 35b'. Note that in FIG. 5, to avoid cluttering the drawing, only the extension members 35a', 35b' are indicated by reference numerals.
- the sensor mounting member 30' has multiple extension members that give it an appearance resembling the legs of a spider.
- the sensor mounting member 30' has multiple sensor mounting member sensors 33a'-33g' arranged in different directions and/or at different heights.
- the sensor mounting member 30' allows multiple sensors (imaging devices) to be mounted, and multiple sensors (imaging devices) can be moved at once via the mounting member movement mechanism 32'. This makes it possible to sense the work of workers from various heights, positions, and/or directions, even in a workplace where the space is too small to mount multiple sensor mounting members.
- in the above embodiment, the mobile robot (humanoid robot) and the sensor mounting member are described as being multiple. However, this is not limited to the above, and for example, a single mobile robot equipped with a sensor may be moved around to record the work of workers.
- the learning in S103 has been described as being performed automatically.
- the learning does not necessarily have to be automatic, and may be performed by other known machine learning methods, such as deep learning, unsupervised/supervised learning, reinforcement learning, etc.
- the mobile robot and the work robot are described as being the same humanoid robot. However, the mobile robot and the work robot may be different robots.
- FIG. 6A is a diagram showing an example of a system configuration of a sensing system according to embodiment 2 of the present disclosure.
- the sensing system includes a first humanoid robot 2020a and a second humanoid robot 2020b that function as mobile robots. Note that the number of humanoid robots is not limited to two.
- Each humanoid robot 2020a, 2020b moves to the vicinity of a worker 400 working on the work line 201 of the workplace 200 upon receiving instructions from a management control device 2060 (see FIG. 8) described later, or upon instructions from each information processing device 2025a, 2025b (see FIG. 8) provided in each humanoid robot 2020a, 2020b.
- the sensing system senses a predetermined movement of the worker 400 using a first sensor 2023a (first imaging device 2024a) provided in the first humanoid robot 2020a and a second sensor 2023b (second imaging device 2024b) provided in the second humanoid robot 2020b.
- the second sensor 2023b (second imaging device 2024b) senses the predetermined movement of the worker 400 from a position different from that of the first sensor 2023a (first imaging device 2024a).
- FIG. 6B is a diagram showing an example of the mobile robot shown in FIG. 6A.
- the humanoid robot 2020 functioning as a mobile robot includes a robot body 2021, a robot movement mechanism 2022, a robot sensor 2023, a robot imaging device 2024 included in the robot sensor 2023, an information processing device 2025, and a robot arm 2026.
- the humanoid robot 2020 can move using a robot movement mechanism 2022 provided below the robot body 2021, and moves to the vicinity of the work line 201 in the workplace 200 upon receiving instructions from outside the humanoid robot 2020, such as a management control device 2060, or by referring to a program stored in an information processing device 2025.
- the robot main body 2021 includes a robot torso 2211 and a robot head 2212.
- the robot torso 2211 and the robot head 2212 form a first drive mechanism, and are capable of changing the sensing area 2230 (imaging area 2240) of the robot sensor 2023 (robot imaging device 2024).
- the configuration of the drive mechanism is not particularly limited, and may be configured such that, for example, the robot head 2212 rotates a predetermined angle relative to the robot torso 2211, or the robot torso 2211 rotates a predetermined angle relative to the robot movement mechanism 2022, by a servo motor (not shown).
- a robot movement mechanism 2022 is provided below the robot torso 2211, a robot arm 2026 is provided on each side of the robot torso 2211, and a robot sensor 2023 is provided in the robot head 2212.
- An information processing device 2025 is also provided inside the robot main body 2021.
- the robot movement mechanism 2022 may be of any configuration, for example it may be provided with a rotating body driven by a motor, or may have legs that resemble the shape of a human leg. As an example, if the robot movement mechanism 2022 is configured to resemble the shape of a human leg, a servo motor is provided at the location that corresponds to a human joint, and the movement mechanism is formed by rotating it by a predetermined angle.
- the robot sensor 2023 is preferably provided in the robot head 2212 and senses the worker 400.
- the robot sensor 2023 also sequentially acquires information indicating at least the distance and angle between the robot arm 2026 and objects around the humanoid robot 2020 on which the humanoid robot 2020 works.
- Examples of the robot sensor 2023 include a high-performance camera, a thermal camera, a high-pixel/telephoto/ultra-wide-angle/360-degree camera, radar, solid-state LiDAR, LiDAR, a multi-color laser coaxial displacement meter, vision recognition, or a variety of other sensor groups. These are also examples of the robot imaging device 2024.
- robot sensor 2023 examples include a vibration meter, a hardness meter, a micro vibration meter, an ultrasonic measuring device, a vibration measuring device, an infrared measuring device, an ultraviolet measuring device, an electromagnetic wave measuring device, a thermometer, a hygrometer, a spot AI weather forecast, a high-precision multi-channel GPS, low-altitude satellite information, or long-tail incident AI data.
- Examples of sensor information acquired from the robot sensor 2023 include images, distance, vibration, heat, smell, color, sound, ultrasound, radio waves, ultraviolet light, infrared light, humidity, etc., and preferably the image and distance information is acquired by the robot imaging device 2024.
- the robot sensor 2023 (robot imaging device 2024) performs this sensing every nanosecond, for example.
- the sensor information is used, for example, for motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movements and motions of the worker 400 in the workplace 200, analysis of cornering, speed, etc.
- the robot arm 2026 comprises a right arm 2261 and a left arm 2262.
- the right arm 2261 comprises a right gripping support part 2263 and a right gripping part 2265
- the left arm 2262 comprises a left gripping support part 2264 and a left gripping part 2266.
- the right gripping support part 2263 is a mechanism for supporting the right gripping part 2265
- the left gripping support part 2264 is a mechanism for supporting the left gripping part 2266, and may be shaped like a human arm, for example.
- the gripping parts 2265 and 2266 are mechanisms for gripping objects such as parts used in the work, and may be shaped like a human hand, for example.
- the robot arm 2026 constitutes a second drive mechanism.
- the configuration of the drive mechanism is not particularly limited, and for example, if the robot arm 2026 is to resemble a human shape, a configuration may be adopted in which servo motors are provided at each joint location, such as a location corresponding to a human shoulder, a location corresponding to an elbow, a location corresponding to a wrist, a location corresponding to a finger joint, and the like, and rotated at a predetermined angle.
- the humanoid robot 2020 may further be provided with a sensor, for example, on the robot torso 2211 (see FIG. 13B).
- the sensor is located at a different height than the robot sensor 2023 located on the robot head 2212. The different height allows the sensor to sense the movements of the worker 400 from different angles.
- the sensing system operates the movement mechanisms and drive mechanisms of each humanoid robot 2020a, 2020b so that the second sensor 2023b (second imaging device 2024b) senses a specified movement of the worker 400 from a position different from that of the first sensor 2023a (first imaging device 2024a) and so that the sensing areas 2230a, 2230b (imaging areas 2240a, 2240b) of each sensor each sense a different specified part of the worker 400.
- the sensing areas 2230a, 2230b (imaging areas 2240a, 2240b) of each sensor do not need to be entirely different, and may be set so that at least a portion of the specific area is different.
- specific areas include the worker's neck, arm, and wrist.
- the specific area may be recognized by each sensor using a known image recognition technology, or the specific area may be recognized by learning using the learning unit 2663 (see FIG. 9).
- the movement mechanism 2022 and drive mechanism of each humanoid robot are operated so that the sensing area 2230a (imaging area 2240a) of the first sensor 2023a (first imaging device 2024a) provided on the first humanoid robot 2020a senses the left arm of the worker 400, and the sensing area 2230b (imaging area 2240b) of the second sensor 2023b (second imaging device 2024b) provided on the second humanoid robot 2020b senses the right arm of the worker 400.
- the sensing system determines whether or not a specific part that moves when the worker 400 performs a specific operation is sensed, based on the first information acquired by the first sensor 2023a (first imaging device 2024a) and the second information acquired by the second sensor 2023b (second imaging device 2024b).
- the specific operations are diverse and include, for example, assembling and moving parts, painting a product, and the worker's own movement.
- the sensing system activates the first movement mechanism 2022a (see FIG. 8) of the first humanoid robot 2020a and/or the second movement mechanism 2022b (see FIG. 8) of the second humanoid robot 2020b so that the specific part is sensed.
- FIG. 7 shows an example of a mobile robot moving in the sensing system of this embodiment.
- the second movement mechanism 2022b of the second humanoid robot 2020b, whose second sensor 2023b (second imaging device 2024b) was sensing the right arm, is also operated.
- the first information and the second information are stored, and the sensing system learns the predetermined actions of the worker based on the stored first information and second information.
- This learning is performed by automatic learning, which is learning that automatically creates a learned model and automatically performs judgment/analysis using the learned model, for example.
- the sensing system refers to the learning results, as well as the work manual information and/or schedule information for the worker 400, to generate operation information that gives operation instructions to a work robot performing the work of the worker 400.
- the work manual information includes, for example, the name and content of each work item, the order of the work items, and information on the standard work time required for each work item.
- the schedule information includes, for example, information indicating the work time and start/end times for the entire work, information indicating the work time and start/end times for each work item, and information indicating the worker for each work item.
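As a sketch of how the work manual information and schedule information described above might be held in software, the following illustrative data shapes could be used. All class and field names here are assumptions for illustration, not part of this disclosure.

```python
# Illustrative data shapes for the work manual and schedule information.
# Field names (name, content, order, standard_time_s, ...) are hypothetical.
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str                # name of the work item
    content: str             # content of the work item
    order: int               # position in the order of work items
    standard_time_s: float   # standard work time required for the item

@dataclass
class ScheduleEntry:
    item_name: str           # which work item this entry schedules
    worker: str              # worker assigned to the item
    start: str               # start time, e.g. "09:00"
    end: str                 # end time, e.g. "09:02"

manual = [WorkItem("attach cover", "fasten 4 screws", 1, 120.0),
          WorkItem("inspect", "visual check", 2, 60.0)]
schedule = [ScheduleEntry("attach cover", "worker400", "09:00", "09:02")]
```

The motion information generating unit could then look up the standard work time and assigned worker for each item when generating motion instructions.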
- FIG. 8 is a block diagram showing an example of the configuration and functions of the sensing system 2100 of this embodiment.
- the sensing system 2100 includes a first humanoid robot 2020a, a second humanoid robot 2020b, and a management control device 2060.
- the first humanoid robot 2020a and the second humanoid robot 2020b are each connected to the communication unit 2064 of the management control device 2060 via wireless or wired communication, and receive instructions from the management control device 2060 and transmit information acquired by each sensor.
- the first humanoid robot 2020a and the second humanoid robot 2020b may also be connected to each other via wireless or wired communication, and transmit and receive information acquired by each sensor and instructions.
- the first humanoid robot 2020a includes a first moving mechanism 2022a, a first sensor 2023a which is a sensor for the robot, a first imaging device 2024a which is an imaging device for the robot included in the first sensor 2023a, a first information processing device 2025a, a first driving mechanism, and a second driving mechanism.
- the second humanoid robot 2020b also includes a second moving mechanism 2022b, a second sensor 2023b which is a sensor for the robot, a second imaging device 2024b which is an imaging device for the robot included in the second sensor 2023b, a second information processing device 2025b, and two driving mechanisms.
- the first humanoid robot 2020a and the second humanoid robot 2020b have the same configuration.
- the first information processing device 2025a includes a CPU (Central Processing Unit) 1212, a RAM (Random Access Memory) 1214, and a graphics controller 1216, which are interconnected by a host controller 1210.
- the first information processing device 2025a also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
- the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
- the storage device 1224 may be a hard disk drive, a solid state drive, etc.
- the first information processing device 2025a also includes input/output units such as a ROM (Read Only Memory) 1230 and a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
- the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
- the graphics controller 1216 acquires image data generated by the CPU 1212 into a frame buffer or the like provided in the RAM 1214 or into itself, and causes the image data to be displayed on the display device 1218.
- the communication interface 1222 communicates with other electronic devices via a network.
- the storage device 1224 stores programs and data used by the CPU 1212 in the first information processing device 2025a.
- the storage device 1224 may also store first information and second information.
- the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
- the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
- the ROM 1230 stores therein a boot program or the like to be executed by the first information processing device 2025a upon activation, and/or a program that depends on the hardware of the first information processing device 2025a.
- the input/output chip 1240 may also connect various input/output units to the input/output controller 1220 via a USB port, a parallel port, a serial port, a keyboard port, a mouse port, etc.
- the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
- the programs are read from the computer-readable storage medium, installed in the storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by the CPU 1212.
- the information processing described in these programs is read by the first information processing device 2025a, and brings about cooperation between the programs and the various types of hardware resources described above.
- An apparatus or method may be configured by realizing the operation or processing of information in accordance with the use of the first information processing device 2025a.
- the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
- the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214, the storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
- the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
- CPU 1212 may perform various types of processing on data read from RAM 1214, including various types of operations, information processing, conditional decisions, conditional branches, unconditional branches, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequences of the programs, and writes back the results to RAM 1214.
- CPU 1212 may also search for information in files, databases, etc. in the recording medium.
- the above-described program or software module may be stored in a computer-readable storage medium on the first information processing device 2025a or in the vicinity of the first information processing device 2025a.
- a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the first information processing device 2025a via the network.
- the management control device 2060 is a control device that issues instructions to the humanoid robots 2020a and 2020b in order to realize the sensing system 2100.
- the management control device 2060 also acquires the sensor information (first information and second information) stored in the storage device 1224.
- the management control device 2060 is composed of a CPU 2060A, a RAM 2060B, a ROM 2060C, an input/output unit (I/O) 2060D, a bus 2060E such as a data bus or a control bus that connects these, and a communication unit 2064.
- a storage medium 2062 is connected to the I/O 2060D.
- a communication unit 2064 is connected to the I/O 2060D, and transmits and receives sensor information, work manual information, schedule information, etc. to and from the control system of the humanoid robot 2020.
- FIG. 9 is a block diagram showing an example of the functions of the management control device 2060 in the sensing system of this embodiment.
- the management control device 2060 includes a storage medium 2062, a communication unit 2064, and a processing unit 2066.
- the storage medium 2062 includes, for example, at least one of a semiconductor storage device, a magnetic tape device, a magnetic disk device, or an optical disk device.
- the storage medium 2062 stores driver programs, operating system programs, application programs, data, etc., used for processing in the processing unit 2066.
- the storage medium 2062 stores first information and second information.
- the storage medium 2062 also stores work manual information and/or schedule information for the worker 400.
- the communication unit 2064 has a wireless communication interface circuit such as Wi-Fi (registered trademark) and/or a wired communication interface circuit such as Ethernet (registered trademark).
- the communication unit 2064 transmits and receives various information to and from the humanoid robots 2020a and 2020b through the interface circuits.
- the processing unit 2066 has one or more processors and their peripheral circuits.
- the processing unit 2066 centrally controls the overall operation of the sensing system 2100, and is, for example, a CPU.
- the processing unit 2066 executes processing by referencing programs (driver programs, operating system programs, application programs, etc.) stored in the storage medium 2062.
- the processing unit 2066 can also execute multiple programs (application programs, etc.) in parallel.
- the processing unit 2066 includes a determination unit 2661, a control unit 2662, a learning unit 2663, and an operation information generation unit 2664. Each of these units is a functional module realized by a program executed by a processor included in the processing unit 2066. Alternatively, each of these units may be implemented in the processing unit 2066 as firmware.
- the determination unit 2661 determines, from the first information and the second information, whether or not a specific part that moves when the worker 400 performs a specific operation is sensed. The determination unit 2661 also determines whether or not the specific part sensed by the first sensor 2023a (first imaging device 2024a) is the same as the specific part sensed by the second sensor 2023b (second imaging device 2024b).
- the control unit 2662 operates the first moving mechanism 2022a of the humanoid robot 2020a and/or the second moving mechanism 2022b of the humanoid robot 2020b so that the predetermined part is sensed.
- when it is determined that the predetermined part sensed by the first sensor 2023a (first imaging device 2024a) and the predetermined part sensed by the second sensor 2023b (second imaging device 2024b) are the same, the control unit 2662 operates the first moving mechanism 2022a of the humanoid robot 2020a and/or the second moving mechanism 2022b of the humanoid robot 2020b so that the predetermined part sensed by the first sensor 2023a (first imaging device 2024a) and the predetermined part sensed by the second sensor 2023b (second imaging device 2024b) become different.
- the learning unit 2663 learns the specified actions of the worker 400 by referring to the first information and the second information stored in the storage medium 2062 and/or the storage device 1224.
- the motion information generating unit 2664 refers to the learning results by the learning unit 2663 and generates motion information that gives motion instructions to the humanoid robot 2020 that functions as a work robot. Note that the motion information generating unit 2664 may refer to work manual information and/or schedule information when generating the motion information.
- FIG. 10 is an example of a flowchart showing the processing of the sensing system of the present embodiment.
- the information processing device 2025 instructs the multiple humanoid robots 2020 (two in this embodiment) functioning as mobile robots to move to the workplace 200 (step S2101).
- the movement is caused by the operation of the robot movement mechanism 2022 of each humanoid robot 2020.
- each sensing area 2230 (imaging area 2240)
- robot sensors 2023 (robot imaging devices 2024)
- the placement of such multiple humanoid robots 2020 is performed, for example, by storing a floor plan of the workplace 200 in advance in the storage device 1224 and/or storage medium 2062, and associating the position of each humanoid robot 2020 with the stored floor plan.
- the placement of the humanoid robot 2020 may be based on a position optimized through machine learning.
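Associating each robot's position with a stored floor plan, as described above, might be sketched as follows. The grid map and the nearest-free-cell rule are illustrative assumptions standing in for the stored floor plan and any machine-learned placement optimization.

```python
# Sketch of placing mobile robots using a stored floor plan (grid map).
# The grid representation and the distance-based scoring are assumptions.

def place_robots(floor_plan, worker_pos, n_robots):
    """Pick the n_robots free cells closest to the worker, so that each
    robot can sense the worker from a different position."""
    free = [(r, c) for r, row in enumerate(floor_plan)
            for c, cell in enumerate(row) if cell == 0]   # 0 = free cell
    # Sort by squared distance to the worker; stable sort keeps scan order.
    free.sort(key=lambda p: (p[0] - worker_pos[0]) ** 2
                            + (p[1] - worker_pos[1]) ** 2)
    return free[:n_robots]

plan = [[0, 0, 1],
        [0, 1, 0],
        [0, 0, 0]]            # 1 = obstacle (work line, shelf, ...)
positions = place_robots(plan, worker_pos=(1, 1), n_robots=2)
```

A learned placement model could replace the distance heuristic while keeping the same floor-plan association.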
- the plurality of sensors 2023a, 2023b sense a predetermined movement of the worker 400 on the work line 201 (step S2102).
- the control unit 2662 operates the movement mechanism 2022 and drive mechanism of each humanoid robot so that the sensing area 2230a (imaging area 2240a) of the first sensor 2023a (first imaging device 2024a) senses the left arm of the worker 400 and the sensing area 2230b (imaging area 2240b) of the second sensor 2023b (second imaging device 2024b) senses the right arm of the worker 400.
- the first information acquired by the first sensor 2023a (first imaging device 2024a) and the second information acquired by the second sensor 2023b (second imaging device 2024b) are stored in the storage medium 2062 via the storage device 1224 and/or the communication unit 2064.
- the storage device 1224 and the storage medium 2062 function as a storage unit.
- in step S2103, it is determined whether or not a specific part of the worker 400 is being sensed. If the determination unit 2661 determines that the specific part is not being sensed, the control unit 2662 operates the first moving mechanism 2022a and/or the second moving mechanism 2022b so that the specific part is sensed.
- the management control device 2060 learns the specified motion by referring to the first information and second information accumulated, in other words stored, in the storage device 1224 and/or the storage medium 2062, and generates motion information that gives motion instructions to the humanoid robot 2020 functioning as a work robot by referring to the learning results (step S2104).
- FIG. 11 is an example of a flowchart showing more detailed processing of the specific part sensing determination processing shown in step S2103 of FIG. 10.
- the determination unit 2661 determines from the first information and the second information whether or not a predetermined part that moves when the worker 400 performs a predetermined motion is sensed (steps S2202, S2203).
- the first sensor 2023a (first imaging device 2024a)
- the second sensor 2023b (second imaging device 2024b)
- the determination unit 2661 refers to the first information and the second information to determine whether or not the first sensor 2023a (first imaging device 2024a) is sensing the left arm as the predetermined part.
- the control unit 2662 operates the first moving mechanism 2022a and/or the second moving mechanism 2022b so that the predetermined part is sensed (step S2206).
- the control unit 2662 operates the first moving mechanism 2022a and moves the first humanoid robot 2020a so that the first sensor 2023a (first imaging device 2024a) can sense the left arm, which is a part of the predetermined part of the worker 400.
- the control unit 2662 also operates the second moving mechanism 2022b and moves the second humanoid robot 2020b so that the second sensor 2023b (second imaging device 2024b) can sense the right arm, which is the other part of the predetermined part of the worker 400.
- each sensor senses a different part of the specific area, so the data (information) necessary for learning the work of the worker 400 can be acquired efficiently.
- the determination unit 2661 determines whether the predetermined part sensed by the first sensor 2023a (first imaging device 2024a) and the predetermined part sensed by the second sensor 2023b (second imaging device 2024b) are the same (steps S2204, S2205). As an example, when sensing starts, the first sensor 2023a (first imaging device 2024a) senses the left arm of the worker 400, and the second sensor 2023b (second imaging device 2024b) senses the right arm of the worker 400, respectively, but a situation may occur in which the same predetermined part (for example, the back) is sensed due to a predetermined movement of the worker 400. Therefore, the determination unit 2661 determines whether the predetermined parts sensed by each sensor are the same.
- control unit 2662 operates the first moving mechanism 2022a and/or the second moving mechanism 2022b so that the specific area sensed by the first sensor 2023a (first imaging device 2024a) is different from the specific area sensed by the second sensor 2023b (second imaging device 2024b) (S2206).
- the control unit 2662 operates the first moving mechanism 2022a and the second moving mechanism 2022b so that the first sensor 2023a (first imaging device 2024a) senses the left arm of the worker, and the second sensor 2023b (second imaging device 2024b) senses the right arm of the worker.
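The determination logic of FIG. 11 (check that each predetermined part is sensed, then check that the two sensors are not sensing the same part, and reposition a robot otherwise) can be sketched as below. The sensor identifiers, part names, and the `move` callback are hypothetical placeholders for the determination unit 2661 and control unit 2662.

```python
# Sketch of the determination logic of steps S2202-S2206 (FIG. 11).
# Sensor readings are reduced to the name of the body part currently seen;
# "first"/"second" and the move() callback are illustrative assumptions.

def check_and_move(sensed, move):
    """sensed maps sensor id -> currently sensed part name (None if lost).
    Calls move(sensor_id) when a robot must be repositioned; returns the
    list of sensors that were moved."""
    moved = []
    for sensor_id in ("first", "second"):
        if sensed.get(sensor_id) is None:        # S2202/S2203: part not sensed
            move(sensor_id)
            moved.append(sensor_id)
    # S2204/S2205: are both sensors seeing the same part (e.g. the back)?
    if sensed.get("first") is not None and sensed.get("first") == sensed.get("second"):
        move("second")                           # S2206: reposition one robot
        moved.append("second")
    return moved
```

For instance, if both sensors end up seeing the worker's back, only the second robot is repositioned so the two sensing areas become different again.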
- FIG. 12 is an example of a flowchart showing more detailed processing of the motion information generation processing shown in step S2104 of FIG. 10.
- the first information and the second information are stored in the storage device 1224 and/or the storage medium 2062 (step S2301), and the learning unit 2663 refers to the first information and the second information stored in the storage device 1224 and/or the storage medium 2062 to learn the movements of the worker 400 (step S2302).
- in this learning, motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movements of the worker 400 in the workplace 200, cornering, speed, etc. are analyzed, and optimal movements of the humanoid robot 2020, which can also function as a work robot, are learned by automatic learning. This makes it possible to analyze a given movement of the worker 400 from multiple perspectives at once, reducing the time and cost required for analyzing and programming the movements of the worker 400.
- the motion information generating unit 2664 refers to the learning results by the learning unit 2663 (step S2303) and generates motion information that gives motion instructions to the humanoid robot 2020 functioning as a work robot (step S2304).
- the motion information generating unit 2664 may refer to work manual information and/or schedule information for the sensing target (S2303).
- the humanoid robot 2020 functioning as a work robot is able to perform the work (predetermined motion) of the worker 400.
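The flow of steps S2301-S2304 above — reading the stored first and second information, learning the worker's movements, and generating motion information — can be sketched roughly as follows. The learner is a hypothetical callable and the output format is an assumption; the patent does not specify a data format.

```python
def generate_motion_information(first_info, second_info, learn):
    """Sketch of steps S2301-S2304: merge the stored sensor
    information from both sensors, learn the worker's motions,
    and emit ordered motion instructions for the work robot."""
    observations = first_info + second_info      # S2301: stored info
    motions = learn(observations)                # S2302: learning
    return [{"step": i, "motion": m}             # S2303/S2304: motion info
            for i, m in enumerate(motions, start=1)]
```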
- in the sensing system 2100 of this embodiment, when a specific part of the worker 400 to be sensed that moves during a specific movement is not sensed, the first moving mechanism 2022a operates so that the specific part is sensed. This makes it possible to prevent a situation in which the specific part is not sensed due to a specific movement of the worker 400, and enables sufficient sensing to be performed.
- the humanoid robot 2020, which is a mobile robot, is equipped with a sensor 2023 (imaging device 2024), and the humanoid robot 2020 moves to an appropriate position according to the sensing results, not just according to pre-stored programs and machine learning results. Therefore, compared to the case where fixed sensors are arranged in the workplace 200, a sufficient sensing environment can be created without causing situations such as rearranging sensors or increasing the number of sensors.
- each of the multiple (two in this embodiment) mobile robots is provided with a movement mechanism, and when a specific part of the worker 400 to be sensed that moves during a specific movement is not sensed, each movement mechanism operates to sense the specific part, making it possible to sense the specific movement of the worker 400 from multiple different angles.
- the movement mechanism is operated so that one of the multiple (two in this embodiment) sensors 2023 (imaging devices 2024) senses a part of a specific part of the worker 400, and the movement mechanism is operated so that the other sensor (imaging device) senses another part of the specific part, so that the specific movement of the worker 400 can be sensed from multiple different angles and multiple specific parts.
- the sensor information (in this embodiment, the first information and the second information) acquired by each sensor is stored, the predetermined motion of the worker 400 is learned by referring to the stored sensor information, and the learning results are referred to in order to generate motion information that gives operational instructions to the humanoid robot that functions as a work robot, so that motion information that sufficiently reflects the sensing results is generated.
- in the sensing system 2100 of this embodiment, the work manual information and/or the schedule information are referenced before the motion information is generated.
- the worker 400 does not always perform the motions faithful to the work, and in some cases may perform unnecessary motions or omit necessary motions. Therefore, by referencing the work manual information and the schedule information, it is possible to prevent unnecessary or inappropriate specified motions by the worker 400 from being reflected in the motion information.
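One way the manual/schedule check described above could work is a simple filter that discards observed motions not found in the work manual (and, if a schedule is given, motions outside the scheduled items). The function and field names are hypothetical, not the patent's method.

```python
def filter_by_manual(observed, manual, schedule=None):
    """Drop observed worker motions that do not appear in the work
    manual, and optionally restrict them further to scheduled items,
    so that unnecessary or inappropriate motions are not reflected
    in the generated motion information."""
    allowed = set(manual)
    if schedule is not None:
        allowed &= {item["name"] for item in schedule}
    return [m for m in observed if m in allowed]
```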
- FIGS. 13A and 13B are diagrams illustrating an example of a sensing system according to a first modified example of the present embodiment.
- FIG. 13A is a diagram showing an example of a system configuration in a sensing system according to Variation 1 of Embodiment 2 of the present disclosure.
- This sensing system is characterized in that a humanoid robot 2020c that functions as a mobile robot is provided with a torso sensor 2023d (torso image capture device 2024d) that corresponds to the second sensor.
- the management control device 2060 is not necessarily required, and the sensing system can be configured by the humanoid robot 2020c alone.
- FIG. 13B is a diagram showing an example of the mobile robot shown in FIG. 13A.
- the humanoid robot 2020c functioning as a mobile robot includes a robot main body 2021c, a robot movement mechanism 2022c, a head sensor 2023c, a head imaging device 2024c included in the head sensor 2023c, a body sensor 2023d, a body imaging device 2024d included in the body sensor 2023d, an information processing device 2025c, and a robot arm.
- the robot main body 2021c includes a robot torso 2211 and a robot head 2212.
- the robot torso 2211 and the robot head 2212 form a first drive mechanism 2021c (see FIG. 14), and are capable of changing the sensing area 2230c (imaging area 2240c) of the head sensor 2023c (head imaging device 2024c) and the sensing area 2230d (imaging area 2240d) of the torso sensor 2023d (torso imaging device 2024d).
- the robot movement mechanism 2022c functions as a first movement mechanism.
- Head sensor 2023c (head imaging device 2024c) functions as a first sensor, and torso sensor 2023d (torso imaging device 2024d) functions as a second sensor. Head sensor 2023c (head imaging device 2024c) and torso sensor 2023d (torso imaging device 2024d) are disposed at different height positions, so torso sensor 2023d (torso imaging device 2024d) functioning as the second sensor senses the predetermined movement of the sensing target from a different position than head sensor 2023c (head imaging device 2024c).
- the configuration of the information processing device 2025c is the same as the first information processing device 2025a of the first humanoid robot 2200a.
- the robot arm is also the same as the first humanoid robot 2020a.
- FIG. 14 is a block diagram showing an example of the functions of a mobile robot in this sensing system.
- the information processing device 2025c includes an information processing unit 2066c, a communication interface 1222c, and a storage device 1224c, and the information processing unit 2066c includes a determination unit 2661c, a control unit 2662c, a learning unit 2663c, and an operation information generating unit 2664c. That is, in the sensing system 2100', the information processing unit 2066c performs the same processing as the processing unit 2066 of the management control device 2060.
- the information processing device 2025c is configured to be able to communicate with the head sensor 2023c (head imaging device 2024c), the body sensor 2023d (torso imaging device 2024d), the first moving mechanism 2022c, and the first driving mechanism 2021c.
- the humanoid robot 2020c of the sensing system 2100' is equipped with an information processing unit 2066c in an information processing device 2025c, and therefore the humanoid robot 2020c alone constitutes a sensing system.
- the control unit 2662c of the humanoid robot 2020c instructs the head sensor 2023c (head imaging device 2024c) functioning as a first sensor to sense the left arm of the worker 400, and the torso sensor 2023d (torso imaging device 2024d) functioning as a second sensor to sense the right arm of the worker 400. Then, the determination unit 2661c determines whether or not a specific part that moves when the worker 400 performs a specific operation is sensed from the sensor information (first information and second information) acquired by each sensor, and if it is determined that the specific part is not sensed, the control unit 2662c operates the first movement mechanism 2022c and/or the first drive mechanism 2021c so that the specific part is sensed.
- the determination unit 2661c also determines whether the specific part sensed by the head sensor 2023c (head imaging device 2024c) is the same as the specific part sensed by the torso sensor 2023d (torso imaging device 2024d), and when it is determined that the specific part sensed by each sensor (imaging device) is the same, the control unit 2662c operates the first moving mechanism 2022c and/or the first driving mechanism 2021c so that the specific part sensed by each sensor (imaging device) is different.
- the humanoid robot 2020c can constitute a sensing system by itself, making it possible to perform sufficient sensing even in places where communication with the management control device 2060 is not possible, for example.
- the humanoid robot 2020c is equipped with multiple (two in this modified example) sensors (imaging devices), making it possible to perform sufficient sensing even in a narrow space for sensing the worker 400, for example.
- the humanoid robot functioning as the mobile robot does not necessarily have to be one, but may be multiple.
- the number of sensors increases in proportion to the number of humanoid robots, making it possible to obtain a large amount of sensor information at one time.
- Modification 2 of the second embodiment: FIGS. 15A and 15B are diagrams illustrating an example of a sensing system according to Modification 2 of this embodiment.
- FIG. 15A is a diagram showing an example of a system configuration of a sensing system according to Variation 2 of Embodiment 2.
- This sensing system is characterized in that it includes a sensor mounting member 2030 in addition to a humanoid robot 2020 that functions as a mobile robot.
- FIG. 15B is a diagram showing an example of the sensor mounting member shown in FIG. 15A.
- the sensor mounting member 2030 comprises a mounting member main body 2031, a mounting member moving mechanism 2032, a mounting member sensor 2033, and a mounting member imaging device 2034.
- the sensor mounting member 2030 can be moved by the mounting member moving mechanism 2032 provided below the mounting member main body 2031.
- the mounting member moving mechanism 2032 does not necessarily have to be provided.
- the mounting member body 2031 is, for example, a rod- or cane-shaped member, and its material is not particularly limited.
- the length of the mounting member body 2031 is longer than the height (back height) of the humanoid robot 2020, for example, 2.1 meters.
- the mounting member body 2031 is provided with a mounting member movement mechanism 2032 below, preferably at the lower end, and a mounting member sensor 2033 above, preferably at the upper end, of the mounting member body 2031.
- the mounting member moving mechanism 2032 is configured with a rotating body such as a caster, and assists the sensor mounting member 2030 in moving in accordance with the movement of the humanoid robot 2020. Note that in this embodiment, it is not assumed that the sensor mounting member 2030 moves autonomously, but a mounting member control unit (not shown) that issues instructions to the mounting member moving mechanism 2032 may be provided, and the mounting member moving mechanism 2032 may be moved based on a signal from the mounting member control unit.
- the mounting member sensor 2033 (mounting member imaging device 2034) functioning as a second sensor is provided above the mounting member main body 2031 and senses the worker 400.
- An example of the mounting member sensor 2033 is similar to the robot sensor 2023, and an example of the mounting member imaging device 2034 is similar to the robot imaging device 2024.
- an example of acquired sensor information is also similar to the robot sensor 2023, and an example of the sensing timing of the sensor information is also similar to the robot sensor 2023.
- the mounting member imaging device 2034 is included in the mounting member sensor 2033.
- the mounting member sensor 2033 which includes the mounting member imaging device 2034, is disposed at a position higher than the height (back height) of the humanoid robot 2020. This allows the mounting member sensor 2033 to sense the movements of the worker 400 from a position higher than the robot sensor 2023.
- FIG. 16 is a diagram showing an example of the configuration and functions of a sensing system according to Variation 2 of Embodiment 2.
- the sensor mounting member 2030 is configured to be able to communicate with the information processing device 2025 of the humanoid robot 2020 via wireless or wired communication.
- the sensor mounting member 2030 may be configured to be able to communicate with the communication unit 2064 of the management control device 2060 instead of or in addition to the information processing device 2025.
- the configurations of the humanoid robot 2020 and the management control device 2060 of the sensing system 2100'' are the same as those of the sensing system 2100.
- the humanoid robot 2020 grasps the sensor mounting member 2030 with the right gripping portion 2265 (or the left gripping portion 2266), which is part of the robot arm 2026 constituting the second driving mechanism.
- the mounting member sensor 2033 (mounting member imaging device 2034) of the sensor mounting member 2030 can change its sensing area 2330 (imaging area 2340) by the second driving mechanism.
- the control unit 2662 instructs the robot sensor 2023 (robot imaging device 2024) functioning as the first sensor to sense the left arm of the worker 400, and the mounting member sensor 2033 (mounting member imaging device 2034) functioning as the second sensor to sense the right arm of the worker 400. Then, from the sensor information (first information and second information) acquired by each sensor, the determination unit 2661 determines whether or not a specific part that moves when the worker 400 performs a specific operation is sensed, and if it is determined that the specific part is not sensed, the control unit 2662 operates the first moving mechanism 2022 and/or the second driving mechanism 2026 so that the specific part is sensed.
- when the determination unit 2661 determines that the specific part sensed by the robot sensor 2023 (robot imaging device 2024) and the specific part sensed by the mounting member sensor 2033 (mounting member imaging device 2034) are the same, the control unit 2662 operates the first moving mechanism 2022 and/or the second driving mechanism 2026 so that the specific parts sensed by each sensor (imaging device) are different.
- the mounting member sensor 2033 (mounting member imaging device 2034) is disposed at a position higher than the height (back height) of the humanoid robot 2020. This allows the movement of the worker 400 to be sensed from a more bird's-eye position, making it easier to avoid situations where sensing is obstructed by, for example, the back of the worker 400, and allows the data necessary for learning the work of the worker 400 to be acquired efficiently.
- the humanoid robot that functions as a mobile robot does not have to be one, and the sensor mounting member does not have to be one.
- for example, a plurality of humanoid robots 2020 may each hold two sensor mounting members with both gripping parts 2265, 2266. In this case, too, the number of sensors can be increased, making it possible to obtain a large amount of sensor information at one time.
- two mobile robots (humanoid robots) equipped with sensors and a moving mechanism are used.
- the number of mobile robots may be more than this.
- the first sensor (first imaging device) is described as sensing the left arm of the worker, and the second sensor (second imaging device) is described as sensing the right arm of the worker.
- the specific part to be sensed is not limited to this, and the specific part sensed by each sensor is not limited to this.
- the first sensor may sense the fingertips of the worker's right hand
- the second sensor may sense the movement of the worker's neck.
- the learning of the specified actions of the worker has been described as being performed by automatic learning.
- the learning does not necessarily have to be automatic learning, and may be other known machine learning methods, such as deep learning, unsupervised/supervised learning, reinforcement learning, etc.
- the mobile robot and the work robot are described as being the same humanoid robot. In this case, it is possible to use the mobile robot in combination with the work robot, which can reduce the expenses and costs associated with robot production. However, the mobile robot and the work robot may be different robots.
- a worker (person) is used as the sensing target.
- a robot capable of imitating the predetermined movements of a worker may be used as the sensing target.
- FIG. 17A is a diagram showing an example of a system configuration of a sensing system according to embodiment 3 of the present disclosure.
- the sensing system includes a first humanoid robot 3020a and a second humanoid robot 3020b that function as mobile robots, and a third humanoid robot 3020c that functions as a work robot. Note that the number of humanoid robots that function as mobile robots and work robots is not limited to this.
- the first humanoid robot 3020a moves to the vicinity of the worker 400 working on the work line 201 of the work area 200 upon receiving instructions from the management control device 3060 (see FIG. 18) described later, or by instructions from the first information processing device 3025a (see FIG. 18) provided in the first humanoid robot 3020a.
- the sensing system senses the predetermined motion of the worker 400 using the first robot sensor 3023a (first robot imaging device 3024a) provided in the first humanoid robot 3020a.
- the predetermined motion is wide-ranging, and includes, for example, the assembly of parts, the movement of parts, the painting of products, the movement of the worker himself, etc.
- a known image recognition technology may be used, or the worker 400 and his predetermined motion may be recognized by learning by the learning unit 2663 (see FIG. 19). The same applies to the sensing of the work robot described later.
- the sensing system learns the predetermined motion of the worker 400 by referring to the first information acquired by the first robot sensor 3023a (first robot imaging device 3024a) functioning as the first sensor.
- the sensing system also generates motion control information that gives motion instructions to the third humanoid robot 3020c by referring to the learning result of the predetermined motion.
- the second humanoid robot 3020b moves to the vicinity of the third humanoid robot 3020c working on the work line 201 in the workplace 200, in response to instructions from the management control device 3060 or from instructions from a second information processing device provided in the second humanoid robot 3020b.
- the third humanoid robot 3020c moves to the vicinity of the worker 400 in the workplace 200, in response to instructions from the management control device 3060 or from instructions from a third information processing device provided in the third humanoid robot 3020c.
- the sensing system operates the third humanoid robot 3020c by referring to the motion control information.
- the sensing system also senses the robot motion of the third humanoid robot 3020c using the second robot sensor 3023b (second robot imaging device 3024b) provided on the second humanoid robot 3020b. This allows the sensing system to confirm the robot motion of the third humanoid robot 3020c, which functions as a work robot.
- the sensing system compares the first information with the second information acquired by the second robot sensor 3023b (second robot imaging device 3024b) functioning as a second sensor, and adjusts the motion control information so that the robot motion of the third humanoid robot 3020c approximates a predetermined motion. This makes it possible to adjust the robot motion of the third humanoid robot 3020c to an appropriate motion.
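The adjustment described above — comparing the first information (the worker's sensed motion) with the second information (the work robot's sensed motion) and bringing the robot closer to the worker — might be sketched as a simple proportional correction. The trajectory representation, the control values, and the gain are all assumptions for illustration; the patent does not prescribe a particular adjustment law.

```python
def adjust_motion_control(worker_traj, robot_traj, control, gain=0.5):
    """Nudge each motion-control value toward reducing the deviation
    between the worker's sensed trajectory (first information) and
    the robot's sensed trajectory (second information); a
    proportional-correction sketch, not the patent's actual method."""
    adjusted = []
    for w, r, c in zip(worker_traj, robot_traj, control):
        error = w - r            # deviation from the worker's motion
        adjusted.append(c + gain * error)
    return adjusted
```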
- FIG. 17B is a diagram showing an example of the humanoid robot 3020 shown in FIG. 17A.
- the humanoid robot 3020 which functions as a mobile robot and a working robot, comprises a robot body 3021, a robot movement mechanism 3022, a robot sensor 3023, a robot imaging device 3024 included in the robot sensor 3023, an information processing device 3025, and a robot arm 3026.
- the humanoid robot 3020 can move using a robot movement mechanism 3022 provided below the robot body 3021, and moves to the vicinity of the work line 201 in the workplace 200 upon receiving instructions from outside the humanoid robot 3020, such as a management control device 3060, or by referring to a program stored in an information processing device 3025.
- the robot main body 3021 comprises a robot torso 3211 and a robot head 3212.
- the robot torso 3211 and the robot head 3212 constitute a torso/head drive mechanism, and are capable of changing the sensing area 3230 (imaging area 3240) of the robot sensor 3023 (robot imaging device 3024).
- the configuration of the drive mechanism is not particularly limited, and may be configured, for example, such that the robot head 3212 rotates a predetermined angle relative to the robot torso 3211, or the robot torso 3211 rotates a predetermined angle relative to the robot movement mechanism 3022, by a servo motor (not shown).
- a robot movement mechanism 3022 is provided below the robot torso 3211, a robot arm 3026 is provided on each side of the robot torso 3211, and a robot sensor 3023 is provided in the robot head 3212.
- An information processing device 3025 is also provided inside the robot main body 3021.
- the robot movement mechanism 3022 may be of any configuration, for example it may be provided with a rotating body driven by a motor, or may have legs that resemble the shape of a human leg. As an example, if the robot movement mechanism 3022 is configured to resemble the shape of a human leg, a servo motor is provided at the location that corresponds to a human joint, and the movement mechanism is formed by rotating it by a predetermined angle.
- the robot sensor 3023 which functions as the first sensor and the second sensor, is preferably provided in the robot head 3212 and senses the worker 400 and the working robot.
- the robot sensor 3023 also sequentially acquires information representing at least the distance and angle between an object around the humanoid robot 3020 on which the humanoid robot 3020 is working and the robot arm 3026.
- Examples of the robot sensor 3023 include a high-performance camera, a thermal camera, a high-pixel/telephoto/ultra-wide-angle/360-degree camera, radar, solid-state LiDAR, LiDAR, a multi-color laser coaxial displacement meter, vision recognition, and various other sensor groups. These are also examples of the robot imaging device 3024.
- robot sensor 3023 examples include a vibration meter, hardness meter, micro vibration meter, ultrasonic measuring instrument, vibration measuring instrument, infrared measuring instrument, ultraviolet measuring instrument, electromagnetic wave measuring instrument, thermometer, hygrometer, spot AI weather forecast, high-precision multi-channel GPS, low altitude satellite information, long-tail incident AI data, etc.
- Examples of sensor information acquired from the robot sensor 3023 include images, distance, vibration, heat, smell, color, sound, ultrasound, radio waves, ultraviolet light, infrared light, humidity, etc., and preferably the image and distance information is acquired by the robot imaging device 3024.
- the robot sensor 3023 (robot imaging device 3024) performs this sensing every nanosecond, for example.
- the sensor information is used, for example, for motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movements of the worker 400 in the workplace 200, and analysis of cornering, speed, etc.
- the robot arm 3026 comprises a right arm 3261 and a left arm 3262.
- the right arm 3261 comprises a right gripping support part 3263 and a right gripping part 3265
- the left arm 3262 comprises a left gripping support part 3264 and a left gripping part 3266.
- the right gripping support part 3263 is a mechanism for supporting the right gripping part 3265
- the left gripping support part 3264 is a mechanism for supporting the left gripping part 3266, and may be shaped like a human arm, for example.
- the gripping parts 3265 and 3266 are mechanisms for gripping parts for work, for example, and may be shaped like a human hand, for example.
- the robot arm 3026 constitutes an arm drive mechanism.
- the configuration of the drive mechanism is not particularly limited, and for example, if the robot arm 3026 is to resemble a human shape, a configuration may be adopted in which servo motors are provided at each joint location, such as a location corresponding to a human shoulder, a location corresponding to an elbow, a location corresponding to a wrist, a location corresponding to a finger joint, etc., and rotated by a predetermined angle.
- the humanoid robot 3020 may further be provided with a sensor, for example, on the robot torso 3211 (see FIG. 24B).
- the sensor is located at a different height than the robot sensor 3023 located on the robot head 3212. The different height allows the sensor to sense the movements of the worker 400 from different angles.
- FIG. 18 is a block diagram showing an example of the configuration and functions of the sensing system 3100 of this embodiment.
- the sensing system 3100 includes a first humanoid robot 3020a, a second humanoid robot 3020b, a third humanoid robot 3020c, and a management control device 3060.
- the first humanoid robot 3020a, the second humanoid robot 3020b, and the third humanoid robot 3020c are each connected to the communication unit 3064 of the management control device 3060 via wireless or wired communication, and receive instructions from the management control device 3060 and transmit information acquired by each sensor.
- the humanoid robots 3020a-c may also be connected to each other via wireless or wired communication, and send and receive information acquired by each sensor and instructions.
- the first humanoid robot 3020a which functions as a mobile robot, includes a first moving mechanism 3022a, a first robot sensor 3023a which functions as a first sensor, a first robot imaging device 3024a included in the first robot sensor 3023a, a first information processing device 3025a, a first body/head driving mechanism 3021a, and a first arm driving mechanism 3026a.
- the second humanoid robot 3020b which functions as a mobile robot and the third humanoid robot 3020c which functions as a work robot also have the same configuration as the first humanoid robot 3020a.
- the first information processing device 3025a includes a CPU (Central Processing Unit) 1212, a RAM (Random Access Memory) 1214, and a graphics controller 1216, which are interconnected by a host controller 1210.
- the first information processing device 3025a also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
- the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
- the storage device 1224 may be a hard disk drive, a solid state drive, etc.
- the first information processing device 3025a also includes input/output units such as a ROM (Read Only Memory) 1230 and a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
- the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
- the graphics controller 1216 acquires image data generated by the CPU 1212 into a frame buffer or the like provided in the RAM 1214 or into itself, and causes the image data to be displayed on the display device 1218.
- the communication interface 1222 communicates with other electronic devices via a network.
- the storage device 1224 stores programs and data used by the CPU 1212 in the first information processing device 3025a.
- the storage device 1224 may also store first information and second information.
- the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
- the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
- the ROM 1230 stores therein a boot program, etc., which is executed by the first information processing device 3025a upon activation, and/or a program that depends on the hardware of the first information processing device 3025a.
- the input/output chip 1240 may also connect various input/output units to the input/output controller 1220 via a USB port, a parallel port, a serial port, a keyboard port, a mouse port, etc.
- the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
- the programs are read from the computer-readable storage medium, installed in the storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by the CPU 1212.
- the information processing described in these programs is read by the first information processing device 3025a, and brings about cooperation between the programs and the various types of hardware resources described above.
- An apparatus or method may be configured by realizing the operation or processing of information in accordance with the use of the first information processing device 3025a.
- the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
- the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214, the storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
- the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
- an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc.
- CPU 1212 may perform various types of processing on data read from RAM 1214, including various types of operations, information processing, conditional decisions, conditional branches, unconditional branches, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequences of the programs, and writes back the results to RAM 1214.
- CPU 1212 may also search for information in files, databases, etc. in the recording medium.
- the above-described program or software module may be stored on a computer-readable storage medium on the first information processing device 3025a or in the vicinity of the first information processing device 3025a.
- a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the first information processing device 3025a via the network.
- the management control device 3060 is a control device that issues instructions to the humanoid robots 3020a-c in order to realize the sensing system 3100.
- the management control device 3060 also acquires the sensor information (first information and second information) stored in the storage device 1224.
- the management control device 3060 is composed of a CPU 3060A, a RAM 3060B, a ROM 3060C, an input/output unit (I/O) 3060D, a bus 3060E such as a data bus or a control bus that connects these, and a communication unit 3068.
- a storage medium 3062 is connected to the I/O 3060D.
- a communication unit 3064 is connected to the I/O 3060D; it transmits and receives sensor information, work manual information related to the predetermined actions of the worker 400, schedule information, etc., to and from the control system of the humanoid robot 3020.
- the work manual information includes, for example, the name and content of each work item, the order of the work items, and information on the standard work time required for each work item.
- the schedule information includes, for example, information indicating the work time and start/end times for the entire work, information indicating the work time and start/end times for each work item, and information indicating the worker for each work item.
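The work manual and schedule records described above can be pictured as simple data structures. A minimal sketch follows; the field names and types are illustrative assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str               # name of the work item
    content: str            # content (description) of the work item
    order: int              # position in the order of work items
    standard_time_s: float  # standard work time required for the item

@dataclass
class ScheduleEntry:
    work_item: str   # which work item this entry schedules
    worker: str      # worker assigned to the work item
    start_s: float   # start time, in seconds from the start of the whole work
    end_s: float     # end time

def total_standard_time(items):
    """Total standard work time over all work items."""
    return sum(i.standard_time_s for i in items)

items = [
    WorkItem("attach part A", "fasten part A with two screws", 1, 30.0),
    WorkItem("inspect", "visually inspect the assembly", 2, 15.0),
]
schedule = [ScheduleEntry("attach part A", "worker 400", 0.0, 30.0)]
print(total_standard_time(items))  # 45.0
```

Keeping the per-item standard times separate from the schedule entries lets the system compare a learned motion's duration against the manual's standard time for the same item.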
- FIG. 19 is a block diagram showing an example of the functions of the management control device 3060 in the sensing system of this embodiment.
- the management control device 3060 includes a storage medium 3062, a communication unit 3064, and a processing unit 3066.
- the storage medium 3062 includes, for example, at least one of a semiconductor storage device, a magnetic tape device, a magnetic disk device, or an optical disk device.
- the storage medium 3062 stores a driver program, an operating system program, an application program, data, etc., used for processing in the processing unit 3066.
- the storage medium 3062 stores first information and second information.
- the storage medium 3062 also stores work manual information for the worker 400.
- the storage medium 3062 may also store schedule information.
- the communication unit 3064 has a wireless communication interface circuit such as Wi-Fi (registered trademark) and/or a wired communication interface circuit such as Ethernet (registered trademark).
- the communication unit 3064 transmits and receives various information to and from the humanoid robots 3020a-c through the interface circuits.
- the processing unit 3066 has one or more processors and their peripheral circuits.
- the processing unit 3066 centrally controls the overall operation of the sensing system 3100, and is, for example, a CPU.
- the processing unit 3066 executes processing by referencing programs (driver programs, operating system programs, application programs, etc.) stored in the storage medium 3062.
- the processing unit 3066 can also execute multiple programs (application programs, etc.) in parallel.
- the processing unit 3066 includes a determination unit 3661, an adjustment unit 3662, a learning unit 3663, and an operation information generation unit 3664. Each of these units is a functional module realized by a program executed by a processor included in the processing unit 3066. Alternatively, each of these units may be implemented in the processing unit 3066 as firmware.
- the determination unit 3661 determines whether or not there is manual information related to the specified action, and if there is work manual information, determines whether the learning result of the specified action contradicts the work manual information.
- the adjustment unit 3662 compares the first information with the second information and adjusts the motion control information so that the robot motion of the third humanoid robot 3020c approximates the predetermined motion. Preferably, the adjustment unit 3662 adjusts the motion control information so that the robot motion approximates the predetermined motion during the overlap period.
- the overlap period is a period during which the first sensing period and the second sensing period overlap.
- the first sensing period is a period during which the first robot sensor 3023a (first robot imaging device 3024a) functioning as the first sensor senses the worker 400 and acquires the first information.
- the second sensing period is a period during which the second robot sensor 3023b (second robot imaging device 3024b) functioning as the second sensor senses the robot motion of the third humanoid robot 3020c and acquires the second information.
- the overlap period is a period during which the first information and the second information are acquired simultaneously.
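As a concrete illustration, the overlap period is simply the intersection of the two sensing intervals. The interval representation below (start/end timestamps) is an assumption made for the sketch.

```python
def overlap_period(first_start, first_end, second_start, second_end):
    """Intersection of the first and second sensing periods.
    Returns (start, end), or None when the periods do not overlap."""
    start = max(first_start, second_start)
    end = min(first_end, second_end)
    return (start, end) if start < end else None

# First sensor active 0-100 s, second sensor active 40-130 s:
print(overlap_period(0, 100, 40, 130))  # (40, 100)
# No overlap -> the first and second information are never acquired simultaneously:
print(overlap_period(0, 30, 40, 130))   # None
```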
- the learning unit 3663 learns the predetermined motion of the worker 400 by referring to the first information stored in the storage medium 3062 and/or the storage device 1224. This learning is performed by automatic learning, which is, for example, learning that automatically creates a learned model and automatically performs judgment/analysis using the learned model.
- the motion information generating unit 3664 refers to the learning results of the predetermined motion of the worker 400 by the learning unit 3663, and generates motion control information for giving motion instructions to the third humanoid robot 3020c, which functions as a work robot.
- the motion information generating unit 3664 may refer to work manual information when generating the motion control information. This makes it possible to have the third humanoid robot 3020c perform an appropriate motion (work) without reflecting any inappropriate predetermined motion of the worker 400.
- FIG. 20 is a diagram showing an example of each sensing period in the sensing system 3100.
- the first sensor and the second sensor each perform sensing so that an overlap period occurs. Such sensing is instructed by the management control device 3060.
- the robot operation of the third humanoid robot 3020c is started while the predetermined movement of the worker 400 is being sensed (i.e., during the first sensing period), and while the robot movement of the third humanoid robot 3020c is being sensed (i.e., during the second sensing period), the adjustment unit 3662 adjusts the movement control information so that the robot movement approximates the predetermined movement of the worker 400. This makes it possible to adjust the robot movement of the third humanoid robot 3020c, which is a working robot, on the spot.
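One minimal way to picture this on-the-spot adjustment is a proportional correction that pulls each control target toward the worker's observed value. The joint-angle representation and the gain are assumptions for the sketch, not the disclosed method.

```python
def adjust_motion_control(control, first_info, second_info, gain=0.5):
    """One adjustment step: shift each control target by a fraction of the
    discrepancy between the worker's motion (first information) and the
    robot's motion (second information)."""
    return {
        joint: target + gain * (first_info[joint] - second_info[joint])
        for joint, target in control.items()
    }

control = {"shoulder": 0.50, "elbow": 1.00}   # current motion control information
worker  = {"shoulder": 0.80, "elbow": 1.20}   # first information (worker 400)
robot   = {"shoulder": 0.40, "elbow": 1.00}   # second information (work robot)
print(adjust_motion_control(control, worker, robot))
# {'shoulder': 0.7, 'elbow': 1.1}
```

Repeating this step during the overlap period drives the robot motion progressively closer to the worker's predetermined motion.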
- FIG. 21 is an example of a flowchart showing the processing of the sensing system of this embodiment.
- the information processing device of each humanoid robot 3020 instructs the multiple humanoid robots 3020 (three in this embodiment) that function as mobile robots and working robots to move to the workplace 200 (step S3101).
- the movement is achieved by the operation of the robot movement mechanism 3022 of each humanoid robot 3020.
- the sensing area 3230a (imaging area 3240a) of the first robot sensor 3023a (first robot imaging device 3024a) provided on the first humanoid robot 3020a is directed to the worker 400.
- the sensing area 3230b (imaging area 3240b) of the second robot sensor 3023b (second robot imaging device 3024b) provided on the second humanoid robot 3020b is directed to the third humanoid robot 3020c.
- the arrangement of such multiple humanoid robots 3020 is performed, for example, by storing a floor plan of the workplace 200 in the storage device 1224 and/or storage medium 3062 in advance and associating the position of each humanoid robot 3020 with the stored floor plan.
- the arrangement of the humanoid robots 3020 may be based on positions optimized through machine learning.
- the first robot sensor 3023a senses a predetermined movement of the worker 400 on the work line 201 (step S3102).
- the processing unit 3066 issues an instruction to the first sensor 3023a (first imaging device 3024a) so that the sensing area 3230a (imaging area 3240a) targets the predetermined movement of the worker 400, and in response, the first information processing device 3025a operates the first movement mechanism 3022a and the torso/head drive mechanism 3021a of the first humanoid robot 3020a.
- the first information acquired by the first robot sensor 3023a (first robot imaging device 3024a) is stored in the storage medium 3062 via the storage device 1224 and/or the communication unit 3064.
- the storage device 1224 and the storage medium 3062 function as a storage unit.
- the management control device 3060 learns the predetermined motion by referring to the first information stored in the storage unit (in other words, the stored information), and generates motion control information that gives motion instructions to the third humanoid robot 3020c, which functions as a work robot, by referring to the learning results (step S3103).
- S3103 is preferably performed during the first sensing period. This enables the sensing system 3100 to operate the third humanoid robot 3020c from the stage where the worker 400 is performing the specified motion for work.
- the management control device 3060 operates the third humanoid robot 3020c by referring to the operation control information (step S3104).
- the third humanoid robot 3020c operates according to the operation instructions provided by the operation control information.
- the robot movement of the third humanoid robot 3020c is sensed by the second robot sensor 3023b (second robot imaging device 3024b) (step S3105). This makes it possible to confirm the robot movement of the third humanoid robot 3020c.
- the processing unit 3066 instructs the second robot sensor 3023b (second robot imaging device 3024b) to target the sensing area 3230b (imaging area 3240b) with the robot movement of the third humanoid robot 3020c, and in response, the second information processing device of the second humanoid robot 3020b operates the second movement mechanism and second torso/head drive mechanism of the second humanoid robot 3020b.
- the second information acquired by the second sensor 3023b (second imaging device 3024b) is stored in the storage unit.
- the management control device 3060 adjusts the motion control information so that the robot motion approximates the specified motion (step S3106). Preferably, step S3106 is performed during the overlap period. To achieve this, the management control device 3060 simultaneously acquires the first information by the first sensor and the second information by the second sensor. This allows the sensing system 3100 to adjust the robot motion of the third humanoid robot 3020c to approximate the specified motion of the worker 400 from the stage where the worker 400 is performing the specified motion.
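The overall flow of steps S3101 through S3106 can be sketched end to end. The callables below stand in for the robots, sensors, and learning components and are purely hypothetical placeholders.

```python
def run_sensing_flow(robots, sense_worker, learn, generate, run_and_sense, adjust):
    """Hypothetical sketch of the S3101-S3106 processing sequence."""
    for robot in robots:
        robot["at_workplace"] = True            # S3101: move to the workplace
    first_info = sense_worker()                 # S3102: sense the worker's motion
    model = learn(first_info)                   # S3103: learn the motion...
    control = generate(model)                   #        ...and generate control info
    second_info = run_and_sense(control)        # S3104/S3105: operate the work robot
                                                #              and sense its motion
    return adjust(control, first_info, second_info)  # S3106: adjust control info

robots = [{"at_workplace": False} for _ in range(3)]
adjusted = run_sensing_flow(
    robots,
    sense_worker=lambda: [1.0, 2.0, 3.0],
    learn=lambda info: info,
    generate=lambda model: list(model),
    run_and_sense=lambda control: [1.0, 2.0, 2.5],
    adjust=lambda c, f, s: [ci + (fi - si) for ci, fi, si in zip(c, f, s)],
)
print(adjusted)  # [1.0, 2.0, 3.5]
```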
- FIG. 22 is an example of a flowchart showing more detailed processing of the operation control information generation processing shown in step S3103 of FIG. 21.
- the first information is stored in the storage unit (step S3201), and the learning unit 3663 refers to the first information stored in the storage unit to learn the predetermined motion of the worker 400 (step S3202).
- in this learning, motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movement and actions of the worker 400 in the workplace 200, cornering, speed, etc. are analyzed, and the optimal motion of the humanoid robot 3020, which can also function as a work robot, is learned by automatic learning. This makes it possible to analyze the predetermined motion of the worker 400 from multiple perspectives at once, reducing the time and cost required for analyzing and programming the motion of the worker 400.
- the determination unit 3661 determines whether or not there is work manual information related to the predetermined operation (steps S3203, S3204). If there is no work manual information related to the predetermined operation (S3204-NO), the operation information generation unit 3664 refers to the learning result of the predetermined operation by the learning unit 3663 (step S3208) and generates operation control information that gives operation instructions to the third humanoid robot 3020c (step S3209). Then, the processing unit 3066 operates the third humanoid robot 3020c by referring to the operation control information (step S3210). This enables the third humanoid robot 3020c to perform a robot operation corresponding to the work (predetermined operation) of the worker 400.
- the determination unit 3661 also determines whether the learning result of the predetermined operation contradicts the work manual information (steps S3205, S3206). If the determination unit 3661 determines that the learning result of the predetermined operation contradicts the work manual information (S3206-YES), the operation of the worker 400 may have been unsuited to the contents of the work item included in the work manual information. Therefore, if the learning result of the predetermined operation is determined to contradict the work manual information, the operation information generation unit 3664 does not adopt that learning result when generating the operation control information (step S3208).
- the operation information generation unit 3664 refers to the work manual information (step S3207) and generates the operation control information (step S3209). Then, the processing unit 3066 operates the third humanoid robot 3020c by referring to the operation control information (step S3210). This prevents unnecessary or inappropriate predetermined actions by the worker 400 from being reflected in the action control information, making it possible to have the third humanoid robot 3020c perform appropriate actions.
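The decision logic above (steps S3203 through S3207) amounts to a simple fallback rule: use the learned result unless work manual information exists and the learned result contradicts it. A sketch under assumed data shapes, with a hypothetical contradiction predicate:

```python
def select_motion_source(learned, manual, contradicts):
    """Decision logic of the S3203-S3207 branch: prefer the learning result,
    but fall back to the work manual information when they contradict.
    `contradicts` is a hypothetical predicate over (learned, manual)."""
    if manual is None:                  # S3204-NO: no work manual information
        return learned                  # adopt the learning result as-is
    if contradicts(learned, manual):    # S3206-YES: contradiction found
        return manual                   # do not adopt the learning result
    return learned

# Example: the manual caps a movement speed; a faster learned speed contradicts it.
manual = {"max_speed": 1.0}

def speed_contradiction(learned, manual):
    return learned["speed"] > manual["max_speed"]

print(select_motion_source({"speed": 1.5}, manual, speed_contradiction))  # manual wins
print(select_motion_source({"speed": 0.8}, manual, speed_contradiction))  # learned wins
```

This keeps unnecessary or inappropriate worker motions out of the generated control information, as the text describes.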
- FIG. 23 is an example of a flowchart showing more detailed processing of the operation control information adjustment processing shown in step S3106 of FIG. 21.
- the second information is stored in the storage unit (step S3301), and the adjustment unit 3662 compares the first information with the second information (step S3302) and adjusts the motion control information so that the robot motion approximates the predetermined motion (step S3303).
- the robot motion of the third humanoid robot 3020c is adjusted to approximate the predetermined motion of the worker 400.
- first information acquired by a first sensor that senses a predetermined movement of the worker 400 is compared with second information acquired by a second sensor that senses the work robot, and the movement control information is adjusted so that the robot movement of the work robot approximates the predetermined movement. This makes it possible to check the robot movement of the work robot while adjusting it to an appropriate one.
- the worker 400 is first made to perform the predetermined motion a first time, and the sensing system 3100 acquires the first information. Then, while the worker 400 is performing the predetermined motion a second time, the sensing system 3100 generates motion control information and operates the work robot. Because the worker 400 is still performing the predetermined motion during this time, the sensing system 3100 also generates motion control information based on the first information from this second performance, and adjusts the motion control information so that the robot motion of the work robot approximates the predetermined motion of the worker 400. By repeating this, the sensing system 3100 can adjust the robot motion of the work robot so that it approximates the predetermined motion of the worker 400.
- the management control device 3060 simultaneously acquires the first information and the second information, making it possible to provide a system that can adjust the operation of the work robot on the spot while comparing the specific operations of the work robot and the worker 400.
- the operation control information is generated after referring to the work manual information.
- the worker 400 does not always perform actions faithful to the work, and in some cases may perform unnecessary actions or omit necessary actions. Therefore, by referring to the work manual information, appropriate operation information can be reflected in the operation control information, making it possible to adjust the work robot to operate more appropriately.
- the learning results of the learning unit 3663 on the specified actions that contradict the work manual information are not used when generating the action control information. This makes it possible to prevent unnecessary or inappropriate specified actions by the worker 400 from being reflected in the action control information.
- FIGS. 24A and 24B are diagrams illustrating an example of a sensing system according to Modification 1 of this embodiment.
- FIG. 24 is a diagram showing an example of a system configuration of a sensing system according to Variation 1 of the third embodiment of the present disclosure.
- This sensing system is characterized in that, in a humanoid robot 3020' functioning as a working robot, a head sensor 3023' (head imaging device 3024') functions as a first sensor, and a torso sensor 3023'' (torso imaging device 3024'') functioning as a second sensor is provided in the humanoid robot 3020'.
- the management control device 3060 is not necessarily required, and the sensing system can be configured by the humanoid robot 3020' alone.
- FIG. 24B is a diagram showing an example of the work robot shown in FIG. 24A.
- the humanoid robot 3020' functioning as a work robot has the same configuration as the first humanoid robot 3020a, except that it is equipped with a torso sensor 3023'' (torso image capture device 3024'').
- the humanoid robot 3020' is equipped with a robot main body 3021', a robot movement mechanism 3022', a head sensor 3023', a head image capture device 3024' included in the head sensor 3023', a torso sensor 3023'', a torso image capture device 3024'' included in the torso sensor 3023'', an information processing device 3025', and a robot arm 3026'.
- the robot main body 3021' comprises a robot torso 3211' and a robot head 3212'.
- the robot torso 3211' and the robot head 3212' constitute a torso/head drive mechanism 3021' (see FIG. 25), and it is possible to change the sensing area 3230' (imaging area 3240') of the head sensor 3023' (head imaging device 3024') and the sensing area 3230'' (imaging area 3240'') of the torso sensor 3023'' (torso imaging device 3024'').
- the head sensor 3023' functions as a first sensor
- the torso sensor 3023'' functions as a second sensor.
- the torso sensor 3023'' senses, for example, the movement of the robot arm 3026' as the robot movement. Since the head sensor 3023' (head imaging device 3024') and the torso sensor 3023'' (torso imaging device 3024'') are arranged at different height positions, the torso sensor 3023'' (torso imaging device 3024'') functioning as the second sensor senses the predetermined movement of the sensing target from a different position from the head sensor 3023' (head imaging device 3024'). Note that the roles of the head sensor 3023' (head imaging device 3024') and the torso sensor 3023'' (torso imaging device 3024'') may be reversed.
- FIG. 25 is a block diagram showing an example of the functions of a work robot in this sensing system 3100'.
- the information processing device 3025' includes an information processing unit 3066', a communication interface 1222', and a storage device 1224', and the information processing unit 3066' includes a determination unit 3661', an adjustment unit 3662', a learning unit 3663', and a motion information generation unit 3664'. That is, in the sensing system 3100', the information processing unit 3066' performs the same processing as the processing unit 3066 of the management control device 3060.
- the information processing device 3025' is configured to be able to communicate with the head sensor 3023' (head imaging device 3024'), the torso sensor 3023'' (torso imaging device 3024''), the first moving mechanism 3022', the head/torso driving mechanism 3021', and the arm driving mechanism 3026'.
- the humanoid robot 3020' of the sensing system 3100' is equipped with an information processing unit 3066' in an information processing device 3025', so that the humanoid robot 3020' alone constitutes a sensing system.
- the adjustment unit 3662' of the humanoid robot 3020' instructs the head sensor 3023' (head imaging device 3024') functioning as the first sensor to sense the predetermined movement of the worker 400, and the torso sensor 3023'' (torso imaging device 3024'') functioning as the second sensor to sense the arm 3026' of the humanoid robot 3020'. Then, from the sensor information (first information and second information) acquired by each sensor, motion control information is generated through learning by the learning unit 3663', and the adjustment unit 3662' compares the first information with the second information and adjusts the motion control information so that the robot motion of the humanoid robot 3020' approximates the predetermined movement of the worker 400.
- the humanoid robot 3020' can constitute a sensing system by itself, so that even in a place where communication with the management control device 3060 is not possible, it is possible to check the robot operation of the work robot and adjust the robot operation to an appropriate operation.
- the humanoid robot 3020' is equipped with multiple (two in this modified example) sensors (imaging devices), so that, for example, even in a place that is too narrow for separate robots to sense the worker 400, it is possible to check the robot's movements and adjust them appropriately.
- the humanoid robot functioning as the work robot does not necessarily have to be one; there may be multiple. In this case, the more humanoid robots there are, the more of them perform tasks, and many tasks can be processed simultaneously in parallel.
- FIG. 26 is a diagram showing an example of a system configuration of a sensing system according to the second modification of this embodiment.
- This sensing system is characterized in that the first humanoid robot 3020a, which has the same functions as the humanoid robot 3020' described in the first modification, senses the worker 400 and the third humanoid robot 3020c, which functions as a working robot. Specifically, an instruction is given so that the sensing area 3230a1 (imaging area 3240a1) of the head sensor 3023a1 (head imaging device 3024a1) of the first humanoid robot 3020a targets the third humanoid robot 3020c, and the sensing area 3230a2 (imaging area 3240a2) of the torso sensor 3023a2 (torso imaging device 3024a2) of the first humanoid robot 3020a targets the worker 400.
- the management control device 3060 is not necessarily required as long as the first humanoid robot 3020a and the third humanoid robot 3020c are configured to be able to communicate with each other.
- the sensing areas of the head sensor 3023a1 (head imaging device 3024a1) and the torso sensor 3023a2 (torso imaging device 3024a2) may be configured inversely to that described above.
- This sensing system can sense the entire third humanoid robot 3020c, which has the advantage that it is easier to check the robot movements of the third humanoid robot 3020c and control them appropriately compared to Modification 1.
- Modification 1, on the other hand, has the advantage over Modification 2 that it is easier to configure a sensing system when there is no space to place the first humanoid robot 3020a. It also has the advantage that it does not require a communication configuration between the first humanoid robot 3020a and the third humanoid robot 3020c, and that one humanoid robot can be omitted overall.
- FIGS. 27A and 27B are diagrams illustrating an example of a sensing system according to Modification 3 of this embodiment.
- FIG. 27A is a diagram showing an example of a system configuration in a sensing system according to Variation 3 of Embodiment 3.
- This sensing system is characterized in that a first humanoid robot 3020a that functions as a mobile robot grasps a sensor mounting member 3030.
- FIG. 27B is a diagram showing an example of the sensor mounting member shown in FIG. 27A.
- the sensor mounting member 3030 comprises a mounting member main body 3031, a mounting member moving mechanism 3032, a mounting member sensor 3033, and a mounting member imaging device 3034.
- the sensor mounting member 3030 can be moved by the mounting member moving mechanism 3032 provided below the mounting member main body 3031.
- the mounting member moving mechanism 3032 does not necessarily have to be provided.
- the mounting member body 3031 is, for example, a rod- or cane-shaped member, and its material is not particularly limited.
- the length of the mounting member body 3031 is longer than the height (back height) of the humanoid robot 3020, for example 2.1 meters.
- the mounting member body 3031 is provided with a mounting member movement mechanism 3032 below, preferably at the lower end, and a mounting member sensor 3033 above, preferably at the upper end, of the mounting member body 3031.
- the mounting member moving mechanism 3032 is configured with a rotating body such as a caster, and assists the sensor mounting member 3030 in moving in accordance with the movement of the humanoid robot 3020. Note that in this embodiment, it is not assumed that the sensor mounting member 3030 moves autonomously, but a mounting member control unit (not shown) that issues instructions to the mounting member moving mechanism 3032 may be provided, and the mounting member moving mechanism 3032 may be moved based on a signal from the mounting member control unit.
- the mounting member sensor 3033 (mounting member imaging device 3034) functioning as a first sensor is provided above the mounting member main body 3031 and senses the worker 400.
- An example of the mounting member sensor 3033 is similar to the robot sensor 3023, and an example of the mounting member imaging device 3034 is similar to the robot imaging device 3024.
- an example of the acquired sensor information is also similar to the robot sensor 3023, and an example of the sensing timing of the sensor information is also similar to the robot sensor 3023.
- the mounting member imaging device 3034 is included in the mounting member sensor 3033.
- the mounting member sensor 3033 which includes the mounting member imaging device 3034, is positioned at a position higher than the height (back height) of the humanoid robot 3020. This allows the mounting member sensor 3033 to sense the movements of the worker 400 from a position higher than the robot sensor 3023.
- FIG. 28 is a diagram showing an example of the configuration and functions of a sensing system according to Variation 3 of Embodiment 3.
- sensor mounting member 3030 is configured to be able to communicate wirelessly or via wire with the first information processing device of first humanoid robot 3020a.
- sensor mounting member 3030 may be configured to be able to communicate with communication unit 3064 of management control device 3060 instead of or together with the first information processing device.
- the configurations of first humanoid robot 3020a, third humanoid robot 3020c functioning as a work robot, and management control device 3060 of sensing system 3100'' are the same as those of sensing system 3100.
- the first humanoid robot 3020a grasps the sensor mounting member 3030 with the right gripping part (or left gripping part), which is part of the robot arm constituting the arm drive mechanism.
- the mounting member sensor 3033 (mounting member imaging device 3034) of the sensor mounting member 3030 can change its sensing area 3330 (imaging area 3340) by the arm drive mechanism.
- the processing unit of management control device 3060 instructs attachment member sensor 3033 (attachment member imaging device 3034) functioning as the first sensor to sense worker 400, and first robot sensor 3023a (first robot imaging device 3024a) functioning as the second sensor to sense third humanoid robot 3020c. Then, from the sensor information (first information and second information) acquired by each sensor, operation control information is generated through learning by the learning unit of management control device 3060, and an adjustment unit of management control device 3060 compares the first information with the second information and adjusts the operation control information so that the robot operation of third humanoid robot 3020c approximates the specified operation of worker 400.
- the roles (functions as the first sensor and the second sensor) of the mounting member sensor 3033 (mounting member imaging device 3034) and the first robot sensor 3023a (first robot imaging device 3024a) may be reversed. That is, the first robot sensor 3023a (first robot imaging device 3024a) may be instructed to sense the worker 400, and the mounting member sensor 3033 (mounting member imaging device 3034) to sense the third humanoid robot 3020c.
- when the mounting member sensor 3033 (mounting member imaging device 3034) is configured as the second sensor in this way, it becomes possible, for example, to check the operation of the work robot and control it appropriately even in a narrow space where it is difficult to deploy multiple humanoid robots that function as mobile robots to sense the worker 400.
- the mounting member sensor 3033 (mounting member imaging device 3034) is disposed at a position higher than the height (back height) of the humanoid robot 3020. This allows the movements of the worker 400 (or the working robot) to be sensed from a more bird's-eye viewpoint, making it easier to avoid situations where sensing is obstructed by, for example, the back of the worker 400 or the working robot, and allows efficient acquisition of the data necessary for learning the work of the worker 400 or the working robot.
- the humanoid robot functioning as a mobile robot does not have to be one, and the sensor mounting member does not have to be one.
- the humanoid robot 3020 may hold two sensor mounting members, one in each of its gripping parts 3265, 3266. In this case, too, the number of sensors can be increased, making it possible to obtain a large amount of sensor information at one time.
- the humanoid robot functioning as a work robot does not have to be one.
- two mobile robots (humanoid robots) equipped with sensors and moving mechanisms and one working robot (humanoid robot) are used, and one mobile robot is arranged for each worker 400 and each working robot.
- the relationship between the mobile robots, workers 400, and working robots is not limited to this.
- the management control device 3060 has been described as acquiring the first information and the second information simultaneously.
- the management control device 3060 may acquire the first information and the second information separately.
- the first sensing period, which is the period during which the first information is acquired by sensing the worker 400, and the second sensing period, which is the period during which the robot operation of the third humanoid robot 3020c is sensed and the second information is acquired, do not have to overlap. This means that the working robot does not have to be operated simultaneously in parallel with the worker 400, making it possible to flexibly adjust the robot operation in response to the specified operation of the worker 400.
- the learning of the specified actions of the worker has been described as being performed by automatic learning.
- the learning does not necessarily have to be automatic learning, and may be other known machine learning methods, such as deep learning, unsupervised/supervised learning, reinforcement learning, etc.
- the mobile robot and the working robot are described as being the same humanoid robot. In this case, it becomes possible to use the mobile robot in combination with the working robot, which can reduce the expenses and costs associated with robot production.
- the mobile robot and the working robot may be different robots.
- a worker (person) is used as the sensing target.
- a robot capable of imitating the predetermined movements of a worker may be used as the sensing target.
- the work robot has been described as being operated in the same work area 200 as the worker 400 and in the vicinity of the worker 400.
- the work robot does not have to be located in the vicinity of the worker, and does not have to be located in the same work area as the worker.
- FIG. 29A is a diagram showing an example of a system configuration of a behavior modification system according to embodiment 4 of the present disclosure.
- This behavior modification system includes a first humanoid robot 4020a and a second humanoid robot 4020b that function as working robots, and a third humanoid robot 4020c that functions as a mobile robot. Note that the number of humanoid robots is not limited to this.
- Each humanoid robot 4020a-c moves to the vicinity of a worker 400 working on the work line 201 of the work site 200 upon receiving instructions from a management control device 4060 (see FIG. 31) described later or from each information processing device provided in each humanoid robot 4020a-c. Then, this behavior modification system senses a predetermined behavior of the worker 400 using a third robot sensor 4023c (third robot imaging device 4024c) provided in the third humanoid robot 4020c.
- the predetermined behavior is wide-ranging; examples include assembling parts, moving parts, painting a product, and the worker's own movement.
- a known image recognition technology may be used, or the worker 400 and his/her predetermined behavior may be recognized by learning using a learning unit 4663 (see FIG. 32).
- This movement modification system learns a standard movement model corresponding to the specified movement of the worker 400 based on sensing information corresponding to the specified movement of the worker 400 acquired using the third robot sensor 4023c (third robot imaging device 4024c).
- the standard movement model is a model that represents the contents of the work items of the worker 400, in other words, a collection of the movements specified in the work items. This movement modification system then references the standard movement model to generate a modified movement model in which the execution time of each movement in the standard movement model is set shorter than the time required for each movement when the standard movement model was generated.
- This behavior modification system operates the first humanoid robot 4020a and the second humanoid robot 4020b, which function as work robots, according to instructions from the management control device 4060 or instructions from an information processing device provided in each humanoid robot. At this time, this behavior modification system refers to the modified behavior model.
- This behavior modification system can improve the work efficiency of each humanoid robot by operating the first humanoid robot 4020a and the second humanoid robot 4020b by referring to a modified behavior model in which the execution time of each behavior is set to be short.
- this behavior modification system operates each humanoid robot by referring to a modified behavior model set to operate in one-tenth the execution time of the behavior in the standard behavior model. This makes it possible to complete 1,000 products per hour.
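The throughput figure above can be checked with simple arithmetic. The sketch below assumes a baseline of 100 products per hour at the worker's standard pace (a figure implied by, but not stated in, the example); the 36-second cycle time and the function name are illustrative assumptions, not part of the disclosure.

```python
# Illustrative check of the throughput example: running each behavior at
# one-tenth of its standard execution time multiplies hourly output by 10.
# The 36-second standard cycle (100 products/hour) is an assumed baseline.

def products_per_hour(seconds_per_product: float) -> float:
    """Number of products completed in one hour at a fixed cycle time."""
    return 3600.0 / seconds_per_product

standard_cycle = 36.0                 # assumed: 100 products per hour
modified_cycle = standard_cycle / 10  # modified model: 1/10 execution time

print(products_per_hour(standard_cycle))  # 100.0
print(products_per_hour(modified_cycle))  # approximately 1000
```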
- the first humanoid robot 4020a and the second humanoid robot 4020b functioning as working robots may sense their respective robot movements using the first robot sensor 4023a (first robot image capture device 4024a) and the second robot sensor 4023b (second robot image capture device 4024b) provided therein, respectively.
- the first humanoid robot 4020a upon receiving an instruction from the management control device 4060 or in response to an instruction from the first information processing device 4025a, operates the first torso/head drive mechanism 4021a (see FIG. 31) so that the sensing area 4230a (imaging area 4240a) of the first robot sensor 4023a (first robot imaging device 4024a) targets the first grippers 4265a, 4266a of the first humanoid robot 4020a.
- the second humanoid robot 4020b operates the second torso/head drive mechanism so that the sensing area 4230b (imaging area 4240b) of the second robot sensor 4023b (second robot imaging device 4024b) targets the second grippers 4265b, 4266b of the second humanoid robot 4020b. This makes it possible to check whether the robot movements of each humanoid robot refer to the modified movement model.
- FIG. 29B is a diagram showing an example of the humanoid robot shown in FIG. 29A.
- the humanoid robot 4020, which functions as a work robot and a mobile robot, comprises a robot body 4021, a robot movement mechanism 4022, a robot sensor 4023, a robot imaging device 4024 included in the robot sensor 4023, an information processing device 4025, and a robot arm 4026.
- the humanoid robot 4020 can move using a robot movement mechanism 4022 provided below the robot body 4021, and moves to the vicinity of the work line 201 in the workplace 200 upon receiving instructions from outside the humanoid robot 4020, such as a management control device 4060, or by referring to a program stored in an information processing device 4025.
- the robot main body 4021 comprises a robot torso 4211 and a robot head 4212.
- the robot torso 4211 and the robot head 4212 constitute a torso/head drive mechanism, and are capable of changing the sensing area 4230 (imaging area 4240) of the robot sensor 4023 (robot imaging device 4024).
- the configuration of the drive mechanism is not particularly limited, and may be configured, for example, such that the robot head 4212 rotates a predetermined angle relative to the robot torso 4211, or the robot torso 4211 rotates a predetermined angle relative to the robot movement mechanism 4022, by a servo motor (not shown).
- a robot movement mechanism 4022 is provided below the robot torso 4211, a robot arm 4026 is provided on each side of the robot torso 4211, and a robot sensor 4023 is provided in the robot head 4212.
- An information processing device 4025 is also provided inside the robot main body 4021.
- the robot movement mechanism 4022 may be of any configuration, for example it may be provided with a rotating body driven by a motor, or may have legs that resemble the shape of a human leg. As an example, if the robot movement mechanism 4022 is configured to resemble the shape of a human leg, a servo motor is provided at the location corresponding to a human joint, and the movement mechanism is formed by rotating it by a predetermined angle.
- the robot sensor 4023 is preferably provided in the robot head 4212 and senses the movements of the worker 400, of other working robots, or of the humanoid robot 4020 itself, preferably the robot movements of the robot arm 4026, and more preferably the robot movements of the grippers 4255 and 4256.
- the robot sensor 4023 also sequentially acquires information representing at least the distance and angle between the robot arm 4026 and an object around the humanoid robot 4020 on which the humanoid robot 4020 is working.
- Examples of the robot sensor 4023 include a highest-performance camera, a thermal camera, a high-pixel-count, telephoto, ultra-wide-angle, 360-degree, high-performance camera, radar, solid-state LiDAR, LiDAR, a multi-color laser coaxial displacement meter, vision recognition, and various other sensor groups. These are also examples of the robot imaging device 4024.
- robot sensor 4023 examples include a vibration meter, a hardness meter, a micro vibration meter, an ultrasonic measuring device, a vibration measuring device, an infrared measuring device, an ultraviolet measuring device, an electromagnetic wave measuring device, a thermometer, a hygrometer, a spot AI weather forecast, a high-precision multi-channel GPS, low-altitude satellite information, or long-tail incident AI data.
- Examples of sensing information acquired from the robot sensor 4023 include images, distance, vibration, heat, smell, color, sound, ultrasound, radio waves, ultraviolet light, infrared light, humidity, etc., and image and distance information is preferably acquired by the robot imaging device 4024.
- the robot sensor 4023 (robot imaging device 4024) performs this sensing every nanosecond, for example.
- the sensing information is used, for example, for motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the movement and actions of the worker 400 in the workplace 200, and analysis of cornering, speed, and the like.
- the robot arm 4026 comprises a right arm 4261 and a left arm 4262.
- the right arm 4261 comprises a right gripping support part 4263 and a right gripping part 4265
- the left arm 4262 comprises a left gripping support part 4264 and a left gripping part 4266.
- the right gripping support part 4263 is a mechanism for supporting the right gripping part 4265
- the left gripping support part 4264 is a mechanism for supporting the left gripping part 4266, and may be shaped like a human arm, for example.
- the gripping parts 4265 and 4266 are mechanisms for gripping, for example, parts for work, and may be shaped like a human hand, for example.
- the robot arm 4026 constitutes an arm drive mechanism.
- the configuration of the drive mechanism is not particularly limited, and for example, if the robot arm 4026 is to resemble a human shape, a configuration may be adopted in which servo motors are provided at each joint location, such as a location corresponding to a human shoulder, a location corresponding to an elbow, a location corresponding to a wrist, a location corresponding to a finger joint, etc., and rotated by a predetermined angle.
- the humanoid robot 4020 may further be provided with a sensor, for example, on the robot torso 4211 (see FIG. 35B).
- the sensor is located at a different height than the robot sensor 4023 located on the robot head 4212. The different height allows the sensor to sense the movements of the worker 400 from different angles.
- Figure 30 shows an example of the relationship between the standard behavior model and the behavior modification model in this behavior modification system.
- the standard action model is a model that represents a set of actions specified in a work item, and includes multiple actions. As an example, there are a total of 26 actions specified in the work item, called action A, action B, action C, ..., and action Z.
- the movement modification model is a model in which the execution time of each movement in the standard movement model is set shorter than the time required for each movement when the standard movement model was generated. For example, assume that the required times in the standard movement model are T_A seconds for movement A and T_B seconds for movement B. In this case, the movement modification model sets the execution time for movement A to t_A seconds, which is shorter than T_A seconds, and the execution time for movement B to t_B seconds, which is shorter than T_B seconds. The same applies to movements C to Z. This makes the execution time shorter when a work robot operates using the movement modification model than when it operates using the standard movement model.
- it is sufficient that the execution time of at least one behavior be shorter than the time required for that behavior in the standard behavior model; it is not necessary that the execution time of every behavior included in the standard behavior model be shortened. In other words, as long as the execution time of at least one behavior is shorter than its required time in the standard behavior model, the model is regarded as one in which the execution time of each behavior is set shorter than the time required for each behavior when the standard behavior model was generated.
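The relationship between the standard model's required times (T_A, T_B, ...) and the modified model's execution times (t_A, t_B, ...) can be sketched as follows. The dictionary representation and the uniform 0.5 scale factor are illustrative assumptions; as noted above, the disclosure only requires that at least one execution time be shortened.

```python
# Minimal sketch of deriving a modified behavior model from a standard one.
# Assumption: each action maps to its required time in seconds, and a single
# scale factor (0.5 here) shortens every action, although the text only
# requires that at least one action be shortened.

def generate_modified_model(standard_model: dict[str, float],
                            scale: float = 0.5) -> dict[str, float]:
    """Return execution times t_X shorter than the required times T_X."""
    if not 0.0 < scale < 1.0:
        raise ValueError("scale must be in (0, 1) to shorten execution times")
    return {action: required * scale
            for action, required in standard_model.items()}

standard = {"A": 10.0, "B": 8.0, "C": 12.0}   # T_A, T_B, T_C in seconds
modified = generate_modified_model(standard)   # t_A=5.0, t_B=4.0, t_C=6.0
assert any(modified[a] < standard[a] for a in standard)  # required condition
```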
- FIG. 31 is a block diagram showing an example of the configuration and functions of the operation modification system 4100 of this embodiment.
- the behavior modification system 4100 includes a first humanoid robot 4020a, a second humanoid robot 4020b, a third humanoid robot 4020c, and a management control device 4060.
- the first humanoid robot 4020a, the second humanoid robot 4020b, and the third humanoid robot 4020c are each connected to the communication unit 4064 of the management control device 4060 via wireless or wired communication, and receive instructions from the management control device 4060 and transmit information acquired by each sensor.
- the humanoid robots 4020a-c may also be connected to each other via wireless or wired communication, and send and receive information acquired by each sensor and instructions.
- the first humanoid robot 4020a, which functions as a work robot, comprises a first body/head drive mechanism 4021a, a first robot movement mechanism 4022a, a first robot sensor 4023a, a first robot imaging device 4024a included in the first robot sensor 4023a, a first information processing device 4025a, and a first arm drive mechanism 4026a.
- the second humanoid robot 4020b, which functions as a work robot, and the third humanoid robot 4020c, which functions as a mobile robot, are also configured in the same manner as the first humanoid robot 4020a.
- the first information processing device 4025a includes a CPU (Central Processing Unit) 1212, a RAM (Random Access Memory) 1214, and a graphics controller 1216, which are interconnected by a host controller 1210.
- the first information processing device 4025a also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
- the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
- the storage device 1224 may be a hard disk drive, a solid state drive, etc.
- the first information processing device 4025a also includes input/output units such as a ROM (Read Only Memory) 1230 and a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
- the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
- the graphics controller 1216 acquires image data generated by the CPU 1212 into a frame buffer or the like provided in the RAM 1214 or into itself, and causes the image data to be displayed on the display device 1218.
- the communication interface 1222 communicates with other electronic devices via a network.
- the storage device 1224 stores programs and data used by the CPU 1212 in the first information processing device 4025a.
- the storage device 1224 may also store sensing information.
- the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
- the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
- the ROM 1230 stores therein a boot program, etc., which is executed by the first information processing device 4025a upon activation, and/or a program that depends on the hardware of the first information processing device 4025a.
- the input/output chip 1240 may also connect various input/output units to the input/output controller 1220 via a USB port, a parallel port, a serial port, a keyboard port, a mouse port, etc.
- the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
- the programs are read from the computer-readable storage medium, installed in the storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by the CPU 1212.
- the information processing described in these programs is read by the first information processing device 4025a, and brings about cooperation between the programs and the various types of hardware resources described above.
- An apparatus or method may be configured by realizing the operation or processing of information in accordance with the use of the first information processing device 4025a.
- the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
- the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214, the storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
- the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
- CPU 1212 may perform various types of processing on data read from RAM 1214, including various types of operations, information processing, conditional decisions, conditional branches, unconditional branches, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequences of the programs, and writes back the results to RAM 1214.
- CPU 1212 may also search for information in files, databases, etc. in the recording medium.
- the above-described program or software module may be stored in a computer-readable storage medium on the first information processing device 4025a or in the vicinity of the first information processing device 4025a.
- a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the first information processing device 4025a via the network.
- the management control device 4060 is a control device that issues instructions to each humanoid robot 4020a-c in order to realize the behavior modification system 4100.
- the management control device 4060 also acquires sensing information stored in the storage device of each information processing device.
- the management control device 4060 is composed of a CPU 4060A, a RAM 4060B, a ROM 4060C, an input/output unit (I/O) 4060D, a bus 4060E such as a data bus or a control bus that connects these, and a communication unit 4064.
- a storage medium 4062 is connected to the I/O 4060D.
- a communication unit 4064 is connected to the I/O 4060D, which transmits and receives sensing information, work manual information, schedule information, etc. to and from the control system of the humanoid robot 4020.
- the work manual information includes, for example, the name and content of each work item, the order of the work items, and information on the standard work time required for each work item.
- the schedule information includes, for example, information indicating the work time and start/end times for the entire work, information indicating the work time and start/end times for each work item, and information indicating the worker for each work item.
- FIG. 32 is a block diagram showing an example of the functions of the management control device 4060 in the operation modification system of this embodiment.
- the management control device 4060 includes a storage medium 4062, a communication unit 4064, and a processing unit 4066.
- the storage medium 4062 includes, for example, at least one of a semiconductor storage device, a magnetic tape device, a magnetic disk device, or an optical disk device.
- the storage medium 4062 stores driver programs, operating system programs, application programs, data, etc., used for processing in the processing unit 4066.
- the storage medium 4062 stores sensing information.
- the storage medium 4062 also stores work manual information and/or schedule information for the worker 400.
- the communication unit 4064 has a wireless communication interface circuit such as Wi-Fi (registered trademark) and/or a wired communication interface circuit such as Ethernet (registered trademark).
- the communication unit 4064 transmits and receives various information to and from the humanoid robots 4020a and 4020b through the interface circuits.
- the processing unit 4066 has one or more processors and their peripheral circuits.
- the processing unit 4066 centrally controls the overall operation of the operation modification system 4100, and is, for example, a CPU.
- the processing unit 4066 executes processing by referencing programs (driver programs, operating system programs, application programs, etc.) stored in the storage medium 4062.
- the processing unit 4066 can also execute multiple programs (application programs, etc.) in parallel.
- the processing unit 4066 includes a determination unit 4661, a control unit 4662, a learning unit 4663, and a model generation unit 4664. Each of these units is a functional module realized by a program executed by a processor included in the processing unit 4066. Alternatively, each of these units may be implemented in the processing unit 4066 as firmware.
- the determination unit 4661 determines whether the multiple sensors are sensing different targets. This determination may be made using a known image recognition technique, or may be made by referring to learning by the learning unit 4663.
- the control unit 4662 operates the first humanoid robot 4020a and/or the second humanoid robot 4020b, which function as working robots, by referring to the modified motion model generated by the model generation unit 4664.
- the control unit 4662 operates the robot movement mechanism, torso/head drive mechanism, and/or arm drive mechanism of each humanoid robot by referring to the modified motion model.
- the learning unit 4663 learns a standard action model corresponding to the predetermined action of the worker 400 based on sensing information corresponding to the predetermined action of the worker 400 acquired using the third robot sensor 4023c (third robot imaging device 4024c). This learning is performed, for example, by automatic learning, which is learning in which a learned model is automatically created and judgment/analysis is automatically performed using the learned model. Note that the learning unit 4663 may refer to work manual information and/or schedule information when generating the standard action model. This makes it possible to have each humanoid robot 4020 perform an appropriate action (task) without reflecting any inappropriate predetermined action of the worker 400.
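The disclosure leaves the concrete learning method open ("automatic learning"). As one purely hypothetical way the learning unit could extract a per-action required time T_X from repeated sensing observations, the median of the observed durations might be taken; the observation format and the median choice are both assumptions for illustration.

```python
# Hypothetical sketch: derive each action's required time for the standard
# model as the median of its observed durations across repeated sensing.
# The (action, seconds) observation format is an assumption; the source
# does not specify the learning algorithm.
from statistics import median

def learn_standard_model(observations: list[tuple[str, float]]) -> dict[str, float]:
    """Map each observed action to the median of its observed durations."""
    durations: dict[str, list[float]] = {}
    for action, seconds in observations:
        durations.setdefault(action, []).append(seconds)
    return {action: median(times) for action, times in durations.items()}

obs = [("A", 9.8), ("A", 10.2), ("B", 8.1), ("B", 7.9), ("A", 10.0)]
print(learn_standard_model(obs))  # {'A': 10.0, 'B': 8.0}
```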
- the model generation unit 4664 references the standard behavior model and generates a modified behavior model in which the execution time of each behavior in the standard behavior model is set shorter than the time required for each behavior when the standard behavior model was generated.
- FIG. 33 is an example of a flowchart showing the processing of the operation modification system of this embodiment.
- the third information processing device of the third humanoid robot 4020c instructs the third humanoid robot 4020c, which functions as a mobile robot, to move to the workplace 200 (step S4101).
- the movement is caused by the operation of the third robot movement mechanism of the humanoid robot 4020c.
- a movement instruction may also be given at this time to the humanoid robots 4020a and 4020b, which function as work robots.
- the sensing area 4230c (imaging area 4240c) of the third robot sensor 4023c (third robot imaging device 4024c) of the third humanoid robot 4020c targets the worker 400.
- the placement of the third humanoid robot 4020c is performed, for example, by storing a floor plan of the workplace 200 in the storage device and/or storage medium 4062 of the third humanoid robot 4020c in advance, and associating the position of the third humanoid robot 4020c with the stored floor plan.
- the placement of the third humanoid robot 4020c may be based on a position optimized through machine learning. The same applies to the placement of the first humanoid robot 4020a and the second humanoid robot 4020b.
- the third robot sensor 4023c (third robot imaging device 4024c) senses a predetermined movement of the worker 400 on the work line 201 (step S4102).
- the control unit 4662 issues an instruction to the third robot sensor 4023c (third robot imaging device 4024c) so that the sensing area 4230c (imaging area 4240c) targets the worker 400, and the third robot movement mechanism and each drive mechanism of the third humanoid robot 4020c are activated.
- the sensing information acquired by the third robot sensor 4023c (third robot imaging device 4024c) is stored in the storage medium 4062 via the storage device and/or communication unit 4064 of the third humanoid robot 4020c.
- the storage device and storage medium 4062 of each humanoid robot function as a storage unit.
- the management control device 4060 learns a standard action model corresponding to a specified action of the worker 400 based on the sensing information accumulated (in other words, stored) in the memory unit, and then, by referring to the standard action model, generates a modified action model in which the execution time of each action in the standard action model is set shorter than the required time of each action when the standard action model was generated (step S4103).
- for example, if the required time of one action in the standard action model is 10 seconds, the execution time of that action is set to 5 seconds in the modified action model.
- the control unit 4662 operates the first humanoid robot 4020a and the second humanoid robot 4020b, which function as working robots, by referring to the generated modified motion model (step S4104).
- the first humanoid robot 4020a and the second humanoid robot 4020b can perform the work (predetermined motion) of the worker 400 faster than the specified motion.
- the modified motion model is set to operate in half the execution time of the motion in the standard motion model, so that each humanoid robot 4020 can perform the work twice as fast as the specified motion of the worker 400.
- FIG. 34 is an example of a flowchart showing more detailed processing of the modified motion model processing shown in step S4103 of FIG. 33.
- the sensing information is stored in the memory unit (step S4201), and the learning unit 4663 learns a standard action model corresponding to the specified action of the worker 400 based on the sensing information corresponding to the specified action of the worker 400 acquired using the third robot sensor 4023c (third imaging device 4024c) (step S4202).
- the learning unit 4663 generates a standard action model based on the learning result (step S4203). Note that the learning unit 4663 may refer to the work manual information and/or the schedule information of the worker 400 when generating the standard action model.
- the model generation unit 4664 generates a modified motion model by referring to the standard motion model (step S4204).
- the modified motion model is a model in which the execution time of each motion in the standard motion model is set to be shorter than the time required for each motion when the standard motion model was generated, so that the robot motions of each humanoid robot 4020a, 4020b operated by referring to the modified motion model are faster than the robot motions of the same robots operated by referring to the standard motion model.
- in this way, the work robot can be operated by referring to a modified movement model whose execution times are set shorter than the time required for each movement when the standard movement model was generated, enabling the work robot to work efficiently.
- in the action modification system 4100 of this embodiment, when a standard action model is generated, the work manual information and/or the schedule information are referenced, and the standard action model is then generated.
- the worker 400 does not always perform actions faithful to the work, and in some cases may perform unnecessary actions or omit necessary actions. Therefore, by referencing the work manual information and the schedule information, it is possible to prevent unnecessary or inappropriate predetermined actions by the worker 400 from being reflected in the standard action model.
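The filtering role of the work manual described above could be sketched as follows. Representing the manual as a list of permitted action names, and the observed action records themselves, are assumptions made only for illustration; the disclosure does not specify the manual's data format.

```python
# Illustrative sketch: keep unnecessary worker actions (ones not in the
# work manual) out of the standard action model. The manual format (a list
# of permitted action names) is an assumption.

def filter_observed_actions(observed: list[str],
                            manual_items: list[str]) -> list[str]:
    """Keep only observed actions that appear in the work manual,
    preserving the observed order."""
    allowed = set(manual_items)
    return [action for action in observed if action in allowed]

observed = ["A", "stretch", "B", "C"]   # "stretch" is an unnecessary action
manual = ["A", "B", "C"]
print(filter_observed_actions(observed, manual))  # ['A', 'B', 'C']
```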
- FIGS. 35A and 35B are diagrams showing an example of a behavior modification system according to Modification Example 1 of this embodiment.
- FIG. 35A is a diagram showing an example of the system configuration of a movement modification system according to Variation 1 of the fourth embodiment of the present disclosure.
- This movement modification system senses and learns each of the predetermined movements of a plurality of sensing targets (workers 400a, 400b).
- a torso sensor 4023'' (torso imaging device 4024'')
- this movement modification system does not necessarily require a management control device 4060, and the movement modification system can be configured using the humanoid robot 4020' alone.
- FIG. 35B is a diagram showing an example of the humanoid robot shown in FIG. 35A.
- the humanoid robot 4020', which functions as a mobile and working robot, comprises a robot body 4021', a robot movement mechanism 4022', a head sensor 4023', a head image capture device 4024' included in the head sensor 4023', a torso sensor 4023'', a torso image capture device 4024'' included in the torso sensor 4023'', an information processing device 4025', and a robot arm 4026'.
- the robot main body 4021' comprises a robot torso 4211' and a robot head 4212'.
- the robot torso 4211' and the robot head 4212' constitute a torso/head drive mechanism 4021' (see FIG. 36), and it is possible to change the sensing area 4230' (imaging area 4240') of the head sensor 4023' (head imaging device 4024') and the sensing area 4230'' (imaging area 4240'') of the torso sensor 4023'' (torso imaging device 4024'').
- the head sensor 4023' (head imaging device 4024') and the torso sensor 4023'' (torso imaging device 4024'') are positioned at different heights, so the torso sensor 4023'' (torso imaging device 4024'') senses the specified movements of each sensing target (workers 400a, 400b) from a different position than the head sensor 4023' (head imaging device 4024').
- FIG. 36 is a block diagram showing an example of the functions of a humanoid robot in this movement modification system.
- the information processing device 4025' includes an information processing unit 4066', a communication interface 1222', and a storage device 1224', and the information processing unit 4066' includes a determination unit 4661', a control unit 4662', a learning unit 4663', and a model generation unit 4664'. That is, in the movement modification system 4100', the information processing unit 4066' performs the same processing as the processing unit 4066 of the management control device 4060.
- the information processing device 4025' is configured to be able to communicate with the head sensor 4023' (head imaging device 4024'), the torso sensor 4023'' (torso imaging device 4024''), the torso/head driving mechanism 4021', the robot moving mechanism 4022', and the arm driving mechanism 4026'.
- the humanoid robot 4020' of the behavior modification system 4100' is equipped with an information processing unit 4066' in an information processing device 4025', so the humanoid robot 4020' alone constitutes the behavior modification system.
- the control unit 4662' of the humanoid robot 4020' instructs the torso sensor 4023'' (torso image capture device 4024'') to sense the worker 400a, and the head sensor 4023' (head image capture device 4024') to sense the worker 400b.
- the present movement modification system 4100' is provided with a plurality of sensors for sensing a plurality of different sensing targets, and a plurality of pieces of sensing information corresponding to the predetermined movements of a plurality of workers are acquired using the plurality of sensors.
- the determination unit 4661' may determine whether each sensor is sensing a different target.
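The determination described above — checking that each sensor is sensing a different target — can be sketched as follows. This is a minimal illustration; the function name and identifiers are hypothetical and not part of this disclosure.

```python
# Hypothetical sketch: verify that each sensor in the movement modification
# system is assigned a different sensing target. Names are illustrative.

def targets_are_distinct(sensor_assignments):
    """Return True if every sensor senses a different target.

    sensor_assignments: dict mapping sensor id -> sensed target id.
    """
    targets = list(sensor_assignments.values())
    return len(targets) == len(set(targets))


assignments = {
    "head_sensor": "worker_400b",   # head sensor senses worker 400b
    "torso_sensor": "worker_400a",  # torso sensor senses worker 400a
}
print(targets_are_distinct(assignments))  # True: the two targets differ
```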
- FIG. 37 is an example of a flowchart showing more detailed processing of the modified behavior model generation process in the behavior modification system according to Variation 1 of the disclosed embodiment 4.
- This movement modification system is different from the fourth embodiment in that in S4102, sensing of the predetermined movements of multiple workers is performed by multiple sensors.
- this movement modification system includes step S4103', which replaces S4103.
- in step S4103', first, each piece of sensing information acquired by the head sensor 4023' (head imaging device 4024') and the torso sensor 4023'' (torso imaging device 4024'') is stored in the memory unit (storage device 1224') (step S4201').
- the learning unit 4663' learns each of the predetermined movements of multiple workers (two in this modified example) based on multiple pieces of sensing information (two in this modified example) (step S4202').
- the learning unit 4663' also learns multiple (two in this modified example) standard action models corresponding to the respective predetermined actions of the multiple workers based on the multiple pieces of sensing information (step S4203'). For example, if the work of worker 400a is made up of actions A to M, while the work of worker 400b is made up of actions N to Z, the standard action models will be a first standard action model made up of actions A to M and a second standard action model made up of actions N to Z. Note that when generating the standard action models, the learning unit 4663' may refer to work manual information and/or schedule information.
- the model generation unit 4664' generates a modified action model that integrates at least a portion of the specified actions performed by multiple workers (step S4204').
- FIG. 38 is a diagram showing an example of the relationship between a standard operation model and an operation modification model in an operation modification system according to Variation 1 of the fourth embodiment of the present disclosure.
- motion M is a movement to hand over a certain part.
- motion N is a movement to receive that part.
- the model generation unit 4664' integrates motions A to L, which are part of the motions in the first standard motion model, and motions O to Z, which are part of the motions in the second standard motion model, to generate a modified motion model consisting of motions A to L and motions O to Z.
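The integration performed by the model generation unit 4664' can be sketched as follows, assuming each motion is represented by its label A to Z. The function name and data representation are hypothetical, not from this disclosure.

```python
from string import ascii_uppercase

# Hypothetical sketch of step S4204': integrate the first standard motion model
# (motions A..M of worker 400a) and the second (motions N..Z of worker 400b)
# into a modified motion model, omitting the handover motions that become
# unnecessary when a single robot performs both tasks (M: hand over the part,
# N: receive the part).

first_standard_model = list(ascii_uppercase[:13])   # motions A..M
second_standard_model = list(ascii_uppercase[13:])  # motions N..Z

def integrate_models(model_a, model_b, omit=("M", "N")):
    """Concatenate the two motion sequences, skipping omitted motions."""
    return [m for m in model_a + model_b if m not in omit]

modified_model = integrate_models(first_standard_model, second_standard_model)
print("".join(modified_model))  # ABCDEFGHIJKLOPQRSTUVWXYZ
```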
- with this movement modification system 4100', when operating a humanoid robot 4020' that also functions as a work robot, it is possible to refer to the generated modified motion model and have a single humanoid robot 4020' perform the specified motions that were performed by multiple workers, omitting unnecessary motions as needed. As a result, the work robot can be made to work efficiently.
- with this movement modification system, if six workers initially complete 100 products per hour on one line 201 in the workplace 200, then placing three humanoid robots 4020' on line 201 makes it possible to complete the same 100 products per hour. In particular, as mentioned above, when multiple workers perform different tasks, the humanoid robot 4020' can perform those tasks collectively, omitting unnecessary actions as needed.
- the model generation unit 4664' may generate, from the modified motion model, a second modified motion model in which the execution time of each motion is set shorter than the time each motion required when the modified motion model was generated.
- by referring to a second modified motion model set to perform each motion in half the execution time measured when the modified motion model was generated, the humanoid robot 4020' can complete 200 products per hour.
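Generating such a second modified motion model amounts to scaling each motion's execution time, as in the following sketch. Motion names, durations, and the scale factor are illustrative assumptions.

```python
# Hypothetical sketch: derive a "second modified motion model" by setting each
# motion's execution time shorter than the time measured when the modified
# motion model was generated (here, half). Data is illustrative.

def make_second_modified_model(modified_model, speedup=0.5):
    """Scale every motion's execution time by `speedup` (< 1.0 shortens it)."""
    return [
        {"motion": m["motion"], "exec_time_s": m["exec_time_s"] * speedup}
        for m in modified_model
    ]

modified_model = [
    {"motion": "A", "exec_time_s": 4.0},
    {"motion": "B", "exec_time_s": 6.0},
]
second_model = make_second_modified_model(modified_model)
# Halving every execution time doubles hourly throughput
# (100 products/hour -> 200 products/hour in the example above).
print(second_model)
```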
- the humanoid robot 4020' can constitute a behavior modification system by itself, so that even in places where communication with the management control device 4060 is not possible, for example, the working robot can work efficiently by using a learning model that has learned the work of a worker.
- the humanoid robot 4020' itself is equipped with multiple (two in this modified example) sensors (imaging devices), making it possible to learn a worker's work even in a space too small to sense multiple workers, for example.
- in this movement modification system, multiple standard movement models corresponding to the predetermined movements of multiple workers are learned, and a modified movement model that integrates at least some of the predetermined movements of the multiple workers is generated by referring to the standard movement models.
- This makes it possible to substitute for the work (predetermined movements) of multiple workers with a smaller number of work robots than the number of workers, thereby improving work efficiency.
- a second modified model is generated in which the execution time of each movement in the modified movement model is set shorter than the time required for each movement at the time the modified movement model was generated, and the second modified model is referred to in operating the work robot, making it possible to have the work robot work more efficiently.
- the humanoid robot that functions as both a mobile robot and a work robot need not be a single robot as in the example shown; there may be multiple such robots.
- as the number of humanoid robots increases, the number of sensors increases in proportion, making it possible to obtain more sensing information at one time; and since the number of work robots also increases, work efficiency can be improved, for example, when each work robot is made to perform the same task.
- the movement modification system 4100 has been described as having one mobile robot (humanoid robot) sensing the worker 400.
- the standard action model is generated (S4103) after sensing (S4102) of the worker 400.
- the process from sensing to generating the standard action model does not necessarily have to be performed continuously.
- sensing information is stored (S4201), and learning with reference to the sensing information (S4202) may be performed a predetermined time (e.g., 24 hours or one week) after the sensing.
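This deferred timing can be sketched as below: sensing records carry a timestamp, and learning proceeds only once the predetermined interval has elapsed. The record layout and function name are hypothetical.

```python
import datetime

# Hypothetical sketch: sensing information is stored with a timestamp (S4201),
# and learning (S4202) is deferred until a predetermined interval (24 hours
# here) has elapsed since sensing. All names are illustrative.

LEARNING_DELAY = datetime.timedelta(hours=24)

def ready_for_learning(record, now):
    """True once the predetermined delay after sensing has elapsed."""
    return now - record["sensed_at"] >= LEARNING_DELAY

record = {"sensed_at": datetime.datetime(2024, 1, 1, 9, 0), "data": [0.1, 0.2]}
print(ready_for_learning(record, datetime.datetime(2024, 1, 2, 9, 0)))  # True
```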
- S4103 to S4105 may be performed while S4102 is being performed.
- the work robot is operated with reference to the modified action model while the worker 400 is performing a predetermined action, which can further improve work efficiency.
- the learning of the specified actions of the worker has been described as being performed by automatic learning.
- the learning does not necessarily have to be automatic learning, and may be other known machine learning methods, such as deep learning, unsupervised/supervised learning, reinforcement learning, etc.
- the mobile robot and the work robot are described as being the same humanoid robot. In this case, it is possible to use the mobile robot in combination with the work robot, which can reduce the expenses and costs associated with robot production. However, the mobile robot and the work robot may be different robots.
- in this movement modification system, the second modified model is not particularly limited as long as the execution time of each behavior in the standard behavior model is set shorter than the time each behavior required when the standard behavior model was generated.
- in this work reproduction system, when an abnormality such as an accident or malfunction occurs, the worker 400 re-performs the action taken at the time the abnormality occurred, and the second robot sensor 5023b (second robot imaging device 5024b) senses this predetermined action of the worker 400.
- this work reproduction system learns a standard action model corresponding to the predetermined action of the worker 400 based on the first sensing information corresponding to the predetermined action of the worker 400.
- the standard action model is a model that corresponds to the predetermined action of the worker 400 and represents the action specified as the expected action in a predetermined work item.
- this work reproduction system refers to the standard action model and causes the first humanoid robot 5020a, which is the work reproduction robot, to perform the reproduction action one or more times.
- This work reproduction system inputs accident or malfunction information, and detects the occurrence of an accident or malfunction based on second sensing information corresponding to the reproduction action of the first humanoid robot 5020a obtained using a sensor. This makes it possible to analyze problems with the movements of the worker 400 and shortcomings in the standard movements used in the work of the worker 400.
- FIG. 39A is a diagram showing an example of sensing when a worker reproduces an action.
- This task reproduction system includes a first humanoid robot 5020a that functions as a task reproduction robot and a second humanoid robot 5020b that functions as a mobile robot. Note that the number of humanoid robots is not limited to two.
- the task reproduction system senses the motion of the worker 400 using the second robot sensor 5023b (second robot imaging device 5024b) provided on the second humanoid robot 5020b.
- the motion of the worker 400 is the motion performed when the worker 400 places a component 320 in a location different from its original placement location 310 while assembling a printed circuit board 300 on the work line 201.
- the motion of the worker 400 may be recognized by the sensor using a known image recognition technology, or the motion of the worker 400 may be recognized by learning using a learning unit 5663 (see FIG. 42). The same applies to the motion reproduced by the task reproduction robot described later.
- the second humanoid robot 5020b senses the movements of the worker 400 using the second robot sensor 5023b (second robot imaging device 5024b).
- the first sensing information acquired by the second robot sensor 5023b (second robot imaging device 5024b) is stored in the storage medium 5062 (see FIG. 41) of the management control device 5060 or in the storage device of the second humanoid robot 5020b.
- the management control device 5060 learns a standard movement model corresponding to the specified movement of the worker 400 based on the first sensing information corresponding to the specified movement of the worker 400.
- the standard movement model is stored in the storage medium 5062 of the management control device 5060, and/or in the storage device 1224 (see FIG.
- the specified actions include various actions that occur before and after the occurrence of an abnormality, such as actions to grab an object, actions to assemble parts, and actions when using tools.
- FIG. 39B is a diagram showing an example of sensing when the task reproducing robot is made to reproduce an action.
- the management control device 5060 generates an action instruction to operate the first humanoid robot 5020a by referring to the stored standard action model.
- the action instruction is an instruction generated by referring to the standard action model, and is an instruction to operate the task reproducing robot (the first humanoid robot 5020a in this embodiment).
- the management control device 5060 operates the first humanoid robot 5020a by referring to the action instruction, for example, when the first humanoid robot 5020a is placed in a reproduction site where the site where the abnormal situation occurred is reproduced.
- the management control device 5060 refers to the standard motion model and has the first humanoid robot 5020a perform the reproduction motion at least once, and preferably multiple times.
- the work process is clearly defined, and accidents and malfunctions rarely occur. Therefore, when an abnormal situation occurs, there is a possibility that the occurrence of an accident or malfunction cannot be detected by having the first humanoid robot 5020a perform the reproduction motion only once. Therefore, by having the first humanoid robot 5020a perform the reproduction motion multiple times, it becomes easier to detect the occurrence of an accident or malfunction.
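Because an accident or malfunction is rare, a single reproduction run may not trigger it; repeating the reproduction motion raises the chance that the detection step observes the anomaly. A minimal sketch of this repetition loop, with an illustrative stand-in for the robot's reproduction run:

```python
# Hypothetical sketch: run the reproduction motion repeatedly, stopping as soon
# as an accident or malfunction is detected. `reproduce_once` stands in for one
# reproduction run by the first humanoid robot; all names are illustrative.

def reproduce_until_detected(reproduce_once, max_runs=10):
    """Run the reproduction motion up to `max_runs` times; stop on detection."""
    for run in range(1, max_runs + 1):
        if reproduce_once():
            return run          # anomaly detected on this run
    return None                 # not detected within max_runs

# Simulated anomaly that only appears on the third reproduction:
outcomes = iter([False, False, True])
print(reproduce_until_detected(lambda: next(outcomes)))  # 3
```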
- the management control device 5060 receives the accident or malfunction information and detects the occurrence of an accident or malfunction based on the second sensing information corresponding to the reproduced behavior of the first humanoid robot 5020a acquired using the second robot sensor 5023b (second robot imaging device 5024b).
- the accident information is information about the accident, such as who performed what action, when and where.
- the malfunction information is information indicating the malfunction when there was an error in the behavior of the relevant parties when an abnormal situation occurred.
- FIG. 40 is a diagram showing an example of a humanoid robot in this task reproduction system.
- the humanoid robot 5020, which functions as a task reproduction robot and a mobile robot, comprises a robot body 5021, a robot movement mechanism 5022, a robot sensor 5023, a robot imaging device 5024 that may be included in the robot sensor 5023, an information processing device 5025, and a robot arm 5026.
- the humanoid robot 5020 can move using a robot movement mechanism 5022 provided below the robot body 5021, and moves, for example, to the workplace 200 upon receiving instructions from outside the humanoid robot 5020, such as from a management control device 5060, or by referring to a program stored in an information processing device 5025.
- the robot main body 5021 comprises a robot torso 5211 and a robot head 5212.
- the robot torso 5211 and the robot head 5212 constitute a torso/head drive mechanism, and are capable of changing the sensing area 5230 (imaging area 5240) of the robot sensor 5023 (robot imaging device 5024).
- the configuration of the drive mechanism is not particularly limited, and may be configured, for example, such that the robot head 5212 rotates a predetermined angle relative to the robot torso 5211, or the robot torso 5211 rotates a predetermined angle relative to the robot movement mechanism 5022, by a servo motor (not shown).
- a robot movement mechanism 5022 is provided below the robot torso 5211, a robot arm 5026 is provided on each side of the robot torso 5211, and a robot sensor 5023 is provided in the robot head 5212.
- An information processing device 5025 is also provided inside the robot main body 5021.
- the robot movement mechanism 5022 may be of any configuration, for example it may be provided with a rotating body driven by a motor, or may have legs that resemble the shape of a human leg. As an example, if the robot movement mechanism 5022 is configured to resemble the shape of a human leg, a servo motor is provided at the location that corresponds to a human joint, and the movement mechanism is formed by rotating it by a predetermined angle.
- the robot sensor 5023 is provided in the robot head 5212 and is capable of sensing the movements of the worker 400 and the task reproduction robot.
- the robot sensor 5023 also sequentially acquires information representing at least the distance and angle between an object around the humanoid robot 5020 on which the humanoid robot 5020 is working and the robot arm 5026.
- examples of the robot sensor 5023 include a highest-performance camera, a thermal camera, a high-pixel, telephoto, ultra-wide-angle, 360-degree, high-performance camera, radar, solid-state LiDAR, LiDAR, a multi-color laser coaxial displacement meter, vision recognition, and various other sensor groups. These are also examples of the robot imaging device 5024.
- other examples of the robot sensor 5023 include a vibration meter, a hardness meter, a micro-vibration meter, an ultrasonic measuring device, a vibration measuring device, an infrared measuring device, an ultraviolet measuring device, an electromagnetic wave measuring device, a thermometer, a hygrometer, spot AI weather forecasting, high-precision multi-channel GPS, low-altitude satellite information, and long-tail incident AI data.
- Examples of sensor information acquired from the robot sensor 5023 include images, distance, vibration, heat, smell, color, sound, ultrasound, radio waves, ultraviolet light, infrared light, humidity, etc., and the image and distance information is preferably acquired by the robot imaging device 5024.
- the robot sensor 5023 (robot imaging device 5024) performs these sensing operations, for example, every nanosecond.
- the sensor information is used, for example, for motion capture of the movements of the worker 400 and the first humanoid robot 5020a, a 3D map of the workplace 200, navigation of the movement and travel of the worker 400 and the first humanoid robot 5020a in the workplace 200, and analysis of cornering, speed, etc.
- the robot arm 5026 comprises a right arm 5261 and a left arm 5262.
- the right arm 5261 comprises a right gripping support part 5263 and a right gripping part 5265
- the left arm 5262 comprises a left gripping support part 5264 and a left gripping part 5266.
- the right gripping support part 5263 is a mechanism for supporting the right gripping part 5265
- the left gripping support part 5264 is a mechanism for supporting the left gripping part 5266, and may be shaped like a human arm, for example.
- the gripping parts 5265 and 5266 are mechanisms for gripping, for example, parts for work, and may be shaped like a human hand, for example.
- the robot arm 5026 constitutes a second drive mechanism.
- the configuration of the drive mechanism is not particularly limited, and for example, if the robot arm 5026 is to resemble a human shape, a configuration may be adopted in which servo motors are provided at each joint location, such as a location corresponding to a human shoulder, a location corresponding to an elbow, a location corresponding to a wrist, a location corresponding to a finger joint, etc., and rotated at a predetermined angle.
- the humanoid robot 5020 may further be provided with a sensor, for example, on the robot torso 5211 (see FIG. 46B).
- the sensor is located at a different height than the robot sensor 5023 located on the robot head 5212. The different height allows the sensor to sense the movements of the worker 400 and the first humanoid robot 5020a from different angles.
- FIG. 41 is a block diagram showing an example of the configuration and functions of the task reproduction system 5100 of this embodiment.
- the task reproduction system 5100 includes a first humanoid robot 5020a, a second humanoid robot 5020b, and a management control device 5060.
- the first humanoid robot 5020a and the second humanoid robot 5020b are each connected to a communication unit 5064 of the management control device 5060 via wireless or wired communication, and receive instructions from the management control device 5060 and transmit information acquired by each sensor.
- the first humanoid robot 5020a and the second humanoid robot 5020b may also be connected to each other via wireless or wired communication, and transmit and receive information and instructions acquired by each sensor.
- the first humanoid robot 5020a, which functions as a task reproduction robot, is equipped with a first robot movement mechanism 5022a, a first robot sensor 5023a, a first robot imaging device 5024a included in the first robot sensor 5023a, a first information processing device 5025a, a first body/head drive mechanism 5021a, and a first arm drive mechanism 5026a.
- the second humanoid robot 5020b, which functions as a mobile robot, has the same configuration as the first humanoid robot 5020a.
- the first information processing device 5025a includes a CPU (Central Processing Unit) 1212, a RAM (Random Access Memory) 1214, and a graphics controller 1216, which are interconnected by a host controller 1210.
- the first information processing device 5025a also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
- the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
- the storage device 1224 may be a hard disk drive, a solid state drive, etc.
- the first information processing device 5025a also includes input/output units such as a ROM (Read Only Memory) 1230 and a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
- the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
- the graphics controller 1216 acquires image data generated by the CPU 1212 in a frame buffer or the like provided in the RAM 1214 or in itself, and causes the image data to be displayed on the display device 1218.
- the communication interface 1222 communicates with other electronic devices via a network.
- the storage device 1224 stores programs and data used by the CPU 1212 in the first information processing device 5025a.
- the storage device 1224 may also store sensing information.
- the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
- the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
- the ROM 1230 stores therein a boot program, etc., which is executed by the first information processing device 5025a upon activation, and/or a program that depends on the hardware of the first information processing device 5025a.
- the input/output chip 1240 may also connect various input/output units to the input/output controller 1220 via a USB port, a parallel port, a serial port, a keyboard port, a mouse port, etc.
- the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
- the programs are read from the computer-readable storage medium, installed in the storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by the CPU 1212.
- the information processing described in these programs is read by the first information processing device 5025a, and brings about cooperation between the programs and the various types of hardware resources described above.
- An apparatus or method may be configured by realizing the operation or processing of information in accordance with the use of the first information processing device 5025a.
- the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
- the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214, the storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
- the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
- the CPU 1212 may perform various types of processing on data read from the RAM 1214, including various types of operations, information processing, conditional decisions, conditional branches, unconditional branches, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequences of the programs, and write back the results to the RAM 1214.
- CPU 1212 may also search for information in files, databases, etc. in the recording medium.
- the above-described program or software module may be stored on a computer-readable storage medium on the first information processing device 5025a or in the vicinity of the first information processing device 5025a.
- a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the first information processing device 5025a via the network.
- the management control device 5060 is a control device that issues instructions to the humanoid robots 5020a and 5020b in order to realize the task reproduction system 5100.
- the management control device 5060 also acquires sensing information stored in the storage devices of the humanoid robots 5020a and 5020b.
- the management control device 5060 is composed of a CPU 5060A, a RAM 5060B, a ROM 5060C, an input/output unit (I/O) 5060D, a bus 5060E such as a data bus or a control bus that connects these, and a communication unit 5064.
- a storage medium 5062 is connected to the I/O 5060D.
- a communication unit 5064 is connected to the I/O 5060D, which transmits and receives sensing information, work manual information, schedule information, etc. to and from the control system of the humanoid robot 5020.
- the work manual information includes, for example, the name and content of each work item, the order of the work items, and information on the standard work time required for each work item.
- the schedule information includes, for example, information indicating the work time and start/end times for the entire work, information indicating the work time and start/end times for each work item, and information indicating the worker for each work item.
- FIG. 42 is a block diagram showing an example of the functions of the management control device 5060 in the work reproduction system of this embodiment.
- the management control device 5060 includes a storage medium 5062, a communication unit 5064, and a processing unit 5066.
- the storage medium 5062 includes, for example, at least one of a semiconductor storage device, a magnetic tape device, a magnetic disk device, or an optical disk device.
- the storage medium 5062 stores driver programs, operating system programs, application programs, data, etc. used for processing in the processing unit 5066.
- the storage medium 5062 stores sensing information.
- the storage medium 5062 may also store work manual information and/or schedule information for the worker 400.
- the communication unit 5064 has a wireless communication interface circuit such as Wi-Fi (registered trademark) and/or a wired communication interface circuit such as Ethernet (registered trademark).
- the communication unit 5064 transmits and receives various information to and from the humanoid robots 5020a and 5020b through the interface circuits.
- the processing unit 5066 has one or more processors and their peripheral circuits.
- the processing unit 5066 centrally controls the overall operation of the task reproduction system 5100, and is, for example, a CPU.
- the processing unit 5066 executes processing by referencing programs (driver programs, operating system programs, application programs, etc.) stored in the storage medium 5062.
- the processing unit 5066 can also execute multiple programs (application programs, etc.) in parallel.
- the processing unit 5066 includes a determination unit 5661, a control unit 5662, a learning unit 5663, a generation unit 5664, an input unit 5665, and a detection unit 5666.
- Each of these units is a functional module realized by a program executed by a processor included in the processing unit 5066. Alternatively, each of these units may be implemented in the processing unit 5066 as firmware.
- the determination unit 5661 determines whether or not a sensing target (worker 400 or first humanoid robot 5020a) is being sensed.
- the determination method may use known image recognition technology or may be based on learning by the learning unit 5663 (see FIG. 42).
- the control unit 5662 causes the first humanoid robot 5020a to perform the reproduction action one or more times by referring to the standard action model. Furthermore, when the control unit 5662 determines that the sensing target (the worker 400 or the first humanoid robot 5020a) is not being sensed, it operates the second body/head drive mechanism and the second robot movement mechanism of the second humanoid robot 5020b.
- the learning unit 5663 learns a standard action model corresponding to the specified action of the worker 400 based on the first sensing information corresponding to the specified action of the worker 400. This learning is performed by automatic learning, which is, for example, learning that automatically creates a learned model and automatically performs judgment/analysis using the learned model.
- the generation unit 5664 generates a standard operation model by referring to the learning results by the learning unit 5663.
- the generation unit 5664 also generates operation instructions.
- the input unit 5665 inputs accident or malfunction information.
- this input may come from outside the work reproduction system 5100; alternatively, the accident or malfunction information may be stored in advance in the storage medium 5062, the storage device 1224 of the first information processing device 5025a, and/or the storage device of the second information processing device, and read out and input in response to an instruction from the management control device 5060.
- the detection unit 5666 detects the occurrence of an accident or malfunction based on the second sensing information corresponding to the reproduced movement of the first humanoid robot 5020a acquired using the second robot sensor 5023b (second robot imaging device 5024b). Furthermore, as described below, the detection unit 5666 detects the occurrence of movement that differs from the work manual information or the schedule information based on the second sensing information corresponding to the reproduced movement of the work reproduction robot acquired using the sensor. Examples of detection targets include differences in how each piece of information (data) changes over time, and significant discrepancies found when comparing pieces of data.
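The comparison of data over time can be sketched as below: flag the time steps where the reproduced motion's sensed trajectory deviates from a reference trajectory by more than a threshold. The data, threshold, and function name are illustrative assumptions, not the disclosed detection algorithm.

```python
# Hypothetical sketch of the detection unit's comparison: report sample indices
# where the reproduced motion deviates significantly from the reference
# (e.g. work-manual- or schedule-derived) trajectory. Data is illustrative.

def detect_discrepancies(reference, reproduced, threshold=0.5):
    """Return indices of samples whose absolute deviation exceeds threshold."""
    return [
        i for i, (ref, rep) in enumerate(zip(reference, reproduced))
        if abs(ref - rep) > threshold
    ]

reference  = [0.0, 0.1, 0.2, 0.3, 0.4]   # expected value over time
reproduced = [0.0, 0.1, 0.9, 0.3, 0.4]   # reproduced motion with one outlier
print(detect_discrepancies(reference, reproduced))  # [2]
```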
- FIG. 43 is an example of a flowchart showing the processing of the task reproduction system of this embodiment.
- the work reproduction system 5100 uses the second robot sensor 5023b (second robot imaging device 5024b) of the second humanoid robot 5020b to sense the actions of the worker 400 in the workplace 200 (step S5101).
- the actions of the worker 400 are actions that are a reproduction of the actions that the worker 400 actually performed when an abnormality occurred, and preferably, the actions are performed in the workplace 200 that faithfully reproduces the situation when the abnormality occurred.
- the control unit 5662 or the second information processing device operates the second body/head drive mechanism and the second robot movement mechanism of the second humanoid robot 5020b so that the sensing area 5230b (imaging area 5240b) of the second robot sensor 5023b (second robot imaging device 5024b) of the second humanoid robot 5020b includes the movements of the worker 400.
- the sensing information (first sensing information) acquired by the second robot sensor 5023b (second robot imaging device 5024b) is stored in the storage medium 5062 via the storage device and/or communication unit 5064 of the second information processing device.
- the storage device and storage medium 5062 of each information processing device function as a storage unit.
- the management control device 5060 learns a standard action model corresponding to the predetermined action of the worker 400 based on the first sensing information stored in the memory unit, in other words, the stored information, and generates a standard action model by referring to the learning results (step S5102).
- FIG. 44 is an example of a flowchart showing more detailed processing of the worker action learning/standard action model generation processing shown in step S5102 of FIG. 43.
- the first sensing information is stored in the storage unit (step S5201), and the learning unit 5663 learns a standard action model corresponding to a predetermined action of the worker 400 based on the first sensing information (step S5202).
- the generating unit 5664 then generates a standard action model by referring to the learning result by the learning unit 5663 (step S5203).
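The three-step flow above (store in step S5201, learn in step S5202, generate in step S5203) can be sketched as follows. This is a deliberately simplified illustration in which "learning" is reduced to averaging stored joint samples; all names and data layouts are assumptions, not the actual implementation.

```python
storage_unit = []  # S5201: storage unit for the first sensing information

def store_sensing_info(sample):
    storage_unit.append(sample)

def learn(samples):
    # S5202: "learning" reduced to the per-joint mean over all samples.
    n = len(samples)
    return [sum(joint) / n for joint in zip(*samples)]

def generate_standard_action_model(learned):
    # S5203: wrap the learning result into a named model object.
    return {"type": "standard_action_model", "joint_means": learned}

# Mock first sensing information: each sample holds two joint angles.
for sample in ([0.0, 1.0], [0.2, 1.2], [0.4, 1.4]):
    store_sensing_info(sample)
model = generate_standard_action_model(learn(storage_unit))
```

A real learning unit would use motion capture data and machine learning rather than averaging, but the store/learn/generate pipeline shape is the same.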
- learning by the learning unit 5663 may involve analysis of motion capture of the movements of the worker 400, a 3D map of the workplace 200, navigation of the worker 400's movements within the workplace 200, and analysis of cornering, speed, etc., and may learn optimal movements of the humanoid robot 5020, which can also function as a work reproduction robot, through automatic learning. This makes it possible to analyze a specific movement of the worker 400 from multiple perspectives at once, reducing the time and cost required for analyzing and programming the movements of the worker 400.
- the first humanoid robot 5020a is placed at a predetermined position (step S5103).
- the first humanoid robot 5020a can be placed at the predetermined position, for example, by storing a floor plan of the workplace 200, which is an example of the predetermined position, in a storage unit in advance, and associating the position of the first humanoid robot 5020a with the stored floor plan, and then activating the first movement mechanism 5022a of the first humanoid robot 5020a to move it to the position.
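One hedged way to picture this placement step is a grid-based floor plan in which the robot's position is associated with a cell and the movement mechanism is stepped toward the target cell; the grid representation and the `move_to` helper below are purely illustrative assumptions.

```python
def move_to(current, target):
    """Step through grid cells of a stored floor plan, one axis at a
    time, and return the path of intermediate positions visited."""
    x, y = current
    tx, ty = target
    path = []
    while (x, y) != (tx, ty):
        if x != tx:
            x += 1 if tx > x else -1
        elif y != ty:
            y += 1 if ty > y else -1
        path.append((x, y))  # each step drives the movement mechanism once
    return path

# Move the robot from its associated cell (0, 0) to the predetermined
# position (2, 1) on the stored floor plan.
path = move_to((0, 0), (2, 1))
```

An actual system would plan around obstacles on the floor plan, or use a position optimized through machine learning as mentioned above, rather than this axis-by-axis walk.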
- the placement of the first humanoid robot 5020a may be based on a position optimized through machine learning.
- the predetermined position is preferably the workplace 200, which faithfully reproduces the situation when an abnormality occurs.
- the control unit 5662 refers to the standard motion model and causes the first humanoid robot 5020a to perform the reproduction motion one or more times (step S5104). In other words, the first humanoid robot 5020a performs the reproduction motion one or more times based on the motion instruction.
- the input unit 5665 inputs accident or malfunction information (step S5105).
- the task reproduction system 5100 senses the reproduction action of the first humanoid robot 5020a in the workplace 200 using the second robot sensor 5023b (second robot imaging device 5024b) of the second humanoid robot 5020b (step S5106).
- the sensing information (second sensing information) acquired by the second robot sensor 5023b (second robot imaging device 5024b) is stored in the storage unit.
- the detection unit 5666 detects the occurrence of an accident or malfunction based on the second sensing information acquired using the second robot sensor 5023b (second robot imaging device 5024b) (step S5107).
- the task reproduction robot is made to perform a reproducing action one or more times with reference to the standard action model, and the occurrence of an accident or malfunction is detected based on the second sensing information corresponding to the reproducing action of the task reproduction robot.
- the task reproduction robot is made to reproduce the action of the worker 400 at the time of the occurrence of an abnormal situation, and the occurrence of an accident or malfunction can be detected through the reproducing action of the task reproduction robot, making it easier to clarify the cause of the occurrence of the abnormal situation.
- FIG. 45 is an example of a flowchart illustrating a process of the task reproduction system according to the first modification of the fifth embodiment of the present disclosure.
- steps S5101 to S5104 and step S5106 are the same as in FIG. 43, but step S5105 is omitted, and step S5107' is performed after step S5106.
- the work manual information or schedule information of the worker 400 is stored in the memory unit, and the detection unit 5666 detects the occurrence of an action that differs from the work manual information or schedule information based on the second sensing information corresponding to the reproduced action of the first humanoid robot 5020a acquired using the second robot sensor 5023b (second robot imaging device 5024b) (S5107').
- the work manual information or schedule information is information that represents actions and sequences that are considered to be proper. Therefore, by comparing the reproduced actions of the first humanoid robot 5020a, which were performed by referring to the standard action model, with the work manual information or schedule information, it is possible to check whether there are any flaws in the standard actions.
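The comparison suggested above might, under strong simplifying assumptions, look like the following sketch, where the reproduced action sequence is checked item by item against the ordered work manual and mismatches are reported as potential flaws; the data layout is invented for illustration.

```python
def check_against_manual(manual_sequence, reproduced_sequence):
    """Compare the reproduced actions against the proper order in the
    work manual; return (position, expected, actual) for each mismatch."""
    flaws = []
    for i, expected in enumerate(manual_sequence):
        actual = reproduced_sequence[i] if i < len(reproduced_sequence) else None
        if actual != expected:
            flaws.append((i, expected, actual))
    return flaws

# The reproduced actions skip "attach part", so two mismatches surface.
manual = ["pick part", "attach part", "inspect"]
reproduced = ["pick part", "inspect"]
flaws = check_against_manual(manual, reproduced)
```

A production system would align the sequences more robustly (e.g. by edit distance) rather than position by position, since a single skipped step shifts every later item.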
- Modification 2 of Embodiment 5: FIGS. 46A and 46B are diagrams showing an example of a task reproduction system according to Modification 2 of the present embodiment.
- FIG. 46A is a diagram showing an example of a system configuration in a task reproduction system according to Modification 2 of the fifth embodiment of the present disclosure.
- This task reproduction system is characterized in that a torso sensor 5023'' (torso imaging device 5024'') is provided in a second humanoid robot 5020' that functions as a mobile robot.
- an instruction is given so that the sensing area 5230' (imaging area 5240') of the head sensor 5023' (head imaging device 5024') of the second humanoid robot 5020' targets the first humanoid robot 5020a, and the sensing area 5230'' (imaging area 5240'') of the torso sensor 5023'' (torso imaging device 5024'') of the second humanoid robot 5020' targets the worker 400.
- the management control device 5060 is not necessarily required.
- FIG. 46B is a diagram showing an example of the humanoid robot shown in FIG. 46A.
- the second humanoid robot 5020' which functions as a mobile robot, comprises a robot body 5021', a robot movement mechanism 5022', a head sensor 5023', a head imaging device 5024' included in the head sensor 5023', a torso sensor 5023'', a torso imaging device 5024'' included in the torso sensor 5023'', an information processing device 5025', and a robot arm 5026'.
- the robot main body 5021' comprises a robot torso 5211' and a robot head 5212'.
- the robot torso 5211' and the robot head 5212' constitute a torso/head drive mechanism 5021' (see FIG. 47), and it is possible to change the sensing area 5230' (imaging area 5240') of the head sensor 5023' (head imaging device 5024') and the sensing area 5230'' (imaging area 5240'') of the torso sensor 5023'' (torso imaging device 5024'').
- the head sensor 5023' (head imaging device 5024') and the torso sensor 5023'' (torso imaging device 5024'') are positioned at different height positions, so the torso sensor 5023'' (torso imaging device 5024'') and the head sensor 5023' (head imaging device 5024') sense the movements of the worker 400 and the first robot 5020a functioning as a task reproduction robot from different positions.
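As one hypothetical illustration of what two sensors at different heights make possible, the following sketch triangulates the horizontal distance to a commonly observed point from the two elevation angles; the geometry, the function, and the example numbers are assumptions, not taken from the disclosure.

```python
import math

def triangulate_distance(h_head, h_torso, angle_head, angle_torso):
    """Both sensors observe the same target point; angles are elevation
    angles in radians (negative = looking down). Solve for the
    horizontal distance d where the two lines of sight meet:
        h_head + d*tan(angle_head) == h_torso + d*tan(angle_torso)
    """
    return (h_torso - h_head) / (math.tan(angle_head) - math.tan(angle_torso))

# Head sensor at 1.5 m and torso sensor at 1.0 m both sight a point at
# height 0.5 m located 2 m away; the elevation angles encode that setup.
d = triangulate_distance(1.5, 1.0, math.atan(-0.5), math.atan(-0.25))
```

This is only one use of the two viewpoints; the document's main point is simply that different heights yield complementary sensing information about the worker and the task reproduction robot.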
- the configuration of the information processing device 5025' is the same as that of the first information processing device 5025a of the first humanoid robot 5020a.
- the robot arm 5026' is also the same as that of the first humanoid robot 5020a.
- Figure 47 is a block diagram showing an example of the functions of the humanoid robot 5020' in this task reproduction system.
- the information processing device 5025' comprises an information processing unit 5066', a communication interface 1222', and a storage device 1224'.
- the information processing unit 5066' comprises a judgment unit 5661', a control unit 5662', a learning unit 5663', a generation unit 5664', an input unit 5665', and a detection unit 5666'. That is, in the task reproduction system 5100', the information processing unit 5066' performs the same processing as the processing unit 5066 of the management control device 5060.
- the information processing device 5025' is configured to be able to communicate with the head sensor 5023' (head imaging device 5024'), the torso sensor 5023'' (torso imaging device 5024''), the torso/head driving mechanism 5021', the robot moving mechanism 5022', and the arm driving mechanism 5026'.
- the storage device 1224' may also store work manual information or process chart information.
- the second humanoid robot 5020' of the task reproduction system 5100' is equipped with an information processing unit 5066' in an information processing device 5025', and the first humanoid robot 5020a and the second humanoid robot 5020' are configured to be able to communicate with each other, making it possible to configure a task reproduction system without requiring a management control device 5060.
- the control unit 5662' of the second humanoid robot 5020' instructs the torso sensor 5023'' (torso imaging device 5024'') to sense the worker 400, and the head sensor 5023' (head imaging device 5024') functioning as the second sensor to sense the first humanoid robot 5020a functioning as the task reproduction robot.
- the roles of the head sensor 5023' (head imaging device 5024') and the torso sensor 5023'' (torso imaging device 5024'') may be reversed.
- the configuration may be such that the head sensor 5023' (head imaging device 5024') is instructed to sense the worker 400, and the torso sensor 5023'' (torso imaging device 5024'') to sense the first humanoid robot 5020a.
- the work reproduction system 5100' has the worker 400 reproduce the actions that occurred when an abnormality occurred, and then senses the specified actions of the worker 400 using the torso sensor 5023'' (torso imaging device 5024'').
- the learning unit 5663' learns a standard action model corresponding to the specified actions of the worker 400 based on the first sensing information corresponding to the specified actions of the worker 400.
- the control unit 5662' causes the first humanoid robot 5020a to perform the reproduced action one or more times by referring to the standard action model.
- the input unit 5665' inputs accident or malfunction information.
- the detection unit 5666' detects the occurrence of an accident or malfunction based on second sensing information corresponding to the reproduced actions of the first humanoid robot 5020a obtained using the head sensor 5023' (head imaging device 5024').
- This allows the first humanoid robot 5020a, which functions as a task reproduction robot, to reproduce the actions of the worker 400 when an abnormality occurs, and also makes it possible to detect the occurrence of an accident or malfunction through the reproducing actions of the task reproduction robot, making it easier to clarify the cause of the abnormality.
- the work reproduction system 5100' has the worker 400 reproduce the actions that occurred when an abnormality occurred, and then senses the specified actions of the worker 400 using the torso sensor 5023'' (torso imaging device 5024'').
- the learning unit 5663' learns a standard action model corresponding to the specified actions of the worker 400 based on the first sensing information corresponding to the specified actions of the worker 400.
- the control unit 5662' causes the first humanoid robot 5020a to perform the reproduced action one or more times by referring to the standard action model.
- the detection unit 5666' detects the occurrence of an action that differs from the work manual information or the schedule information based on the second sensing information corresponding to the reproduced action of the first humanoid robot 5020a acquired using the head sensor 5023' (head imaging device 5024'). This allows the first humanoid robot 5020a, which functions as a task reproduction robot, to reproduce the actions of the worker 400 when an abnormality occurs, and also makes it possible to identify shortcomings in standard actions through the actions reproduced by the task reproduction robot.
- the humanoid robot 5020' can constitute a work reproduction system by itself, and therefore, it is possible to provide a work reproduction system that makes it easy to identify the cause of an abnormality, even in a location where communication with the management control device 5060 is not possible.
- since the humanoid robot 5020' is equipped with multiple (two in this modified example) sensors (imaging devices), it is possible to provide a work reproduction system that makes it easy to identify the cause of an abnormality, even in a small space where it would be difficult to reproduce the work by arranging the worker 400 and the work reproduction robot side by side.
- the humanoid robot functioning as the mobile robot does not necessarily have to be one, but may be multiple.
- the number of sensors increases in proportion to the number of humanoid robots, making it possible to obtain a large amount of sensing information at one time.
- the number of mobile robots may be more than this.
- the mobile robot having a sensor and a movement mechanism senses both the worker 400 and the task reproduction robot.
- a single mobile robot can thus be used for both sensing tasks, reducing the cost and time required to manufacture robots.
- the sensing of the worker 400 and the sensing of the task reproduction robot may each be performed by a different mobile robot.
- each sensing task does not have to be performed by a mobile robot.
- the learning of the specified actions of the worker has been described as being performed by automatic learning.
- the learning does not necessarily have to be automatic learning, and may be other known machine learning methods, such as deep learning, unsupervised/supervised learning, reinforcement learning, etc.
- FIG. 48A is a diagram showing an example of a system configuration of a task mastery system according to embodiment 6 of the present disclosure.
- This task mastery system includes a first humanoid robot 6020a that functions as a mobile robot and a second humanoid robot 6020b that functions as a task reproduction robot. Note that the number of humanoid robots is not limited to two.
- the first humanoid robot 6020a receives instructions from a management control device 6060 (see FIG. 49) described later, or from an information processing device provided in the first humanoid robot 6020a, and moves to the vicinity of the first worker 400a (skilled worker 400a) working on the work line 201 in the workplace 200.
- the skilled worker 400a is a worker who performs a predetermined motion as a model.
- the work proficiency system senses the predetermined motion of the skilled worker 400a using the first robot sensor 6023a (first robot imaging device 6024a) provided in the first humanoid robot 6020a.
- the predetermined motions are diverse and include, for example, assembling parts, moving parts, painting a product, and the worker's own locomotion.
- the task training system activates the first movement mechanism and the first body/head drive mechanism of the first humanoid robot 6020a so that the first robot sensor 6023a (first robot imaging device 6024a) senses the specified movement of the skilled worker 400a.
- the recognition of the specified movement of the skilled worker 400a by each sensor may use a known image recognition technology, or the specified movement of the skilled worker 400a may be recognized by learning by the learning unit 6663 (see FIG. 50).
- This work training system stores a standard action model learned based on sensing information (first sensing information) corresponding to a specific action of the skilled worker 400a.
- the standard action model is a model that represents an action that corresponds to a specific action of the skilled worker 400a and is specified as an appropriate action for a specific work item.
- This work training system refers to the standard motion model and has the second humanoid robot 6020b reproduce the motion.
- This work training system also detects the differences in the motion of the new worker 400b from the standard motion model based on sensing information (second sensing information) corresponding to the motion of the new worker 400b acquired using the second robot sensor 6023b (second robot imaging device 6024b), which is a sensor capable of sensing the motion of the second worker 400b (new worker 400b).
- the new worker 400b is the worker who is the target for the work training.
- the second humanoid robot 6020b is made to reproduce the motion by referring to a standard motion model generated based on the predetermined motion of the experienced worker 400a who serves as a model, so that the second humanoid robot 6020b can execute motions that are faithful to the task.
- the reproduced motion of the second humanoid robot 6020b can serve as a model.
- by having the new worker 400b refer to the reproduced motion of the second humanoid robot 6020b, the worker can become proficient in the task.
- the new worker 400b performs a specified movement as part of the work, and the movement of the new worker 400b is sensed by the second robot sensor 6023b (second robot imaging device 6024b) of the second humanoid robot 6020b.
- This work training system detects the differences in the movements of the new worker 400b from the standard movement model based on the sensing information (second sensing information) acquired by the second robot sensor 6023b (second robot imaging device 6024b). This makes it possible to compare the model work of the experienced worker 400a with the work of the new worker 400b, and to help the new worker 400b become proficient in the work.
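A minimal sketch of this difference detection, assuming trajectories are equal-length sequences of scalar samples and using an invented tolerance, might look like the following.

```python
def motion_deviation(standard, observed):
    """Return (largest deviation, its sample index) between the standard
    motion model trajectory and the new worker's observed trajectory."""
    deviations = [abs(s - o) for s, o in zip(standard, observed)]
    worst = max(range(len(deviations)), key=deviations.__getitem__)
    return deviations[worst], worst

def needs_guidance(standard, observed, tolerance=0.2):
    """Flag the motion for training feedback when the worst sample
    deviates beyond the (illustrative) tolerance."""
    dev, _ = motion_deviation(standard, observed)
    return dev > tolerance

# The new worker overshoots at the third sample of the motion.
standard = [0.0, 0.3, 0.6, 0.9]
observed = [0.0, 0.35, 0.9, 0.9]
flagged = needs_guidance(standard, observed)
```

Real trajectories are multi-joint and may differ in timing, so a production comparison would likely use dynamic time warping or a learned distance rather than a sample-wise maximum.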
- this work training system makes it possible to send a work reproduction robot (second humanoid robot 6020b) that has learned the work of the domestic factory to the new factory and have it teach the work to new workers at the new factory.
- FIG. 48B is a diagram showing an example of the humanoid robot shown in FIG. 48A.
- the humanoid robot 6020 which functions as a mobile robot and a task reproduction robot, comprises a robot body 6021, a robot movement mechanism 6022, a robot sensor 6023, a robot imaging device 6024 included in the robot sensor 6023, an information processing device 6025, and a robot arm 6026.
- the humanoid robot 6020 can move using a robot movement mechanism 6022 provided below the robot body 6021, and moves to the vicinity of the work line 201 in the workplace 200 or performs work by receiving instructions from outside the humanoid robot 6020, such as a management control device 6060, or by referring to a program stored in an information processing device 6025.
- the robot main body 6021 comprises a robot torso 6211 and a robot head 6212.
- the robot torso 6211 and the robot head 6212 constitute a torso/head drive mechanism, and are capable of changing the sensing area 6230 (imaging area 6240) of the robot sensor 6023 (robot imaging device 6024).
- the configuration of the drive mechanism is not particularly limited, and may be configured such that, for example, the robot head 6212 rotates a predetermined angle relative to the robot torso 6211, or the robot torso 6211 rotates a predetermined angle relative to the robot movement mechanism 6022, by a servo motor (not shown).
- a robot movement mechanism 6022 is provided below the robot torso 6211, a robot arm 6026 is provided on each side of the robot torso 6211, and a robot sensor 6023 is provided in the robot head 6212.
- An information processing device 6025 is also provided inside the robot main body 6021.
- the robot movement mechanism 6022 may be of any configuration, for example it may be provided with a rotating body driven by a motor, or may have legs that resemble the shape of a human leg. As an example, if the robot movement mechanism 6022 is configured to resemble the shape of a human leg, a servo motor is provided at the location that corresponds to a human joint, and the movement mechanism is formed by rotating it by a predetermined angle.
- the robot sensor 6023 is preferably provided in the robot head 6212 and senses each worker, such as the experienced worker 400a and the new worker 400b.
- the robot sensor 6023 also sequentially acquires information representing at least the distance and angle between an object around the humanoid robot 6020 on which the humanoid robot 6020 is working and the robot arm 6026.
- Examples of the robot sensor 6023 include a high-performance camera, a thermal camera, a high-pixel, telephoto, ultra-wide-angle, or 360-degree high-performance camera, radar, solid-state LiDAR, LiDAR, a multi-color laser coaxial displacement meter, vision recognition, or various other sensor groups. These are also examples of the robot imaging device 6024.
- robot sensor 6023 examples include a vibration meter, hardness meter, micro vibration meter, ultrasonic measuring instrument, vibration measuring instrument, infrared measuring instrument, ultraviolet measuring instrument, electromagnetic wave measuring instrument, thermometer, hygrometer, spot AI weather forecast, high-precision multi-channel GPS, low altitude satellite information, or long-tail incident AI data.
- Examples of sensing information acquired from the robot sensor 6023 include images, distance, vibration, heat, smell, color, sound, ultrasound, radio waves, ultraviolet light, infrared light, humidity, etc., and image and distance information is preferably acquired by the robot imaging device 6024.
- the robot sensor 6023 (robot imaging device 6024) performs this sensing every nanosecond, for example.
- the sensing information is used, for example, for motion capture of the movements of each worker, a 3D map of the workplace 200, navigation of each worker's movements within the workplace 200, and analysis of cornering, speed, etc.
- the robot arm 6026 comprises a right arm 6261 and a left arm 6262.
- the right arm 6261 comprises a right gripping support part 6263 and a right gripping part 6265
- the left arm 6262 comprises a left gripping support part 6264 and a left gripping part 6266.
- the right gripping support part 6263 is a mechanism for supporting the right gripping part 6265
- the left gripping support part 6264 is a mechanism for supporting the left gripping part 6266, and may be shaped like a human arm, for example.
- the gripping parts 6265 and 6266 are mechanisms for gripping, for example, parts for work, and may be shaped like a human hand, for example.
- the robot arm 6026 constitutes an arm drive mechanism.
- the configuration of the drive mechanism is not particularly limited, and for example, if the robot arm 6026 is to resemble a human shape, a configuration may be adopted in which servo motors are provided at each joint location, such as a location corresponding to a human shoulder, a location corresponding to an elbow, a location corresponding to a wrist, a location corresponding to a finger joint, etc., and rotated by a predetermined angle.
- the humanoid robot 6020 may further be provided with a sensor, for example, on the robot torso 6211 (see FIG. 54B).
- the sensor is located at a different height than the robot sensor 6023 located on the robot head 6212. The different height positions allow the sensor to sense the movements of each worker from different angles.
- FIG. 49 is a block diagram showing an example of the configuration and functions of the task training system 6100 of this embodiment.
- the task training system 6100 includes a first humanoid robot 6020a, a second humanoid robot 6020b, and a management control device 6060.
- the first humanoid robot 6020a and the second humanoid robot 6020b are each connected to the communication unit 6064 of the management control device 6060 via wireless or wired communication, and receive instructions from the management control device 6060 and transmit information acquired by each sensor.
- the first humanoid robot 6020a and the second humanoid robot 6020b may also be connected to each other via wireless or wired communication, and transmit and receive information and instructions acquired by each sensor.
- the second humanoid robot 6020b that functions as a task reproduction robot includes a second robot movement mechanism 6022b, a second robot sensor 6023b, a second robot imaging device 6024b included in the second robot sensor 6023b, a second information processing device 6025b, a second torso/head drive mechanism 6021b, and a second arm drive mechanism 6026b.
- the second humanoid robot 6020b and the first humanoid robot 6020a that functions as a moving robot have the same configuration.
- the second information processing device 6025b includes a CPU (Central Processing Unit) 1212, a RAM (Random Access Memory) 1214, and a graphics controller 1216, which are interconnected by a host controller 1210.
- the second information processing device 6025b also includes input/output units such as a communication interface 1222, a storage device 1224, a DVD drive, and an IC card drive, which are connected to the host controller 1210 via an input/output controller 1220.
- the DVD drive may be a DVD-ROM drive, a DVD-RAM drive, etc.
- the storage device 1224 may be a hard disk drive, a solid state drive, etc.
- the second information processing device 6025b also includes input/output units such as a ROM (Read Only Memory) 1230 and a keyboard, which are connected to the input/output controller 1220 via an input/output chip 1240.
- the CPU 1212 operates according to the programs stored in the ROM 1230 and the RAM 1214, thereby controlling each unit.
- the graphics controller 1216 acquires image data generated by the CPU 1212 into a frame buffer or the like provided in the RAM 1214 or into itself, and causes the image data to be displayed on the display device 1218.
- the communication interface 1222 communicates with other electronic devices via a network.
- the storage device 1224 stores programs and data used by the CPU 1212 in the second information processing device 6025b.
- the storage device 1224 may also store sensing information.
- the DVD drive reads programs or data from a DVD-ROM or the like and provides them to the storage device 1224.
- the IC card drive reads programs and data from an IC card and/or writes programs and data to an IC card.
- the ROM 1230 stores therein a boot program, etc., which is executed by the second information processing device 6025b upon activation, and/or a program that depends on the hardware of the second information processing device 6025b.
- the input/output chip 1240 may also connect various input/output units to the input/output controller 1220 via a USB port, a parallel port, a serial port, a keyboard port, a mouse port, etc.
- the programs are provided by a computer-readable storage medium such as a DVD-ROM or an IC card.
- the programs are read from the computer-readable storage medium, installed in the storage device 1224, RAM 1214, or ROM 1230, which are also examples of computer-readable storage media, and executed by the CPU 1212.
- the information processing described in these programs is read by the second information processing device 6025b, and brings about cooperation between the programs and the various types of hardware resources described above.
- An apparatus or method may be configured by realizing the operation or processing of information in accordance with the use of the second information processing device 6025b.
- the CPU 1212 may execute a communication program loaded into the RAM 1214 and instruct the communication interface 1222 to perform communication processing based on the processing described in the communication program.
- the communication interface 1222 reads transmission data stored in a transmission buffer area provided in the RAM 1214, the storage device 1224, a DVD-ROM, or a recording medium such as an IC card, and transmits the read transmission data to the network, or writes received data received from the network to a reception buffer area or the like provided on the recording medium.
- the CPU 1212 may also cause all or a necessary portion of a file or database stored in an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc. to be read into the RAM 1214, and perform various types of processing on the data on the RAM 1214. The CPU 1212 may then write back the processed data to the external recording medium.
- an external recording medium such as the storage device 1224, a DVD drive (DVD-ROM), an IC card, etc.
- CPU 1212 may perform various types of processing on data read from RAM 1214, including various types of operations, information processing, conditional decisions, conditional branches, unconditional branches, information search/replacement, etc., as described throughout this disclosure and specified by the instruction sequences of the programs, and writes back the results to RAM 1214.
- CPU 1212 may also search for information in files, databases, etc. in the recording medium.
- the above-described program or software module may be stored in a computer-readable storage medium on the second information processing device 6025b or in the vicinity of the second information processing device 6025b.
- a recording medium such as a hard disk or RAM provided in a server system connected to a dedicated communication network or the Internet can be used as a computer-readable storage medium, thereby providing the program to the second information processing device 6025b via the network.
- the management control device 6060 is a control device that issues instructions to the humanoid robots 6020a and 6020b in order to realize the task training system 6100.
- the management control device 6060 also acquires sensing information stored in the storage devices of the humanoid robots 6020a and 6020b.
- the management control device 6060 is composed of a CPU 6060A, a RAM 6060B, a ROM 6060C, an input/output unit (I/O) 6060D, a bus 6060E such as a data bus or a control bus that connects these, and a communication unit 6064.
- a storage medium 6062 is connected to the I/O 6060D.
- a communication unit 6064 is connected to the I/O 6060D, and transmits and receives sensing information, work manual information, schedule information, etc., to and from the control system of the humanoid robot 6020.
- the work manual information includes, for example, the name and content of each work item, the order of the work items, and information on the standard work time required for each work item.
- the schedule information includes, for example, information indicating the work time and start/end times for the entire work, information indicating the work time and start/end times for each work item, and information indicating the worker for each work item.
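For illustration only, the work manual information and schedule information described above might be laid out as follows; every field name in this sketch is an assumption, not taken from the disclosure.

```python
# Work manual information: name, content, order, and standard work time
# for each work item.
work_manual = [
    {"name": "attach cover", "content": "fix cover with 4 screws",
     "order": 1, "standard_time_min": 5},
    {"name": "inspect seal", "content": "visual check of gasket",
     "order": 2, "standard_time_min": 2},
]

# Schedule information: times for the entire work, plus per-item times
# and the worker assigned to each work item.
schedule = {
    "overall": {"start": "09:00", "end": "17:00"},
    "items": [
        {"item": "attach cover", "start": "09:00", "end": "09:05",
         "worker": "400a"},
        {"item": "inspect seal", "start": "09:05", "end": "09:07",
         "worker": "400b"},
    ],
}

# Example derived quantity: total standard time across all work items.
total_standard_time = sum(item["standard_time_min"] for item in work_manual)
```

Structures like these are what the detection units compare sensed motions against when checking for deviations from the manual or the schedule.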
- FIG. 50 is a block diagram showing an example of the functions of the management control device 6060 in the work training system of this embodiment.
- the management control device 6060 includes a storage medium 6062, a communication unit 6064, and a processing unit 6066.
- the storage medium 6062 includes, for example, at least one of a semiconductor storage device, a magnetic tape device, a magnetic disk device, or an optical disk device.
- the storage medium 6062 stores driver programs, operating system programs, application programs, data, etc., used for processing in the processing unit 6066.
- the storage medium 6062 stores sensing information.
- the storage medium 6062 also stores worker work manual information and/or process chart information.
- the communication unit 6064 has a wireless communication interface circuit such as Wi-Fi (registered trademark) and/or a wired communication interface circuit such as Ethernet (registered trademark).
- the communication unit 6064 transmits and receives various information to and from the humanoid robots 6020a and 6020b through the interface circuits.
- the processing unit 6066 has one or more processors and their peripheral circuits.
- the processing unit 6066 centrally controls the overall operation of the task training system 6100, and is, for example, a CPU.
- the processing unit 6066 executes processing by referring to programs (driver programs, operating system programs, application programs, etc.) stored in the storage medium 6062.
- the processing unit 6066 can also execute multiple programs (application programs, etc.) in parallel.
- the processing unit 6066 includes a determination unit 6661, a control unit 6662, a learning unit 6663, a generation unit 6664, and a detection unit 6665. Each of these units is a functional module realized by a program executed by a processor included in the processing unit 6066. Alternatively, each of these units may be implemented in the processing unit 6066 as firmware.
- the determination unit 6661 determines whether or not a sensing target (experienced worker 400a or new worker 400b) is being sensed.
- the determination method may use known image recognition technology or may be based on learning by the learning unit 6663 (see FIG. 51).
- the control unit 6662 refers to the standard motion model and causes the second robot 6020b to perform the reproducing motion. Furthermore, when it is determined that the sensing target is not being sensed, the control unit 6662 operates the torso/head drive mechanism 6021 and the robot movement mechanism 6022 of each humanoid robot 6020.
- the learning unit 6663 learns a standard action model based on first sensing information corresponding to a predetermined action of the skilled worker 400a.
- the first sensing information is acquired by the first robot sensor 6023a (first robot imaging device 6024a) sensing the skilled worker 400a.
- the learning of the learning unit 6663 is performed by automatic learning, which is, for example, learning to automatically create a learned model and automatically perform judgment/analysis using the learned model.
- the generating unit 6664 generates a standard operation model by referring to the learning results of the learning unit 6663.
- the generating unit 6664 also generates each sensing instruction and warning instruction, which will be described later.
- the detection unit 6665 detects how the new worker 400b's movements differ from the standard movement model based on the second sensing information corresponding to the movements of the new worker 400b acquired using the second robot sensor 6023b (second robot imaging device 6024b). The detection unit 6665 also detects how the new worker 400b's movements differ from the work manual information or the schedule information based on the sensing information corresponding to the movements of the new worker 400b acquired using the second robot sensor 6023b (second robot imaging device 6024b). Examples of detection targets include differences in the changes in each piece of information (data) over time and significant discrepancies between data when compared.
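One simple way to realize the kind of comparison the detection unit 6665 performs is to align the trainee's pose trajectory with the standard motion model's trajectory and flag the time steps whose deviation exceeds a tolerance. The sketch below assumes both trajectories have already been resampled to a common length; the function name, feature layout, and threshold are illustrative assumptions, not details taken from this document:

```python
import numpy as np

def detect_deviations(worker_traj, model_traj, threshold=0.1):
    """Flag time steps where the worker's pose deviates from the model.

    worker_traj, model_traj: arrays of shape (T, K) holding K pose
    features (e.g. joint coordinates) per time step, resampled to a
    common length T beforehand. Returns the indices of time steps
    whose Euclidean distance from the model exceeds the threshold.
    """
    worker = np.asarray(worker_traj, dtype=float)
    model = np.asarray(model_traj, dtype=float)
    dist = np.linalg.norm(worker - model, axis=1)  # per-step distance
    return np.nonzero(dist > threshold)[0]
```

A real system would also have to handle differing execution speeds (e.g. via dynamic time warping) and compare elapsed times against the standard times in the work manual information.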
- Processing of the work proficiency system according to the sixth embodiment of the present disclosure: FIG. 51 is an example of a flowchart showing the processing of the task mastery system of this embodiment.
- the processing is executed mainly by the processing unit 6066 of the management control device 6060 in cooperation with each element of the task mastery system 6100 (the management control device 6060, the first humanoid robot 6020a, and the second humanoid robot 6020b) with reference to a control program stored in advance in the storage medium 6062, the storage device of the first information processing device, and/or the storage device 1224 of the second information processing device 6025b.
- the first information processing device issues an instruction to move the first humanoid robot 6020a, which functions as a mobile robot, to the workshop 200, in response to an instruction from the processing unit 6066 or an instruction to read a program stored in the storage medium 6062 or the storage device of the first information processing device.
- the movement is achieved by the operation of the first robot movement mechanism 6022 of the first humanoid robot 6020a.
- each sensing area 6230 (imaging area 6240) of the first robot sensor 6023a (first robot imaging device 6024a) can sense a specific movement of the skilled worker 400a.
- the placement of the first humanoid robot 6020a is performed, for example, by storing a floor plan of the workplace 200 in advance in the storage medium 6062 or the storage device of the first information processing device, and associating the position of the first humanoid robot 6020a with the stored floor plan.
- the position of the first humanoid robot 6020a may also be based on a position optimized through machine learning.
- the management control device 6060 instructs the first robot sensor 6023a (first robot imaging device 6024a) to sense a predetermined movement of the skilled worker 400a on the work line 201 (step S6101).
- the generation unit 6664 generates a first sensing instruction to operate the first robot sensor 6023a (first robot imaging device 6024a) of the first humanoid robot 6020a and the first robot movement mechanism and/or first body/head drive mechanism for the purpose of sensing the predetermined movement of the skilled worker 400a, and transmits the first sensing instruction to the first information processing device via the communication unit 6064.
- the CPU of the first information processing device receives the first sensing instruction via the communication interface of the first information processing device and starts a program that operates the first robot sensor 6023a (first robot imaging device 6024a) and the first robot movement mechanism and/or the first body/head drive mechanism.
- the storage unit stores sensing information (first sensing information) acquired by the first robot sensor 6023a (first robot imaging device 6024a).
- the storage device of the first information processing device, the storage device 1224 of the second information processing device 6025b, and the storage medium 6062 function as a storage unit.
- the communication unit 6064 acquires sensing information acquired using each sensor (imaging device) via the communication interface of each information processing device, and the storage medium 6062 stores the sensing information acquired by the communication unit 6064 via the I/O 6060D.
- the processing unit 6066 learns and generates a standard motion model based on the first sensing information accumulated (that is, stored) in the storage unit (step S6102). Since the first sensing information is sensing information corresponding to the predetermined motion of the skilled worker 400a, learning based on the first sensing information is synonymous with learning the predetermined motion of the skilled worker.
- FIG. 52 is an example of a flowchart showing more detailed processing of the worker action learning and model generation processing shown in step S6102 of FIG. 51.
- the memory unit stores the acquired first sensing information (step S6201), and the learning unit 6663 learns a standard motion model based on the first sensing information stored in the memory unit (step S6202).
- in the learning, motion capture of the movements of the skilled worker 400a, a 3D map of the workplace 200, navigation of the skilled worker 400a's movement and motions within the workplace 200, cornering, speed, etc. are analyzed, and the optimal movements of the humanoid robot 6020, which can also function as a work reproduction robot, are learned by automatic learning. This makes it possible to analyze the predetermined movements of the skilled worker 400a from multiple perspectives at once, reducing the time and cost required for analyzing and programming the movements of the skilled worker 400a.
- the generation unit 6664 generates a standard behavior model by referring to the learning results of the learning unit 6663 (step S6203).
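As a rough sketch of how a standard motion model could be distilled from the accumulated first sensing information, the function below averages several recordings of the skilled worker's motion into a mean trajectory with a per-step spread (all names are hypothetical; the actual learning here is described only as automatic learning):

```python
import numpy as np

def build_standard_model(demos):
    """Build a standard motion model from several sensing recordings.

    demos: list of arrays, each of shape (T, K), assumed resampled to
    a common length T. Returns (mean, std): the mean trajectory serves
    as the model motion, and the per-step standard deviation can later
    serve as a tolerance band when checking a trainee's motion.
    """
    stacked = np.stack([np.asarray(d, dtype=float) for d in demos])
    return stacked.mean(axis=0), stacked.std(axis=0)
```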
- control unit 6662 causes the second humanoid robot 6020b to perform a reproduction action by referring to the standard action model (step S6103). This allows the second humanoid robot 6020b to execute a reproduction action that serves as a model for other workers (new worker 400b).
- the new worker 400b, who is the worker to be trained in the work, performs a predetermined movement, and then the management control device 6060 instructs the second robot sensor 6023b (second robot imaging device 6024b) to sense the movement of the new worker 400b (step S6104).
- the generation unit 6664 generates a second sensing instruction to operate the second robot sensor 6023b (second robot imaging device 6024b) of the second humanoid robot 6020b and the second robot movement mechanism 6022b and/or the second body/head drive mechanism 6021b for the purpose of sensing the movement of the new worker 400b, and transmits the second sensing instruction to the second information processing device 6025b via the communication unit 6064.
- the CPU 1212 of the second information processing device receives the second sensing instruction via the communication interface 1222 and starts a program that operates the second robot sensor 6023b (second robot imaging device 6024b) of the second humanoid robot 6020b and the second robot movement mechanism 6022b and/or the second body/head drive mechanism 6021b.
- the detection unit 6665 detects differences in the movements of the new worker 400b from the standard movement model based on the second sensing information corresponding to the movements of the new worker 400b acquired using the second robot sensor 6023b (second robot imaging device 6024b) (step S6105). This makes it possible to check whether the movements of the new worker 400b, who is the target of training to become proficient at a task, differ from the model movements.
- the generating unit 6664 may refer to the detection result of the detecting unit 6665 and generate a detection result output instruction, which is an instruction to output the detection result.
- the method of outputting the detection result is not particularly limited; for example, an alarm (buzzer) may be provided in the humanoid robot 6020 or the management control device 6060 and the alarm may be activated, or a display function may be provided in the humanoid robot 6020 or the management control device 6060 and differences in operation may be indicated by the display function.
- FIG. 53 is an example of a flowchart showing more detailed processing of the motion detection processing shown in step S6105 of FIG. 51.
- the communication unit 6064 acquires the second sensing information via the communication interface 1222 of the second information processing device 6025b (step S6301).
- the second sensing information is sensing information acquired by the second robot sensor 6023b (second robot image capture device 6024b) and corresponds to the movement of the new worker 400b.
- the detection unit 6665 detects how the behavior of the new worker 400b differs from the standard behavior model based on the second sensing information acquired via the communication unit 6064 (step S6302).
- the detection unit 6665 also detects differences between the actions of the new worker 400b and the work manual information or schedule information based on the second sensing information acquired via the communication unit 6064 (step S6303).
- the work manual information or schedule information is information that represents actions and sequences that are essentially considered to be appropriate. Thus, by comparing the actions of the new worker 400b with the work manual information or schedule information, it becomes easier to confirm any erroneous actions by the new worker 400b.
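The comparison against the work manual's prescribed sequence can be pictured as a step-by-step check of recognized action labels against the manual's order. The sketch below is a minimal illustration under that assumption; action recognition itself (turning sensing information into labels) is outside its scope:

```python
def check_against_manual(observed, manual_order):
    """Compare an observed sequence of action labels with the order
    prescribed by the work manual.

    Returns a list of (index, expected, actual) tuples for positions
    where the sequences disagree; a missing trailing step appears
    with actual=None.
    """
    issues = []
    for i, expected in enumerate(manual_order):
        actual = observed[i] if i < len(observed) else None
        if actual != expected:
            issues.append((i, expected, actual))
    return issues
```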
- the generating unit 6664 may generate an alarm instruction to issue an alarm. This makes it easier for the new worker 400b to understand that the movement of the new worker 400b differs from the model predetermined movement.
- the method of issuing an alarm is not particularly limited.
- an alarm may be provided in the humanoid robot 6020 or the management control device 6060 and the alarm may be activated, or a display function may be provided in the humanoid robot 6020 or the management control device 6060 and a warning may be issued by the display function.
- control unit 6662 may refer to the standard movement model and have the second humanoid robot 6020b repeat the movement again. In this way, when the movement of the new worker 400b differs from the model predetermined movement, the new worker 400b can confirm the model predetermined movement again through the reproducing movement of the second humanoid robot 6020b, and the new worker 400b can become proficient in the work.
- a standard motion model is learned based on sensing information corresponding to a predetermined motion of the skilled worker 400a who serves as a model for the task, and the task reproducing robot operates by referring to the standard motion model.
- the task reproducing robot performs the predetermined motion as a model, and the task reproducing robot can be used as a reference for the worker to master the task.
- the task training system 6100 detects the differences between the movements of the new worker 400b and the standard movement model based on the second sensing information. This allows the new worker 400b to understand that the movements of the new worker 400b are different from the desired movements.
- differences in the movements of the new worker 400b from the work manual information or schedule information are detected based on the second sensing information.
- the experienced worker 400a performs a prescribed movement that serves as a model, but this does not mean that the movements are always faithful to the work, and in some cases the experienced worker may perform unnecessary movements or omit necessary movements. Therefore, by detecting differences by comparing with the work manual information or schedule information, which are information that represent movements and sequences that are essentially considered appropriate, it is possible to guide the movements of the new worker 400b to more desirable movements.
- (Variation 1 of the sixth embodiment) FIGS. 54A and 54B are diagrams showing an example of a task training system according to Variation 1 of this embodiment.
- FIG. 54A is a diagram showing an example of a system configuration in a task proficiency system according to variant 1 of embodiment 6 of the present disclosure.
- This task proficiency system is characterized in that a torso sensor 6023'' (torso image capture device 6024'') is provided in a humanoid robot 6020' that functions as a mobile robot and a task reproduction robot.
- the management control device 6060 is not necessarily required, and the task proficiency system can be configured with the humanoid robot 6020' alone.
- FIG. 54B is a diagram showing an example of the humanoid robot shown in FIG. 54A.
- the humanoid robot 6020' which functions as a mobile robot and a task reproduction robot, comprises a robot main body 6021', a robot movement mechanism 6022', a head sensor 6023', a head imaging device 6024' included in the head sensor 6023', a torso sensor 6023'', a torso imaging device 6024'' included in the torso sensor 6023'', an information processing device 6025', and a robot arm 6026'.
- the robot main body 6021' comprises a robot torso 6211' and a robot head 6212'.
- the robot torso 6211' and the robot head 6212' constitute, together with the torso/head drive mechanism 6021' (see FIG. 55), a structure capable of changing the sensing area 6230' (imaging area 6240') of the head sensor 6023' (head imaging device 6024') and the sensing area 6230'' (imaging area 6240'') of the torso sensor 6023'' (torso imaging device 6024'').
- the head sensor 6023' senses the experienced worker 400a, and the torso sensor 6023'' (torso imaging device 6024'') senses the new worker 400b.
- the head sensor 6023' (head imaging device 6024') and the torso sensor 6023'' (torso imaging device 6024'') are placed at different height positions, so the torso sensor 6023'' (torso imaging device 6024'') senses each movement of the sensing target from a different position from that of the head sensor 6023' (head imaging device 6024').
- the sensing target of the head sensor 6023' (head imaging device 6024') and the sensing target of the torso sensor 6023'' (torso imaging device 6024'') may be configured in reverse.
- the configuration of the information processing device 6025' is the same as that of the second information processing device 6025b of the second humanoid robot 6020b.
- the robot arm 6026' is also the same as that of the second humanoid robot 6020b.
- FIG. 55 is a block diagram showing an example of the functions of a humanoid robot in this task training system.
- the information processing device 6025' includes an information processing unit 6066', a communication interface 1222', and a storage device 1224', and the information processing unit 6066' includes a determination unit 6661', a control unit 6662', a learning unit 6663', a generation unit 6664', and a detection unit 6665'. That is, in the task training system 6100', the information processing unit 6066' performs the same processing as the processing unit 6066 of the management control device 6060.
- the information processing device 6025' is configured to be able to communicate with the head sensor 6023' (head imaging device 6024'), the torso sensor 6023'' (torso imaging device 6024''), the torso/head driving mechanism 6021', the robot moving mechanism 6022', and the arm driving mechanism 6026'.
- the humanoid robot 6020' of the work training system 6100' is equipped with an information processing device 6025' and an information processing unit 6066', so the humanoid robot 6020' alone constitutes the work training system.
- the humanoid robot 6020' is equipped with an information processing device 6025' that can communicate with a head sensor 6023' (head imaging device 6024'), which is a sensor capable of sensing a specific movement of the experienced worker 400a, and a torso sensor 6023'' (torso imaging device 6024''), which is a sensor capable of sensing the movement of the new worker 400b, and senses the experienced worker 400a using the head sensor 6023' (head imaging device 6024'), while sensing the new worker 400b using the torso sensor 6023'' (torso imaging device 6024'').
- the generation unit 6664' generates a third sensing instruction to operate the head sensor 6023' (head imaging device 6024') and the torso sensor 6023'' (torso imaging device 6024'') as well as the robot movement mechanism 6022' and/or the torso/head drive mechanism 6021' for the purpose of sensing the predetermined movements of the experienced worker 400a and the movements of the new worker 400b.
- the control unit 6662' references the third sensing instruction and starts a program to operate each mechanism.
- the communication interface 1222' acquires sensing information acquired by each sensor (each imaging device) and transmits it to the information processing unit 6066'.
- the learning unit 6663' learns a standard action model based on the first sensing information acquired using the head sensor 6023' (head imaging device 6024').
- the generation unit 6664' generates a standard action model by referring to the learning results of the learning unit 6663'.
- the storage device (storage unit) 1224' stores the standard action model generated by the generation unit 6664'.
- the control unit 6662' can cause the humanoid robot 6020' to reproduce the movement by referring to the standard movement model stored in the storage device 1224'. Furthermore, the detection unit 6665' detects the difference between the movement of the new worker 400b and the standard movement model based on the second sensing information corresponding to the movement of the new worker 400b acquired using the torso sensor 6023'' (torso image capture device 6024'').
- the storage unit 1224' also stores work manual information or schedule information, and the detection unit 6665' detects points where the movements of the new worker 400b differ from the work manual information or schedule information stored in the storage unit 1224' based on second sensing information corresponding to the movements of the new worker 400b acquired using the torso sensor 6023'' (torso image capture device 6024'').
- the humanoid robot 6020' can constitute a work training system by itself, and therefore, it is possible to sense each worker and reproduce exemplary actions even in places where communication with the management control device 6060 is not possible.
- the humanoid robot 6020' is equipped with multiple (two in this modified example) sensors (imaging devices), making it possible to sense each worker and perform model reproducing actions even in a place too cramped to deploy multiple robots for sensing each worker.
- the humanoid robot that functions as the mobile robot and task reproduction robot does not necessarily have to be one, but may be multiple.
- the number of sensors increases in proportion to the number of humanoid robots, making it possible to obtain a large amount of sensing information at one time.
- the task training system 6100 has been described as having one sensor-equipped humanoid robot for each worker.
- the number of sensor-equipped humanoid robots for each worker may be two or more.
- the experienced worker 400a and the new worker 400b are described as being present on the same work line 201.
- the experienced worker 400a and the new worker 400b may be present in different locations, and the humanoid robots that sense each worker may also be located in different locations.
- the learning of the predetermined actions of the skilled worker 400a has been described as being performed by automatic learning.
- the learning does not necessarily have to be automatic learning, and may be other known machine learning methods, such as deep learning, unsupervised/supervised learning, reinforcement learning, etc.
- the mobile robot and the task replicating robot are described as being the same humanoid robot. In this case, it is possible to use the mobile robot in combination with the task replicating robot, which can reduce the expenses and costs associated with robot production. However, the mobile robot and the task replicating robot may be different robots.
- a humanoid robot equipped with a sensor (imaging device), a moving mechanism, and a driving mechanism is used to sense each worker.
- it does not have to be a humanoid robot equipped with a moving mechanism and a driving mechanism.
- the second humanoid robot 6020b is caused to perform a reproducing action by referring to the standard action model (S6103), and then the action is detected (S6105).
- the processing of S6103 and S6105 does not necessarily have to be in this order, and the processing of S6103 may be performed after the processing of S6105.
- the task reproducing robot (humanoid robot 6020') that constitutes the task learning system alone has been described as being equipped with a sensor.
- the task reproducing robot itself does not necessarily need to be equipped with the sensor, as long as the system includes a sensor capable of sensing the motions of a worker and an information processing device capable of communicating with that sensor.
- the sensing information acquired by sensing the specified motion of the first worker corresponds to "first sensing information corresponding to the specified motion of a skilled worker."
- the sensing information acquired by sensing the motion of the second worker corresponds to "second sensing information corresponding to the motion of a new worker."
- 100 Work robot adjustment system, 200 Workplace, 201 Work line, 400 Worker, 20 Humanoid robot (mobile robot, work robot), 23 Robot sensor, 24 Robot imaging device, 25 Information processing device, 30 Sensor mounting member, 33 Mounting-member sensor, 34 Mounting-member imaging device, 60 Management control device
Abstract
Description
FIG. 1 is a diagram for explaining the work robot adjustment method. To avoid cluttering FIG. 1, among the reference signs for the sensing areas of the sensors, only the sensing area 330a of the mounting-member sensor 33 described later is labeled. Details of the work robot adjustment system 100 for performing the work robot adjustment method will be described later with reference to FIG. 3.
FIG. 4 is an example of a flowchart showing the processing of the work robot adjustment method of this embodiment.
FIG. 5 shows an example of a modification of the sensor mounting member.
FIGS. 6A and 6B are diagrams for explaining the sensing system.
FIG. 10 is an example of a flowchart showing the processing of the sensing system of this embodiment.
According to the sensing system 2100 of this embodiment, when a predetermined body part of the worker 400 (the sensing target) that moves during a predetermined motion is not being sensed, the first movement mechanism 2022a is operated so that the predetermined part is sensed. This prevents situations in which the predetermined part escapes sensing due to the worker 400's motion, enabling sufficient sensing.
FIGS. 13A and 13B are diagrams showing an example of a sensing system according to Variation 1 of this embodiment.
According to this sensing system, the humanoid robot 2020c can constitute a sensing system by itself, so sufficient sensing is possible even in places where, for example, communication with the management control device 2060 is not available.
FIGS. 15A and 15B are diagrams showing an example of a sensing system according to Variation 2 of this embodiment.
According to this sensing system, since the mounting-member sensor 2033 (mounting-member imaging device 2034) is configured as the second sensor, sufficient sensing is possible even in places too cramped for sensing the worker 400 otherwise.
FIGS. 17A and 17B are diagrams for explaining the sensing system.
FIG. 21 is an example of a flowchart showing the processing of the sensing system of this embodiment.
According to the sensing system 3100 of this embodiment, first information acquired by a first sensor that senses the predetermined motion of the worker 400 is compared with second information acquired by a second sensor that senses the work robot, and the motion control information is adjusted so that the robot motion of the work robot approximates the predetermined motion. This makes it possible to adjust the work robot's motion appropriately while confirming the robot motion of the work robot.
FIGS. 24A and 24B are diagrams showing an example of a sensing system according to Variation 1 of this embodiment.
According to this sensing system, the humanoid robot 3020' can constitute a sensing system by itself, so even in places where communication with the management control device 3060 is not available, the robot motion of the work robot can be adjusted to an appropriate motion while being confirmed.
FIG. 26 is a diagram showing an example of a system configuration of a sensing system according to Variation 2 of this embodiment.
FIGS. 27A and 27B are diagrams showing an example of a sensing system according to Variation 3 of this embodiment.
According to this sensing system, since the mounting-member sensor 3033 (mounting-member imaging device 3034) is configured as the second sensor, the work robot's motion can be confirmed and appropriately controlled when sensing the worker 400 even in places too cramped to install multiple humanoid robots functioning as mobile robots.
FIGS. 29A and 29B are diagrams for explaining the motion modification system.
FIG. 33 is an example of a flowchart showing the processing of the motion modification system of this embodiment.
According to the motion modification system 4100 of this embodiment, the work robot can be operated by referring to a modified motion model in which the execution time of each motion is set shorter than the time that motion required when the standard motion model was generated, enabling the work robot to work efficiently.
FIGS. 35A and 35B are diagrams showing an example of a motion modification system according to Variation 1 of this embodiment.
According to this motion modification system, the humanoid robot 4020' can constitute a motion modification system by itself, so even in places where communication with the management control device 4060 is not available, a work robot that works using a learned model trained on a worker's work can work efficiently.
FIGS. 39A and 39B are diagrams for explaining the work reproduction system.
FIG. 43 is an example of a flowchart showing the processing of the work reproduction system of this embodiment.
According to the work reproduction system 5100 of this embodiment, the work reproduction robot is caused to perform the reproducing motion one or more times with reference to the standard motion model, and the occurrence of an accident or malfunction is detected based on second sensing information corresponding to the reproducing motion of the work reproduction robot. This allows the motion of the worker 400 at the time of an abnormal event to be reproduced by the work reproduction robot, and the occurrence of the accident or malfunction to be detected through the robot's reproducing motion, making it easier to clarify the cause of the abnormal event.
FIG. 45 is an example of a flowchart showing the processing of a work reproduction system according to Variation 1 of Embodiment 5 of the present disclosure.
FIGS. 46A and 46B are diagrams showing an example of a work reproduction system according to Variation 2 of this embodiment.
According to this work reproduction system, the humanoid robot 5020' can constitute a work reproduction system by itself, so a work reproduction system that makes it easy to clarify the cause of an abnormal event can be provided even in places where communication with the management control device 5060 is not available.
FIGS. 48A and 48B are diagrams for explaining the work training system.
FIG. 51 is an example of a flowchart showing the processing of the work training system of this embodiment. The processing is executed mainly by the processing unit 6066 of the management control device 6060 in cooperation with each element of the work training system 6100 (the management control device 6060, the first humanoid robot 6020a, and the second humanoid robot 6020b), with reference to a control program stored in advance in the storage medium 6062, the storage device of the first information processing device, and/or the storage device 1224 of the second information processing device 6025b.
According to the work training system 6100 of this embodiment, a standard motion model is learned based on sensing information corresponding to the predetermined motion of the skilled worker 400a who serves as a model for the work, and the work reproduction robot operates by referring to the standard motion model. The work reproduction robot thus performs the predetermined motion as a model, and by letting workers use the robot's motion as a reference, workers can become proficient in the work.
FIGS. 54A and 54B are diagrams showing an example of a work training system according to Variation 1 of this embodiment.
According to this work training system, the humanoid robot 6020' can constitute a work training system by itself, so sensing of each worker and model reproducing motions are possible even in places where communication with the management control device 6060 is not available.
200 Workplace
201 Work line
400 Worker
20 Humanoid robot (mobile robot, work robot)
23 Robot sensor
24 Robot imaging device
25 Information processing device
30 Sensor mounting member
33 Mounting-member sensor
34 Mounting-member imaging device
60 Management control device
Claims (36)
- Moving a mobile robot having a sensor to an environment in which a worker operates,
recording the worker's motions using the sensor,
learning the worker's motions based on the recording,
causing a work robot to perform the same motions as the worker based on the learning, and
adjusting so that the worker's motions and the motions of the work robot coincide,
A work robot adjustment method comprising the above steps. - The work robot adjustment method according to claim 1, wherein the mobile robot and the work robot are the same robot.
- The sensor includes an imaging device, and
the imaging device is arranged at a position higher than the mobile robot,
The work robot adjustment method according to claim 1 or 2. - A first sensor for sensing a predetermined motion of a sensing target,
a second sensor for sensing the predetermined motion of the sensing target from a position different from the first sensor,
a first mobile robot comprising the first sensor and a first movement mechanism, and
a management control device capable of communicating with the first sensor, the second sensor, and the first movement mechanism,
wherein the management control device comprises:
a determination unit that determines, from first information acquired by the first sensor and second information acquired by the second sensor, whether a predetermined part of the sensing target that moves during the predetermined motion is being sensed; and
a control unit that, when the determination unit determines that the predetermined part is not being sensed, operates the first movement mechanism so that the predetermined part is sensed,
A sensing system characterized by the above. - The second sensor is arranged on a second mobile robot comprising a second movement mechanism, and
the control unit, when the determination unit determines that the predetermined part is not being sensed, operates the first movement mechanism and the second movement mechanism so that the predetermined part is sensed,
The sensing system according to claim 4. - The sensing system according to claim 5, wherein the control unit operates the first movement mechanism so that the first sensor senses one portion of the predetermined part, and operates the second movement mechanism so that the second sensor senses another portion of the predetermined part.
- Further comprising a work robot,
wherein the management control device further comprises:
a storage unit that stores the first information and the second information;
a learning unit that learns the predetermined motion by referring to the first information and the second information stored in the storage unit; and
a motion information generation unit that generates motion information giving a motion instruction to the work robot by referring to the learning result of the learning unit,
The sensing system according to any one of claims 4 to 6. - The storage unit stores, in advance, work manual information or process chart information of the sensing target, and
the motion information generation unit generates the motion information giving a motion instruction to the work robot by referring to the learning result of the learning unit and the work manual information or process chart information of the sensing target, The sensing system according to claim 7. - A first sensor for sensing a predetermined motion of a sensing target,
a second sensor for sensing the predetermined motion of the sensing target from a position different from the first sensor,
a first mobile robot comprising the first sensor and a first movement mechanism, and
a management control device capable of communicating with the first sensor, the second sensor, and the first movement mechanism,
wherein the management control device:
determines, from first information acquired by the first sensor and second information acquired by the second sensor, whether a predetermined part of the sensing target that moves during the predetermined motion is being sensed, and
when it is determined that the predetermined part is not being sensed, operates the first movement mechanism so that the predetermined part is sensed,
A sensing method characterized by the above. - A first sensor for sensing a predetermined motion of a sensing target,
a second sensor for sensing the predetermined motion of the sensing target from a position different from the first sensor,
a first mobile robot comprising the first sensor and a first movement mechanism, and
a management control device capable of communicating with the first sensor, the second sensor, and the first movement mechanism,
wherein the management control device comprises:
a determination unit that determines, from first information acquired by the first sensor and second information acquired by the second sensor, whether a predetermined part of the sensing target that moves during the predetermined motion is being sensed, and determines whether the predetermined part sensed by the first sensor and the predetermined part sensed by the second sensor are the same; and
a control unit that, when it is determined that the predetermined part sensed by the first sensor and the predetermined part sensed by the second sensor are the same, operates the first movement mechanism so that the predetermined part sensed by the first sensor differs from the predetermined part sensed by the second sensor,
A sensing system characterized by the above. - A mobile robot comprising:
a movement mechanism;
a first sensor that senses a sensing target;
a second sensor that senses the sensing target from a position different from the first sensor;
a drive mechanism capable of moving the position of the second sensor; and
an information processing unit that controls the first sensor, the second sensor, the movement mechanism, and the drive mechanism,
wherein the information processing unit comprises:
a determination unit that determines, from first information acquired by the first sensor and second information acquired by the second sensor, whether a predetermined part of the sensing target that moves during a predetermined motion is being sensed; and
a control unit that, when the determination unit determines that the predetermined part is not being sensed, operates the movement mechanism or the drive mechanism so that the predetermined part is sensed,
A mobile robot characterized by the above. - A mobile robot comprising:
a movement mechanism;
a first sensor that senses a sensing target;
a second sensor that senses the sensing target from a position different from the first sensor;
a drive mechanism capable of moving the position of the second sensor; and
an information processing unit that controls the first sensor, the second sensor, the movement mechanism, and the drive mechanism,
wherein the information processing unit comprises:
a determination unit that determines, from first information acquired by the first sensor and second information acquired by the second sensor, whether a predetermined part of the sensing target that moves during a predetermined motion is being sensed, and determines whether the predetermined part sensed by the first sensor and the predetermined part sensed by the second sensor are the same; and
a control unit that, when it is determined that the predetermined part sensed by the first sensor and the predetermined part sensed by the second sensor are the same, operates the movement mechanism or the drive mechanism so that the predetermined part sensed by the first sensor differs from the predetermined part sensed by the second sensor,
A mobile robot characterized by the above. - A first sensor for sensing a predetermined motion of a sensing target,
a work robot that operates according to a motion instruction,
a second sensor for sensing the robot motion of the work robot, and
a management control device capable of communicating with the first sensor, the second sensor, and the work robot,
wherein the management control device comprises:
a learning unit that learns the predetermined motion by referring to first information acquired by the first sensor;
a motion information generation unit that generates motion control information giving the motion instruction to the work robot by referring to the learning result of the predetermined motion by the learning unit; and
an adjustment unit that compares the first information with second information acquired by the second sensor and adjusts the motion control information so that the robot motion of the work robot approximates the predetermined motion,
A sensing system characterized by the above. - The sensing system according to claim 13, wherein the management control device performs the acquisition of the first information by the first sensor and the acquisition of the second information by the second sensor simultaneously.
- The sensing system according to claim 13, wherein the management control device performs the acquisition of the first information by the first sensor and the acquisition of the second information by the second sensor separately.
- The management control device further comprises a storage unit that stores the first information, the second information, and work manual information related to the predetermined motion, and
the motion information generation unit generates the motion control information by referring to the learning result of the predetermined motion by the learning unit and the work manual information, The sensing system according to any one of claims 13 to 15. - The sensing system according to claim 16, wherein the motion information generation unit does not adopt, when generating the motion control information, a learning result of the predetermined motion by the learning unit that conflicts with the work manual information.
- The first sensor or the second sensor is arranged on a mobile robot comprising a movement mechanism, and
the mobile robot is capable of communicating with the management control device,
The sensing system according to claim 13. - A first sensor for sensing a predetermined motion of a sensing target,
a work robot that operates according to a motion instruction,
a second sensor for sensing the robot motion of the work robot, and
a management control device capable of communicating with the first sensor, the second sensor, and the work robot,
wherein the management control device:
learns the predetermined motion by referring to first information acquired by the first sensor,
generates motion control information giving the motion instruction to the work robot by referring to the learning result of the predetermined motion, and
compares the first information with second information acquired by the second sensor and adjusts the motion control information so that the robot motion of the work robot approximates the predetermined motion,
A sensing method characterized by the above. - A work robot that operates according to a motion instruction, comprising:
a first sensor for sensing a predetermined motion of a sensing target;
a second sensor for sensing a robot motion of the work robot; and
an information processing unit capable of communicating with the first sensor and the second sensor,
wherein the information processing unit includes:
a learning unit that learns the predetermined motion with reference to first information acquired by the first sensor;
a motion information generation unit that generates, with reference to a learning result of the predetermined motion by the learning unit, motion control information that gives the motion instruction to the work robot; and
an adjustment unit that compares the first information with second information acquired by the second sensor and adjusts the motion control information so that the robot motion of the work robot approximates the predetermined motion,
the above characterizing a work robot. - A work robot;
a sensor; and
a management control device capable of communicating with the work robot and the sensor,
wherein the management control device includes:
a learning unit that learns a standard motion model corresponding to a predetermined motion of a sensing target, based on sensing information corresponding to the predetermined motion of the sensing target acquired using the sensor;
a model generation unit that generates, with reference to the standard motion model, a modified motion model in which an execution time of each motion in the standard motion model is set shorter than a time that the motion required when the standard motion model was generated; and
a control unit that operates the work robot with reference to the modified motion model,
the above characterizing an operation modification system. - The management control device further includes a storage unit that stores work manual information or process chart information of the sensing target,
and the learning unit generates the standard motion model with reference to the sensing information and to the work manual information or process chart information of the sensing target,
in the operation modification system according to claim 21. - A work robot;
a plurality of sensors for sensing a plurality of different sensing targets, respectively; and
a management control device capable of communicating with the work robot and the plurality of sensors,
wherein the management control device includes:
a learning unit that learns, based on a plurality of pieces of sensing information corresponding to predetermined motions of the plurality of sensing targets acquired using the plurality of sensors, the respective predetermined motions of the plurality of sensing targets and a plurality of standard motion models corresponding to those predetermined motions;
a model generation unit that generates, with reference to the plurality of standard motion models, a modified motion model that integrates at least part of the predetermined motions of the plurality of sensing targets; and
a control unit that operates the work robot with reference to the modified motion model,
the above characterizing an operation modification system. - A plurality of the work robots are provided,
and the control unit operates the plurality of work robots,
in the operation modification system according to any one of claims 21 to 23. - Learning, based on sensing information corresponding to a predetermined motion of a sensing target acquired using a sensor, a standard motion model corresponding to the predetermined motion of the sensing target;
generating, with reference to the standard motion model, a modified motion model in which an execution time of each motion in the standard motion model is set shorter than a time that the motion required when the standard motion model was generated; and
operating a work robot with reference to the modified motion model,
the above characterizing an operation modification method. - A work robot, comprising:
a drive mechanism for operating the work robot;
a learning unit that learns a standard motion model corresponding to a predetermined motion of a sensing target, based on sensing information corresponding to the predetermined motion of the sensing target acquired using a sensor;
a model generation unit that generates, with reference to the standard motion model, a modified motion model in which an execution time of each motion in the standard motion model is set shorter than a time that the motion required when the standard motion model was generated; and
a control unit that controls the drive mechanism with reference to the modified motion model,
the above characterizing a work robot. - A work reproduction robot;
a sensor capable of sensing a motion of the work reproduction robot; and
a management control device capable of communicating with the work reproduction robot and the sensor,
wherein the management control device includes:
a learning unit that learns, based on first sensing information corresponding to a predetermined motion of a sensing target, a standard motion model corresponding to the predetermined motion of the sensing target;
a control unit that causes the work reproduction robot to perform a reproduction motion one or more times with reference to the standard motion model;
an input unit for inputting accident or malfunction information; and
a detection unit that detects occurrence of the accident or malfunction based on second sensing information, acquired using the sensor, corresponding to the reproduction motion of the work reproduction robot,
the above characterizing a work reproduction system. - A work reproduction robot;
a sensor capable of sensing a motion of the work reproduction robot; and
a management control device capable of communicating with the work reproduction robot and the sensor,
wherein the management control device includes:
a learning unit that learns, based on first sensing information corresponding to a predetermined motion of a sensing target, a standard motion model corresponding to the predetermined motion of the sensing target;
a control unit that causes the work reproduction robot to perform a reproduction motion one or more times with reference to the standard motion model;
a storage unit that stores work manual information or process chart information of the sensing target; and
a detection unit that detects, based on second sensing information acquired using the sensor and corresponding to the reproduction motion of the work reproduction robot, occurrence of a motion that differs from the work manual information or the process chart information,
the above characterizing a work reproduction system. - A work reproduction robot;
a sensor capable of sensing a motion of the work reproduction robot; and
a management control device capable of communicating with the work reproduction robot and the sensor,
wherein the management control device:
learns, based on first sensing information corresponding to a predetermined motion of a sensing target, a standard motion model corresponding to the predetermined motion of the sensing target,
causes the work reproduction robot to perform a reproduction motion one or more times with reference to the standard motion model,
receives input of accident or malfunction information, and
detects occurrence of the accident or malfunction based on second sensing information, acquired using the sensor, corresponding to the reproduction motion of the work reproduction robot,
the above characterizing a work reproduction method. - A work reproduction robot;
a sensor capable of sensing a motion of the work reproduction robot; and
a management control device capable of communicating with the work reproduction robot and the sensor,
wherein the management control device:
learns, based on first sensing information corresponding to a predetermined motion of a sensing target, a standard motion model corresponding to the predetermined motion of the sensing target,
causes the work reproduction robot to perform a reproduction motion one or more times with reference to the standard motion model,
stores work manual information or process chart information of the sensing target, and
detects, based on second sensing information acquired using the sensor and corresponding to the reproduction motion of the work reproduction robot, occurrence of a motion that differs from the work manual information or the process chart information,
the above characterizing a work reproduction method. - A work reproduction robot, comprising:
an information processing device capable of communicating with a sensor capable of sensing a motion of the work reproduction robot,
wherein the information processing device includes:
a learning unit that learns, based on first sensing information corresponding to a predetermined motion of a sensing target, a standard motion model corresponding to the predetermined motion of the sensing target;
a control unit that causes the work reproduction robot to perform a reproduction motion one or more times with reference to the standard motion model;
an input unit for inputting accident or malfunction information; and
a detection unit that detects occurrence of the accident or malfunction based on second sensing information, acquired using the external sensor, corresponding to the reproduction motion of the work reproduction robot,
the above characterizing a work reproduction robot. - A work reproduction robot, comprising:
an information processing device capable of communicating with a sensor capable of sensing a motion of the work reproduction robot,
wherein the information processing device includes:
a learning unit that learns, based on first sensing information corresponding to a predetermined motion of a sensing target, a standard motion model corresponding to the predetermined motion of the sensing target;
a control unit that causes the work reproduction robot to perform a reproduction motion one or more times with reference to the standard motion model;
a storage unit that stores work manual information or process chart information of the sensing target; and
a detection unit that detects, based on second sensing information acquired using the sensor and corresponding to the reproduction motion of the work reproduction robot, occurrence of a motion that differs from the work manual information or the process chart information,
the above characterizing a work reproduction robot. - A work reproduction robot;
a sensor capable of sensing a motion of a new worker; and
a management control device capable of communicating with the work reproduction robot and the sensor,
wherein the management control device includes:
a storage unit that stores a standard motion model learned based on first sensing information corresponding to a predetermined motion of a skilled worker;
a control unit that causes the work reproduction robot to perform a reproduction motion with reference to the standard motion model; and
a detection unit that detects, based on second sensing information acquired using the sensor and corresponding to the motion of the new worker, points at which the motion of the new worker differs from the standard motion model,
the above characterizing a work mastering system. - The storage unit further stores work manual information or process chart information,
and the detection unit detects, based on the second sensing information acquired using the sensor and corresponding to the motion of the new worker, points at which the motion of the new worker differs from the work manual information or the process chart information,
in the work mastering system according to claim 33. - A work reproduction robot;
a sensor capable of sensing a motion of a new worker; and
a management control device capable of communicating with the work reproduction robot and the sensor,
wherein the management control device:
stores a standard motion model learned based on first sensing information corresponding to a predetermined motion of a skilled worker,
causes the work reproduction robot to perform a reproduction motion with reference to the standard motion model, and
detects, based on second sensing information acquired using the sensor and corresponding to the motion of the new worker, points at which the motion of the new worker differs from the standard motion model,
the above characterizing a work mastering method. - A work reproduction robot, comprising:
an information processing device capable of communicating with a sensor capable of sensing a motion of a new worker,
wherein the information processing device includes:
a storage unit that stores a standard motion model learned based on first sensing information corresponding to a predetermined motion of a skilled worker;
a control unit that causes the work reproduction robot to perform a reproduction motion with reference to the standard motion model; and
a detection unit that detects, based on second sensing information acquired using the sensor and corresponding to the motion of the new worker, points at which the motion of the new worker differs from the standard motion model,
the above characterizing a work reproduction robot.
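The operation-modification and work-mastering claims above both reduce to simple transformations of a learned standard motion model: set each motion's execution time shorter than the time observed when the model was learned, and flag the points where a new worker's sensed motion departs from the model. The sketch below illustrates both under assumed data formats (a list-of-dict model and array-valued traces); none of these names come from the publication.

```python
import numpy as np

def make_modified_model(standard_model, speedup=0.8):
    """Modified motion model: each motion's execution time is set shorter
    than the time the motion took when the standard model was generated."""
    assert 0.0 < speedup < 1.0
    return [{"motion": m["motion"], "duration": m["duration"] * speedup}
            for m in standard_model]

def deviation_points(worker_trace, standard_trace, threshold=0.1):
    """Sample indices where a new worker's sensed motion differs from the
    standard motion model by more than `threshold`."""
    diff = np.abs(np.asarray(worker_trace) - np.asarray(standard_trace))
    return np.flatnonzero(diff > threshold).tolist()
```

A uniform speedup factor is only one choice; per-motion factors, or limits derived from work manual information, could be substituted without changing the structure.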
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202380073373.9A CN120076904A (zh) | 2022-10-18 | 2023-10-17 | Work robot adjustment method, sensing system, sensing method, mobile robot, operation modification system, operation modification method, work robot, work reproduction system, work reproduction method, work mastering system, work mastering method, and work reproducing robot |
| EP23879802.9A EP4606530A1 (en) | 2022-10-18 | 2023-10-17 | Work robot adjustment method, sensing system, sensing method, mobile robot, operation modification system, operation modification method, work robot, work reproduction system, work reproduction method, work mastering system, work mastering method, and work reproducing robot |
Applications Claiming Priority (12)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2022167221A JP2024059511A (ja) | 2022-10-18 | 2022-10-18 | Work robot adjustment method |
| JP2022-167221 | 2022-10-18 | ||
| JP2022-178895 | 2022-11-08 | ||
| JP2022178895A JP2024068441A (ja) | 2022-11-08 | 2022-11-08 | Sensing system, sensing method, and mobile robot |
| JP2022180376A JP2024070031A (ja) | 2022-11-10 | 2022-11-10 | Sensing system, sensing method, and work robot |
| JP2022-180376 | 2022-11-10 | ||
| JP2022182641A JP2024072051A (ja) | 2022-11-15 | 2022-11-15 | Operation modification system, operation modification method, and work robot |
| JP2022-182641 | 2022-11-15 | ||
| JP2022184345A JP2024073243A (ja) | 2022-11-17 | 2022-11-17 | Work reproduction system, work reproduction method, and work reproduction robot |
| JP2022-184345 | 2022-11-17 | ||
| JP2022-184856 | 2022-11-18 | ||
| JP2022184856A JP2024073888A (ja) | 2022-11-18 | 2022-11-18 | Work mastering system, work mastering method, and work reproduction robot |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2024085159A1 true WO2024085159A1 (ja) | 2024-04-25 |
Family
ID=90737884
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2023/037595 Ceased WO2024085159A1 (ja) | 2022-10-18 | 2023-10-17 | Work robot adjustment method, sensing system, sensing method, mobile robot, operation modification system, operation modification method, work robot, work reproduction system, work reproduction method, work mastering system, work mastering method, and work reproducing robot |
Country Status (3)
| Country | Link |
|---|---|
| EP (1) | EP4606530A1 (ja) |
| CN (1) | CN120076904A (ja) |
| WO (1) | WO2024085159A1 (ja) |
Citations (14)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2007226515A (ja) | 2006-02-23 | 2007-09-06 | Mhi Environment Engineering Co Ltd | 安全教育システム及び安全教育方法 |
| JP2009196040A (ja) * | 2008-02-21 | 2009-09-03 | Panasonic Corp | ロボットシステム |
| JP2020086697A (ja) | 2018-11-20 | 2020-06-04 | 株式会社日立製作所 | 作業習熟支援装置、及び作業習熟支援システム |
| JP2021053708A (ja) * | 2019-09-26 | 2021-04-08 | ファナック株式会社 | 作業員の作業を補助するロボットシステム、制御方法、機械学習装置、及び機械学習方法 |
| JP2021130156A (ja) | 2020-02-19 | 2021-09-09 | 株式会社東芝 | 作業ロボットの操作支援システム及び操作支援方法 |
| JP2022042867A (ja) | 2020-09-03 | 2022-03-15 | 倉敷紡績株式会社 | ロボット制御方法 |
| JP2022113042A (ja) | 2021-01-22 | 2022-08-03 | オムロン株式会社 | 作業推定装置、作業推定装置の制御方法、情報処理プログラム、および記録媒体 |
| JP2022128114A (ja) * | 2021-02-22 | 2022-09-01 | 川崎重工業株式会社 | メンテナンス支援システム |
| JP2022167221A (ja) | 2021-04-22 | 2022-11-04 | 進也 樋口 | 高度処理装置 |
| JP2022178895A (ja) | 2021-05-21 | 2022-12-02 | 国立大学法人 宮崎大学 | 前回り受身習得用具 |
| JP2022180376A (ja) | 2016-09-14 | 2022-12-06 | ソルヴェイ(ソシエテ アノニム) | 6員環環状サルフェートを含有する電解質 |
| JP2022182641A (ja) | 2021-05-28 | 2022-12-08 | 酒井重工業株式会社 | 建設車両のブレーキ機構 |
| JP2022184345A (ja) | 2021-06-01 | 2022-12-13 | セイコーエプソン株式会社 | 印刷装置 |
| JP2022184856A (ja) | 2020-08-19 | 2022-12-13 | Dic株式会社 | 硬化性樹脂、硬化性樹脂組成物、及び、硬化物 |
2023
- 2023-10-17 WO PCT/JP2023/037595 patent/WO2024085159A1/ja not_active Ceased
- 2023-10-17 EP EP23879802.9A patent/EP4606530A1/en active Pending
- 2023-10-17 CN CN202380073373.9A patent/CN120076904A/zh active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| EP4606530A1 (en) | 2025-08-27 |
| CN120076904A (zh) | 2025-05-30 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| KR102139513B1 (ko) | Artificial-intelligence VILS-based autonomous driving control apparatus and method | |
| JP6915605B2 (ja) | Image generation device, robot training system, image generation method, and image generation program | |
| CN103105193A (zh) | Precision machine vision inspection system and method for operating the same | |
| US9701019B2 (en) | System and method for the automatic generation of robot programs | |
| JP5760035B2 (ja) | Machine vision apparatus and method with motion prediction control | |
| CN111090255A (zh) | Programmable logic controller and main unit | |
| JP2020013527A (ja) | Programmable logic controller and program creation support device | |
| CN101034418A (zh) | Device, program, recording medium, and method for robot simulation | |
| CN101973032A (zh) | Offline programming system and method for a line-structured-light vision sensor of a welding robot | |
| US11360456B2 (en) | Apparatus and method for identifying differences between a real installation and a digital twin of the installation | |
| CN112139683B (zh) | Evaluation device, evaluation method, evaluation system, and recording medium | |
| US20060111813A1 (en) | Automated manufacturing system | |
| US20190289196A1 (en) | Image inspection device, image inspection method and computer readable recording medium | |
| CN115038554A (zh) | Sensor-based construction of complex scenes for autonomous machines | |
| Miura et al. | Autoware toolbox: Matlab/simulink benchmark suite for ros-based self-driving software platform | |
| CN109318228A (zh) | Desktop-level six-degree-of-freedom robotic arm rapid control prototyping experiment system | |
| EP3670108A1 (en) | Robot teaching programming method, apparatus and system, and computer-readable medium | |
| KR20190003983A (ko) | Determination of automatic bonding sequences for optical bonding | |
| KR20210045811A (ko) | System modeling method and system interworking method for a segment-based cyber-physical system | |
| CN115469564A (zh) | Automatic parking test system and method for a vehicle, vehicle, and storage medium | |
| WO2024085159A1 (ja) | Work robot adjustment method, sensing system, sensing method, mobile robot, operation modification system, operation modification method, work robot, work reproduction system, work reproduction method, work mastering system, work mastering method, and work reproducing robot | |
| CN112613469B (zh) | Motion control method for a target object and related device | |
| KR102528433B1 (ko) | Apparatus for manufacturing curved members using a gantry-mounted camera | |
| JP7179238B1 (ja) | Display data generation program, display data generation device, and display data generation method | |
| US4988200A (en) | Apparatus for automatic tracking and contour measurement | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 23879802 Country of ref document: EP Kind code of ref document: A1 |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202380073373.9 Country of ref document: CN |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 202517046501 Country of ref document: IN |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 2023879802 Country of ref document: EP |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| ENP | Entry into the national phase |
Ref document number: 2023879802 Country of ref document: EP Effective date: 20250519 |
|
| WWP | Wipo information: published in national office |
Ref document number: 202380073373.9 Country of ref document: CN |
|
| WWP | Wipo information: published in national office |
Ref document number: 202517046501 Country of ref document: IN |
|
| WWP | Wipo information: published in national office |
Ref document number: 2023879802 Country of ref document: EP |