
WO2019064919A1 - Robot teaching device - Google Patents

Robot teaching device

Info

Publication number: WO2019064919A1
Authority: WO (WIPO (PCT))
Prior art keywords: robot, task, unit, program, point
Legal status: Ceased
Application number: PCT/JP2018/028996
Other languages: English (en), Japanese (ja)
Inventors: 吉田 昌弘, 常田 晴弘, ナット タン ドアン, 小菅 昌克
Current Assignee: Nidec Corp
Original Assignee: Nidec Corp
Application filed by Nidec Corp
Priority to: JP2019544371A (JPWO2019064919A1)
Publication of: WO2019064919A1

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B19/00 - Programme-control systems
    • G05B19/02 - Programme-control systems electric
    • G05B19/42 - Recording and playback systems, i.e. in which the programme is recorded from a cycle of operations, e.g. the cycle of operations being manually controlled, after which this record is played back on the same machine

Definitions

  • The present invention relates to a robot teaching device.
  • Patent No. 4312481 (gazette)
  • When the waiting time caused by an interlock is too long, production efficiency on the factory line where the robot is installed decreases, so it is preferable to set a timeout value for the waiting time. The timeout value should be an appropriate value for each work element, such as gripping and placing, included in the series of operations performed by the robot. In conventional off-line teaching, however, the operator cannot easily set the timeout value of the interlock waiting time for each work element.
  • A first exemplary invention of the present application includes a task creation unit configured to create, based on an operation input of an operator, a task which is information on a work element of a robot with respect to an object, and to set in the task information on at least one of a first point, which is a passing point before reaching a target point of the work element, and a second point, which is a passing point after reaching the target point; and an interlock setting unit configured to set, based on an operation input of the operator, a timeout value of a waiting time for which the operation of the robot waits at at least one of the set first point and second point.
  • According to this, when performing off-line teaching of a robot, the operator can easily set the timeout value of the interlock waiting time for each work element.
  • FIG. 1 is a view showing an entire configuration of a robot teaching device according to an embodiment.
  • FIG. 2 is a diagram showing a hardware configuration of each device included in the robot teaching device of the embodiment.
  • FIG. 3 is a view showing a three-dimensional CAD window according to an example of the embodiment.
  • FIG. 4 is a view showing a teaching window according to an example of the embodiment.
  • FIG. 5 is a diagram for conceptually explaining a task.
  • FIG. 6 is a view showing transition of a teaching window according to an example of the embodiment.
  • FIG. 7 is a diagram showing a setting example of the TCP of the object.
  • FIG. 8 is a diagram showing an operation example of the hand of the robot in the virtual space.
  • FIG. 9 is a diagram for explaining how an operator sets TCP.
  • FIG. 10 is a diagram showing an example of the task list.
  • FIG. 11 is a view showing a display example of an error task in the task list of FIG.
  • FIG. 12 is a diagram showing transition of a teaching window according to an example of the embodiment.
  • FIG. 13 is a diagram showing a display example of a hierarchical list in the embodiment.
  • FIG. 14 is a functional block diagram of the robot teaching device according to the embodiment.
  • FIG. 15 is an example of a sequence chart showing processing of the robot teaching device according to the embodiment.
  • FIG. 16 is an example of a sequence chart showing processing of the robot teaching device according to the embodiment.
  • FIG. 17 is a diagram showing another display example of the hierarchical list in the embodiment.
  • The robot teaching device 1 of the present embodiment provides off-line teaching, in which the operator creates teaching data for teaching the robot's operation without actually operating the robot.
  • A "work element" means the minimum unit of work performed by the robot within a series of operations, such as "take" or "place" an object.
  • An "object" means an item that is the target of the robot's work. It is not limited to items the robot grips (for example, a workpiece to be processed) and also includes items related to the robot's work (for example, a shelf on which an object to be gripped is placed).
  • FIG. 1 is a view showing an overall configuration of a robot teaching device 1 of the present embodiment.
  • FIG. 2 is a diagram showing a hardware configuration of each device included in the robot teaching device 1 of the present embodiment.
  • the robot teaching device 1 includes an information processing device 2 and a robot control device 3.
  • the information processing device 2 and the robot control device 3 are communicably connected by, for example, a communication network cable EC.
  • the information processing apparatus 2 is an apparatus for teaching an operation to a robot installed in a factory line.
  • the information processing device 2 is provided to perform off-line teaching by the operator, and is disposed, for example, at a position away from the factory where the robot is installed (for example, a work place of the operator).
  • the robot control device 3 executes a robot program transmitted from the information processing device 2.
  • In the present embodiment, the robot control device 3 is not connected to a robot; when connected to a robot, however, it can send the robot control signals corresponding to the execution results of the robot program and thereby operate it. Therefore, the robot control device 3 is preferably placed in the vicinity of the actual robot.
  • the information processing device 2 includes a control unit 21, a storage 22, an input device 23, a display device 24, and a communication interface unit 25.
  • the control unit 21 includes a central processing unit (CPU), a read only memory (ROM), and a random access memory (RAM).
  • the ROM stores a three-dimensional CAD application program and teaching software.
  • The CPU loads the three-dimensional CAD application software (hereinafter referred to as "CAD software" where appropriate) and the teaching software from the ROM and executes them in the RAM.
  • the teaching software and the CAD software execute processing in cooperation via an API (Application Program Interface).
  • To reproduce moving images with the CAD software, the control unit 21 displays images continuously, frame by frame.
  • The storage 22 is a large-capacity storage device such as a hard disk drive (HDD) or a solid state drive (SSD), and is configured to be accessible by the CPU of the control unit 21 as needed.
  • the storage 22 stores data of a three-dimensional model that is referred to when executing CAD software.
  • the storage 22 stores data of a three-dimensional model of a robot and an object (for example, a pen, a cap, a product, a pen tray, a cap tray, a product tray described later).
  • the storage 22 stores execution log data acquired from the robot control device 3.
  • the execution log data includes robot program execution results and robot state data to be described later.
  • The robot state data is used by the three-dimensional CAD software to reproduce the motion of the robot in the virtual space as an animation.
  • the input device 23 is a device for receiving an operation input by an operator, and includes a pointing device.
  • the display device 24 is a device for displaying execution results of teaching software and CAD software, and includes a display drive circuit and a display panel.
  • the communication interface unit 25 includes a communication circuit for performing network communication with the robot control device 3.
  • the robot control device 3 includes a control unit 31, a storage 32, and a communication interface unit 33.
  • the control unit 31 includes a CPU, a ROM, a RAM, and a control circuit.
  • the control unit 31 executes a robot program received from the information processing device 2 and outputs execution log data.
  • the execution log data includes the execution result of the robot program and robot state data of the robot that executes the work described in the robot program.
  • The robot state data is data on physical quantities that indicate the state of the robot over time. Examples of such physical quantities include the joint angles of the robot's arm and the velocity and acceleration of the arm.
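  • As an illustration only, the robot state data could be represented as a time series of samples. The following is a minimal sketch; the class and field names are assumptions, not taken from this publication:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class RobotStateSample:
    """State of the robot at one instant (field names are illustrative)."""
    time_s: float                  # elapsed time since the program started
    joint_angles_rad: List[float]  # one angle per joint of the arm
    velocity: float                # velocity of the arm
    acceleration: float            # acceleration of the arm

@dataclass
class ExecutionLog:
    """Execution log data: program result plus the state time series."""
    result: str                     # e.g. "success" or an error cause
    states: List[RobotStateSample]  # used to animate the robot in the CAD view
```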
  • The storage 32 is a mass storage device such as an HDD or an SSD, and is configured to be accessible by the CPU of the control unit 31 as needed.
  • the storage 32 stores robot programs and execution log data.
  • the communication interface unit 33 includes a communication circuit for performing network communication with the information processing device 2.
  • the information processing device 2 provides off-line teaching for the robot.
  • the teaching software and the CAD software are cooperatively executed to provide the useful user interface described below.
  • the information processing apparatus 2 is made to execute CAD software and teaching software.
  • the execution result of CAD software is displayed in a CAD window, and the execution result of teaching software is displayed in a teaching window.
  • The operator has the information processing device 2 display the CAD window and the teaching window either simultaneously or while switching between them, and performs teaching-related operations.
  • FIG. 3 shows a CAD window W1 according to an example of the present embodiment.
  • In FIG. 3, an image (hereinafter referred to as a "CAD image" where appropriate) is displayed in which the robot R, the pen tray 11, the cap tray 12, the jig 13, and the product tray 14 are arranged on the table T in the virtual space.
  • the robot R performs a series of operations for assembling a product (a finished product with the cap fitted on the pen) by fitting the cap on the pen.
  • a pen group P composed of a plurality of pens is disposed on the pen tray 11, and a cap group C composed of a plurality of caps is disposed on the cap tray 12.
  • the jig 13 is a member for the robot R to temporarily arrange a pen and fit a cap.
  • the product tray 14 is a member for placing a product.
  • Each pen in the pen group P, each cap in the cap group C, the pen tray 11, the cap tray 12, the jig 13, and the product tray 14 are work targets of the robot R and are examples of objects. A product in which a cap is fitted on a pen is also an example of an object.
  • FIG. 4 shows a teaching window W2 according to an example of the present embodiment. Displayed in the teaching window W2 are a robot R included in the CAD image of FIG. 3 and a hierarchical list (an example of hierarchical data) indicating the hierarchical relationship of objects.
  • the teaching software can create hierarchical lists in conjunction with CAD software.
  • a tree structured data format is prepared by the teaching software as a hierarchical list.
  • With the CAD window and the teaching window displayed, the operator selects the robot R or an object in the CAD image and drags it to the desired node of the tree-structured data format.
  • the hierarchical list can be completed by sequentially performing this operation on all objects required to teach the robot R.
  • For the name of each node displayed in the hierarchical list, the name of the original three-dimensional model may be applied as-is, or the name may be changed later.
  • Although FIG. 3 illustrates a CAD image with a single robot, when two or more robots exist, they can all be registered in the hierarchical list.
  • Nodes 61 to 63 represent the following contents, and a TCP (Tool Center Point), described later, is set for each.
  • Because the hand sequence depends on the object held by the hand 52 of the robot R, it is set for each object held by the hand 52. If a task (described later) is created while no hand sequence is set, a program based on the task cannot be executed, so a warning may be output to the display device 24.
  • The hierarchical data is displayed divided into a robot area RA (an example of a first area), which hierarchically displays the components of the robot R (Robot_R and Hand in FIG. 4), and an object area PA (an example of a second area), which hierarchically displays the components of the objects.
  • The components of the objects include the jig (JIG), the pen tray (PenTray), the cap tray (CapTray), the product tray (ProductTray), the pens (Pen1, Pen2, ..., Pen12), the caps (Cap1, Cap2, ..., Cap12), and the products (PenProduct1, PenProduct2, ..., PenProduct12).
  • Below the node of the jig (JIG), for example, the following nodes are provided corresponding to operations on the jig.
  • Node 64 (PenProduct): jig in a state of holding a product (PenProduct)
  • Node 65 (PenJ): jig in a state of holding a pen (Pen)
  • Node 66 (CapJ): jig in a state of holding a cap (Cap)
  • Below the node of the pen tray (PenTray), nodes corresponding to the pens Pen1, Pen2, ..., Pen12 are provided. Below the node of the cap tray (CapTray), nodes corresponding to the caps Cap1, Cap2, ..., Cap12 are provided. Below the node of the product tray (ProductTray), nodes corresponding to the products PenProduct1, PenProduct2, ..., PenProduct12 are provided.
  • The robot R (Robot_R in FIG. 4) and the objects in the hierarchical list (the jig (JIG), the pen tray (PenTray), the cap tray (CapTray), the product tray (ProductTray), the pens Pen1, Pen2, ..., Pen12, the caps Cap1, Cap2, ..., Cap12, and the products PenProduct1, PenProduct2, ..., PenProduct12) are associated with the data of their corresponding three-dimensional models. Therefore, even if the three-dimensional model of the robot R or of an object in the hierarchical list is changed after the hierarchical list is created, it is not necessary to register it in the hierarchical list again.
  • (2-2) Grouping: Two or more objects registered in the hierarchical list can be grouped.
  • The grouping procedure is as follows. For example, in FIG. 4, to group the pens Pen1 and Pen2 included in the pen tray (PenTray), the operator specifies the nodes corresponding to Pen1 and Pen2 in the hierarchical list with the pointing device, right-clicks, and selects "create group". Performing this operation groups the pens Pen1 and Pen2.
  • So that the operator can recognize that two or more objects in the hierarchical list are grouped, it is preferable to apply display processing such as surrounding them with a rectangular frame as shown in FIG. 4, using a common font, or giving them the same color.
  • The example shown in FIG. 4 groups objects corresponding to two consecutive nodes; when objects corresponding to two non-consecutive nodes are grouped, the hierarchical list may be updated so that the two nodes become consecutive. For example, when the pens Pen1 and Pen3 are grouped, the order of the nodes below the pen tray (PenTray) node is updated to Pen1, Pen3, Pen2, and so on.
  • FIG. 5 is a diagram for conceptually explaining a task.
  • a task is information on a work element which is a minimum unit of work performed by a robot in a series of work.
  • a series of work (hereinafter referred to as “job”) performed by the robot R is configured by a plurality of work elements and movement between the work elements. Therefore, in the present embodiment, a plurality of tasks are defined for the job performed by the robot R.
  • a line with an arrow conceptually indicates the trajectory of the hand 52 of the robot R.
  • The trajectory includes approach points AP1 and AP2 (each an example of the first point), which are passing points before reaching the target point TP of the work element; the target point TP itself; and departure points DP1 and DP2 (each an example of the second point), which are passing points after reaching the target point TP.
  • the target point TP indicates the position of an object that is the target of the work element, and may be the TCP of the object described later.
  • the movement of the robot R before reaching the approach point AP1 corresponds to the movement between work elements (that is, the movement between the previous work element and the work elements shown in FIG. 5).
  • the motion of the robot R from the approach point AP1 to the departure point DP2 corresponds to one work element and one task.
  • the movement of the robot R after the departure point DP2 corresponds to the movement between work elements (that is, the movement between the work element shown in FIG. 5 and the next work element). That is, the task may include, in addition to the information on the work element, information on at least one of the target point TP, the approach point AP, and the departure point DP.
  • FIG. 5 exemplifies the case where the interlock is set at the approach points AP1 and AP2.
  • An interlock is a process that causes the robot R to stand by at the target point TP, the approach point AP, and/or the departure point DP based on an input signal, in order to avoid interference with other robots and the like.
  • the task may include information on the interlock setting including the point of setting the interlock and the timeout value of the waiting time due to the interlock.
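  • Putting this description together, a task bundles the work element, its target point, the optional approach and departure points, and the interlock settings. The following is a minimal sketch of such a record; the field names are assumptions for illustration:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) in world coordinates

@dataclass
class InterlockSetting:
    point: Point      # point at which the robot waits for the input signal
    timeout_s: float  # timeout value of the waiting time

@dataclass
class Task:
    name: str            # e.g. "Pickup_Pen1_From_PenTray"
    function: str        # type of work element, e.g. "Pick up"
    target: str          # name of the target object, e.g. "Pen1"
    target_point: Point  # TP: position of the object (its TCP)
    approach_points: List[Point] = field(default_factory=list)   # AP1, AP2, ...
    departure_points: List[Point] = field(default_factory=list)  # DP1, DP2, ...
    interlocks: List[InterlockSetting] = field(default_factory=list)
```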
  • FIG. 6 is a view showing transition of a window for teaching according to an example of the present embodiment.
  • First, the operator points at one of the nodes 61 to 63 of the hand of the robot R (Robot_R) included in the robot area RA.
  • a teaching window W3 for task creation in FIG. 6 is displayed.
  • the teaching window W3 is a screen for performing detailed setting of tasks, and displays items of task name (Name), type of work element (Function), and target of work element (Target).
  • For the type of work element (Function) item, any work element can be selected from a pull-down menu of preset candidates of a plurality of types of work elements (for example, Pick up, Place, and so on).
  • For the target (Target) item, the operator selects the object to be worked on from the object area PA of the hierarchical list with the pointing device and left-clicks the Target item in the teaching window W3, whereupon the target object is input. The name (Name) of the task is then automatically determined and displayed based on the input data.
  • For example, the operator points at node 61 in the robot area RA, right-clicks with the pointing device, and selects "Create task".
  • The teaching window W3 shown in FIG. 6 is then displayed.
  • "PenTray" is input in the target (Target) item of the teaching window W3, and "Pick up" is selected from the candidates of the plurality of types of work elements.
  • As a result, a task named "Pickup_Pen1_From_PenTray" is created as information on the work element "take pen Pen1 from the pen tray".
  • In this manner, the operator can intuitively create a task by designating, on the hierarchical list, the target of the work element (here, the pen Pen1) and the start point (for example, the pen tray) or end point of the work element.
  • Because the task name is automatically created to include the work content of the work element (for example, "Pickup"), the work target (for example, the pen Pen1), and the object that is the start point (for example, the pen tray) or end point of the work element, the contents of the task can be immediately understood from its name.
  • tasks may be created collectively for two or more objects grouped in the hierarchical list.
  • TCP (Tool Center Point; an example of a reference point)
  • The TCP means a reference point (or work point) of an object in the virtual space that serves as a reference for the robot R's work on the object; it is set for the hand 52 of the robot R and for each object.
  • When a TCP is set, a local coordinate system with the TCP as its origin is also set.
  • Setting a local coordinate system for the object has advantages: when rotating the object by Euler angles, it is sufficient to rotate it around the TCP, and, as described later, the approach point AP and the departure point DP can be set automatically.
  • FIG. 7 shows an example of TCP setting of the cap Cap1 as an example.
  • the TCP is automatically set to the barycentric position of the object. Since the data of the three-dimensional model is associated with the object, the position of the center of gravity can be obtained relatively easily from the data of the three-dimensional model.
  • The approach point AP and the departure point DP are each automatically set, when a task is created, at a position 1.5 times the total length of the object (the cap Cap1 in the example of FIG. 7) along the Z axis of the local coordinate system set for the object.
  • the position of 1.5 times the total length is merely an example, and may be appropriately set, for example, to any value between 1 and 2 times the total length.
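  • As a sketch of this automatic setting, the approach point and departure point can be derived by offsetting the object's TCP along its local Z axis by a multiple of the object's total length. The function name and the use of a unit Z-axis vector are assumptions:

```python
def auto_approach_departure(tcp, z_axis, total_length, factor=1.5):
    """Offset an object's TCP along its local Z axis to get the AP/DP.

    tcp          -- (x, y, z) of the object's TCP in world coordinates
    z_axis       -- unit vector of the object's local Z axis, in world coordinates
    total_length -- total length of the object
    factor       -- multiple of the total length (1.5 by default; any value
                    between 1 and 2 may be appropriate, per the text above)
    """
    offset = factor * total_length
    point = tuple(c + offset * a for c, a in zip(tcp, z_axis))
    return point, point  # approach point and departure point coincide here

# Example: a cap whose local Z axis points straight up in world coordinates.
ap, dp = auto_approach_departure(tcp=(100.0, 100.0, 0.0),
                                 z_axis=(0.0, 0.0, 1.0),
                                 total_length=10.0)
# ap == dp == (100.0, 100.0, 15.0)
```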
  • For grouped objects, the approach points AP and departure points DP may be set automatically. For example, assume that the pens Pen1 and Pen2 are grouped, that the coordinates of the TCP serving as the work reference for the pen Pen1 are (100, 100, 0) in the world coordinate system, and that the coordinates of the approach point AP and the departure point DP of the pen Pen1 are set to (100, 100, 20) in the world coordinate system.
  • In this case, the coordinates of the approach point AP and the departure point DP of the pen Pen2 are automatically set to (120, 100, 20) in the world coordinate system. That is, the approach point AP and the departure point DP of the pen Pen2 are set so as to reflect the TCP offset between the pen Pen1 and the pen Pen2 (+20 in the X coordinate). In other words, the approach point AP and the departure point DP of the pen Pen2 are set so that, in the local coordinate systems based on the TCPs of Pen1 and Pen2 respectively, the approach points AP and departure points DP of the two pens have the same coordinates.
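  • The rule in this example can be expressed directly: the approach point and departure point of each grouped object are those of the reference object shifted by the difference between the two objects' TCPs. A small sketch using the coordinates above (the function name is illustrative):

```python
def offset_point(reference_point, reference_tcp, other_tcp):
    """Shift a reference AP/DP by the TCP offset between two grouped objects."""
    delta = tuple(o - r for o, r in zip(other_tcp, reference_tcp))
    return tuple(p + d for p, d in zip(reference_point, delta))

pen1_tcp = (100.0, 100.0, 0.0)
pen1_ap_dp = (100.0, 100.0, 20.0)
pen2_tcp = (120.0, 100.0, 0.0)  # TCP offset of +20 in the X coordinate

pen2_ap_dp = offset_point(pen1_ap_dp, pen1_tcp, pen2_tcp)
# -> (120.0, 100.0, 20.0), the value set automatically in the example above
```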
  • FIG. 8 is a view showing an example of the operation of the hand 52 in the virtual space when taking the cap Cap1 from the cap tray 12.
  • The hand 52 first moves along the trajectory TR1 so that the TCP1 set for the hand 52 (that is, the TCP of the hand set at node 62 in FIG. 4) coincides with the approach point AP.
  • Next, the hand 52 moves along the trajectory TR2 so that the TCP1 set for the hand 52 coincides with the TCP2 set for the cap Cap1 in the cap tray 12.
  • The hand 52 then moves along the trajectory TR3 so that the TCP1 set for the hand 52 coincides with the departure point DP.
  • The trajectories TR2 and TR3 run along the Z axis of the local coordinate system set for the cap Cap1 in the cap tray 12. Therefore, when the hand 52 takes the cap Cap1, it does not interfere with the caps adjacent to Cap1 in the cap tray 12. Moreover, the trajectory of the hand 52 is determined based on the TCP1 set for the hand 52 and on the TCP2 of the cap Cap1 in the cap tray 12 and its local coordinate system, so the hand 52 can grip the cap Cap1 with high accuracy.
  • FIG. 9 shows a method of setting the TCP based on the operator's operation input.
  • a TCP model TCP_m having a local coordinate system LA is provided to set TCP.
  • The operator moves the TCP model TCP_m to the desired position on the object (the cap Cap1 in FIG. 9) and then drags the TCP model TCP_m to the corresponding node in the hierarchical list (for example, the Cap1 node in FIG. 9).
  • As a result, the TCP of the object (for example, the cap Cap1) can be set to the coordinates specified in the CAD software.
  • the teaching window W3 is provided with a button b1 ("detail setting").
  • the operator can select one of “approach point, departure point setting”, “motion parameter setting”, and “interlock setting” by operating the button b1 (“detail setting”).
  • With "approach point, departure point setting", the operator can change or delete the automatically created approach point and/or departure point, and add new ones.
  • Motion parameters are parameters relating to the motion of the hand 52 of the robot R between adjacent approach points AP included in the task, between the approach point AP and the target point TP, and between the target point TP and the departure point DP.
  • With "motion parameter setting", the motion parameters can be changed from their default values.
  • With "interlock setting", the timeout value of the interlock waiting time, and the operation performed when the waiting time exceeds the timeout value and an error is judged to have occurred, can be changed from their default values.
  • A button b3 ("check") is provided in the teaching window W3 of FIG. 6. Operating the button b3 ("check") is not essential when creating a task, but by operating it, the operator can confirm in advance, in the CAD window, the operation of the robot R from the approach point AP to the departure point DP included in the task.
  • The button b3 ("check") is enabled when the task name (Name) is displayed (that is, when data has already been input for the work element type (Function) and target (Target) items).
  • The operation check triggered by the button b3 ("check") is performed by transmitting the program created based on the task to the robot control device 3.
  • the robot control device 3 executes the received program, and calculates robot state data, which is information indicating the state of the robot R according to the passage of time, as the execution result.
  • the robot state data is, for example, information such as a change in joint angle of the arm 51 according to the passage of time, a trajectory of the hand 52 according to the passage of time, and the like.
  • the robot state data is returned from the robot control device 3 to the information processing device 2.
  • the CAD software operates the robot R and the three-dimensional model of the object in the virtual space based on the robot state data, and displays a moving image of the movement of the robot R corresponding to the task.
  • If an error occurs, the cause of the error may be displayed in the Status column of the teaching window W3 of FIG. 6.
  • Causes of error include, for example, speed limit violation, failure to reach the target point, and reaching a singular point. When the operation of the robot R is performed normally, information useful to the operator, such as the margin with respect to the moving speed set in the motion parameters, the margin to the target point, or the margin to a singular point, may be displayed in the Status column.
  • The teaching window W3 in FIG. 6 is provided with a button b2 ("Create").
  • When the button b2 ("Create") is operated, the task set in the teaching window W3 is registered in a task list described later.
  • If the operation check via the button b3 ("check") has not been performed before the button b2 ("Create") is operated, the operation check may be performed automatically and the presence or absence of an error notified.
  • the task-based program is an example of a robot program for causing the robot R to execute a work element corresponding to the task.
  • The program for executing the work element corresponding to the task shown in FIG. 5 is composed of the following functions, each written as a program that causes the robot R to execute a motion (a "movement of the robot R"). Note that the following move(AP1) may be defined separately as a movement between work elements.
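  • The publication's own function listing is not reproduced in this text. As an illustration only, a program for the task of FIG. 5 (two approach points, gripping at the target point, two departure points) might read as follows; move() and pick() are hypothetical function names:

```python
# Illustrative point values only; in practice these come from the task.
AP1 = (100.0, 100.0, 40.0)  # first approach point
AP2 = (100.0, 100.0, 20.0)  # second approach point (an interlock may wait here)
TP  = (100.0, 100.0, 0.0)   # target point (TCP of the object)
DP1 = (100.0, 100.0, 20.0)  # first departure point
DP2 = (100.0, 100.0, 40.0)  # second departure point

def run_task(robot):
    """Hypothetical program body for the work element of FIG. 5."""
    robot.move(AP1)  # may instead be defined as movement between work elements
    robot.move(AP2)
    robot.move(TP)   # reach the target point
    robot.pick()     # grip the object using the hand sequence
    robot.move(DP1)
    robot.move(DP2)
```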
  • the task list is information on a list of a plurality of tasks corresponding to each of a plurality of work elements included in a job performed by the robot R.
  • the tasks created for the specific job by the teaching window W3 are sequentially registered in the task list corresponding to the job.
  • the order of the plurality of tasks included in the task list indicates the execution order of the plurality of work elements respectively corresponding to the plurality of tasks.
  • the teaching window W4 of FIG. 10 displays an example of a task list corresponding to a job ("Pen Assembly") which is a series of operations of "fitting a cap on a pen and assembling a product".
  • An example of this task list includes tasks corresponding to the following six work elements (i) to (vi).
  • the teaching window W3 for creating a task shows a case where a task having a name shown in parentheses is created.
  • A task for which the result of the operation check in the teaching window W3 indicates an error (referred to as an "error task") and the tasks other than the error task in the task list may be displayed in different display modes (for example, with different character colors or different background colors).
  • The "tasks other than error tasks" include tasks that did not indicate an error as a result of the operation check and tasks that have not been checked.
  • The error tasks, the OK tasks (that is, tasks that did not indicate an error as a result of the operation check), and the tasks for which the operation check has not been performed may each be displayed in different display modes.
  • Tasks may also be displayed grouped in the task list.
  • By selecting any task in the task list with the pointing device and dragging it, the operator can arrange the selected tasks in an arbitrary order in the task list.
  • a robot program is created in the information processing device 2, and the robot program is executed in the robot control device 3.
  • the robot control device 3 calculates robot state data, which is information indicating the state of the robot R according to the passage of time, as the execution result of the robot program.
  • the robot state data is, for example, information such as a change in joint angle of the arm 51 according to the passage of time, a trajectory of the hand 52 according to the passage of time, and the like.
  • the robot state data is returned from the robot control device 3 to the information processing device 2.
  • the CAD software operates the robot R and the three-dimensional model of the object in the virtual space based on the robot state data, and displays a motion image (animation) of the motion of the robot R corresponding to the task as a simulation output.
  • If an error occurs, the cause of the error may be displayed in the status column of the teaching window W4 of FIG. 10.
  • Causes of error include, for example, speed limit violation, failure to reach the target point, and reaching a singular point.
  • An error task, whose simulation result indicates an error, and the tasks other than the error task in the task list may be displayed in different display modes (for example, with different character colors or different background colors).
  • a display example of an error task in the task list of FIG. 10 is shown in the teaching window W5 of FIG.
  • This display example shows a case where the last task in the task list, "Place_to_PenProduct1_in_ProductTray", is an error task; by adding a mark M1 (a frame surrounding the task) to the error task, its display mode is made different from that of the tasks that are not error tasks.
  • a simulation may be performed on work elements corresponding to a part of tasks selected by the operator in the task list.
  • FIG. 12 is a diagram showing transition of a window for teaching according to an example of the present embodiment.
  • FIG. 13 is a diagram showing a display example of a hierarchical list in the present embodiment.
  • In the area check, it is determined whether at least one object (in the example shown in FIG. 4, the pens Pen1, Pen2, ..., Pen12, the caps Cap1, Cap2, ..., Cap12, and the products PenProduct1, PenProduct2, ..., PenProduct12) can be reached by the hand 52 of the robot R (an example of a component of the robot).
  • When there are a plurality of robots, the operator may be able to select which robot is subjected to the area check. Since a robot can take a plurality of postures, the area check may be performed for any of the plurality of postures selected by the operator.
  • Then the teaching window W6 of FIG. 12 is displayed.
  • In the teaching window W6, postures for the area check can be selected. The area check is performed by selecting at least one posture in the teaching window W6 and operating the button b5 ("OK").
  • The execution of the area check requires inverse-kinematics calculations, which are performed in the robot control device 3. That is, the information processing device 2 passes to the robot control device 3 coordinate data based on the three-dimensional models of the robot R and each object, and the TCP coordinate data of each object.
  • the robot control device 3 can determine that the object can be reached if the joint angle of the arm 51 of the robot R can be obtained by inverse kinematics with respect to the specific object based on the information.
  • The result of the area check is displayed as illustrated in FIG. 13: in an area check for pens, for each pen in the pen tray 11, the mark M2 ("NG") is displayed on any unreachable pen (an example of an unreachable object). Another display example is to display unreachable objects in a color different from that of reachable objects.
  • FIG. 14 is a functional block diagram of the robot teaching device 1 according to the embodiment.
  • The robot teaching device 1 includes a display control unit 101, a task creation unit 102, a task update unit 103, a task replacement unit 104, a setting update unit 105, a grouping unit 106, a program creation unit 107, a program execution unit 108, a state information calculation unit 109, a simulation unit 110, a determination unit 111, a posture selection unit 112, a reference point setting unit 113, a model placement unit 114, and an interlock setting unit 115.
  • the robot teaching device 1 further includes a task database 221, a hierarchical list database 222, a three-dimensional model database 223, and an execution log database 224.
  • These functions are realized by the CPU included in the control unit 21 executing the teaching software and/or the CAD software.
  • the display control unit 101 controls the display device 24 to display the execution results of the teaching software and the CAD software.
  • the control unit 21 of the information processing device 2 generates image data including the output of teaching software and CAD software, buffers it, and transmits it to the display device 24.
  • the display device 24 drives the display drive circuit to display an image on the display panel.
  • the task creation unit 102 has a function of creating a task that is information related to work elements of the robot with respect to the object based on the operation input of the operator.
  • When the control unit 21 of the information processing device 2 receives an operation input of the operator from the input device 23, the task is created as a file including information such as the target of the work element (Target) and is recorded in the storage 22.
  • The control unit 21 determines the task name in accordance with a predetermined rule, based on the type of work element (Function) and the target of the work element (Target).
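  • As an illustration of such a rule, the names seen earlier can be assembled from the work content, the work target, and the start-point or end-point object. The function below is a guess at the pattern, not the publication's actual rule:

```python
def make_task_name(function, target, container):
    """Compose a task name from the work element type and its targets.

    The rule below is inferred from the example names in this text
    ("Pickup_Pen1_From_PenTray", "Place_to_PenProduct1_in_ProductTray");
    the actual rule is implementation-defined.
    """
    verb = function.replace(" ", "")  # "Pick up" -> "Pickup"
    if verb == "Pickup":
        return f"{verb}_{target}_From_{container}"
    if verb == "Place":
        return f"{verb}_to_{target}_in_{container}"
    return f"{verb}_{target}_{container}"

print(make_task_name("Pick up", "Pen1", "PenTray"))
# -> Pickup_Pen1_From_PenTray
```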
  • the storage 22 (an example of a first storage unit) stores a task database 221 including tasks created by the task creation unit 102.
  • the control unit 21 of the information processing device 2 sequentially creates tasks corresponding to a plurality of work elements included in a job performed by the robot, and thereby creates a plurality of tasks associated with the job as a task list.
  • each task is recorded in a state associated with a specific job.
  • the display control unit 101 refers to the task database 221 of the storage 22 and displays a task list, which is a list of a plurality of tasks, on the display device 24.
  • The names of the tasks in the displayed task list are configured so that the work content of each work element of the robot can be recognized, allowing the operator to intuitively understand the series of work contents.
  • The task creation unit 102 may have a function of setting in the task information on at least one of the target point of the work element, the approach point (an example of the first point), which is the passing point before reaching the target point, and the departure point (an example of the second point), which is the passing point after reaching the target point.
  • The control unit 21 of the information processing device 2 determines the approach point and the departure point based on the TCP and the local coordinate system of the object that is the target of the work element (Target). For example, a position at a predetermined multiple of the object's total length along the Z axis of the object's local coordinate system is used as the approach point and the departure point.
  • the control unit 21 updates the task database 221 so that the information on the determined approach point and departure point is included in the task.
  • the display control unit 101 displays hierarchical data in which the components of the robot and the components of the object are hierarchically described on the display device 24.
  • The control unit 21 of the information processing device 2 creates the hierarchical list through cooperation of the teaching software and the CAD software. That is, the control unit 21 creates the hierarchical list by receiving operation inputs in which the robot and objects in the CAD image, selected with the pointing device of the input device 23, are dragged to the desired nodes of the tree-structured data format.
  • The control unit 21 records the created hierarchical list in the hierarchical list database 222 of the storage 22. In the hierarchical list, each node corresponding to a component of the robot or a component of an object in the tree-structured data is associated with the corresponding three-dimensional model data recorded in the three-dimensional model database 223.
  • the task creating unit 102 has a function of creating a task based on an operation input of an operator specifying a component of a robot of hierarchical data and a component of an object.
  • The control unit 21 of the information processing device 2 creates a task based on an operation input in which the operator selects, from the hierarchical list, a node indicating the hand of the robot corresponding to the operation of gripping a specific object and a node corresponding to that object. By using the hierarchical list, the operator can create tasks intuitively according to the contents of the work elements.
  • The display control unit 101 displays the hierarchical list on the display device 24 divided into the robot area RA (first area), which hierarchically displays the components of the robot, and the object area PA (second area), which hierarchically displays the components of the objects.
  • This makes it easier to find the desired nodes when selecting the node indicating the robot hand corresponding to the operation of gripping a specific object and the node corresponding to that object.
  • The task update unit 103 has functions of deleting any task in the task list, adding a task included in the task database 221 of the storage 22 (an example of the first storage unit) to the task list, and changing the contents of any task in the task list, based on operation inputs by the operator on the task list displayed on the display device 24.
  • The control unit 21 of the information processing device 2 accesses the storage 22 based on the operation input of the operator accepted by the input device 23 and rewrites the task list included in the task database 221. Since the operator can edit the contents of the job in task units, the teaching work becomes efficient.
  • the order of tasks included in the task list displayed on the display device 24 may define the execution order of a plurality of tasks.
  • The task replacement unit 104 changes the order of the tasks included in the task list based on an operation input by the operator on the task list displayed on the display device 24.
  • The control unit 21 of the information processing device 2 accesses the storage 22 based on the operation input of the operator accepted by the input device 23 and updates the order of the task list included in the task database 221. Since the operator can change the execution order on a task-by-task basis, the job performed by the robot can be optimized and the teaching work made efficient.
  • The setting update unit 105 has a function of deleting, changing, or adding information on at least one of the approach point and the departure point set in the task by the task creation unit 102, based on an operation input by the operator.
  • The control unit 21 of the information processing device 2 accesses the storage 22 based on the operation input of the operator accepted by the input device 23 in the task creation window (see FIG. 6), and updates the approach point and departure point information of the task contained in the task database 221. Because the setting update unit 105 lets the operator delete, change, or add approach points and departure points of a work element at desired positions, the robot can be taught the operation the operator desires.
  • Since the execution result of a task can be reproduced as a moving image by the CAD software, the operator can delete, change, or add approach points and/or departure points after viewing the reproduced moving image, thereby teaching the robot an optimized motion.
  • the grouping unit 106 has a function of grouping the two or more objects based on the operation input of the operator specifying the two or more objects from the hierarchical list displayed on the display device 24.
  • The task creation unit 102 may set information on at least one of the approach point and the departure point set for any one of the grouped two or more objects also for the other objects in the group.
  • When the control unit 21 of the information processing device 2 accepts the operation input of the operator from the input device 23, the task database 221 is updated so that the two or more objects designated by the operation input are associated as the same group.
  • The control unit 21 also updates the task database 221 so that the approach points and departure points of the other grouped objects are set. As described above, the approach points and departure points of the other grouped objects are set in a manner that reflects the TCP offsets among the grouped objects. Grouping two or more objects thus makes it easy to set approach points and/or departure points for a plurality of objects.
  • the program creation unit 107 has a function of creating a program for causing the robot 5 to execute a work element corresponding to the task based on the task created by the task creation unit 102.
  • The control unit 21 of the information processing device 2 creates, based on the type of work element included in the task, the target of the work element, the approach point, the departure point, the motion parameters, and the interlock settings, functions in which a program for causing the robot to execute the work element corresponding to the task is written.
  • The program creation unit 107 refers to the information included in the task and automatically creates the program by rewriting the coordinate positions and the like in a predetermined program format, stored in the storage 22 (an example of the first storage unit), corresponding to the type of work element.
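  • One way to picture this is a per-work-element program format whose placeholders are filled in from the task. The format text and helper below are invented for illustration:

```python
# A per-work-element program format with placeholders; the template text
# and its fields are assumptions, not the publication's actual format.
PICKUP_FORMAT = """\
def {name}(robot):
    robot.move({approach_point})
    robot.move({target_point})
    robot.pick()
    robot.move({departure_point})
"""

def create_program(task):
    """Fill the stored format for the task's work element type with its data."""
    return PICKUP_FORMAT.format(**task)

source = create_program({
    "name": "Pickup_Pen1_From_PenTray",
    "approach_point": (100.0, 100.0, 20.0),
    "target_point": (100.0, 100.0, 0.0),
    "departure_point": (100.0, 100.0, 20.0),
})
print(source)  # the generated program text for this work element
```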
  • The program execution unit 108 has a function of executing, based on an operation input in which the operator selects at least one task from the task list displayed on the display device 24, a robot program including the programs created by the program creation unit 107 for the selected tasks.
  • the control unit 21 of the information processing device 2 transmits a robot program created based on at least one task to the robot control device 3 via the communication interface unit 25.
  • the control unit 31 executes the robot program.
  • a program created based on each task is composed of a collection of functions.
  • the control unit 31 sequentially executes programs created based on the tasks included in the task list.
  • The display control unit 101 displays, in different display modes, an error task, that is, a task whose execution result by the program execution unit 108 indicates an error, and the tasks other than the error task in the task list.
  • the control unit 31 of the robot control device 3 executes the robot program by sequentially executing the program created based on each task included in the task list. Then, the control unit 31 records execution log data including execution results for each task (result of success or error and error cause in case of error) in the execution log database 224.
  • Causes of error include, for example, speed limit violation, failure to reach the target point, and reaching a singular point.
  • the control unit 31 transmits execution log data to the information processing device 2 via the communication interface unit 33.
  • the control unit 21 of the information processing device 2 displays the task list on the display device 24 so that the execution result included in the execution log data is reflected.
  • The task list is displayed so that an error task, whose execution result indicates an error, and the tasks other than the error task have different display modes (see, for example, FIG. 11). The operator can therefore immediately recognize the work element in which an error occurred in the job, and can respond by correcting the task's approach point, departure point, or motion parameter settings.
  • the storage 22 (an example of a second storage unit and a third storage unit) stores a three-dimensional model database 223 including a three-dimensional model of the robot R in a virtual space and information of a three-dimensional model of an object.
  • the state information calculation unit 109 has a function of calculating robot state data (example of state information) which is information indicating the state of the robot according to the passage of time based on the execution result by the program execution unit 108.
  • the simulation unit 110 has a function of operating a three-dimensional model in a virtual space and displaying it on the display device 24 based on the robot state data obtained by the state information calculation unit 109.
  • As described above, the control unit 31 of the robot control device 3 executes the robot program received from the information processing device 2 by executing the programs created based on each task in order.
  • the control unit 31 acquires robot state data in the entire job as the execution result of the robot program.
  • the robot state data is data of a physical quantity (for example, joint angle of arm, velocity and acceleration of arm, trajectory of each part) indicating the state of the robot according to the passage of time.
  • the control unit 21 of the information processing device 2 acquires execution log data including robot state data from the robot control device 3 and records the execution log data in the execution log database 224.
  • The control unit 21 operates the three-dimensional models of the robot and the objects in the virtual space based on the execution log data and displays them on the display device 24. Since the operator can visually confirm the motion of the robot for each task, it becomes easy to reconsider the set values of each task (for example, the approach point, departure point, and motion parameters) and the arrangement of the objects (for example, the arrangement of the cap tray 12 in FIG. 3).
  • The determination unit 111 has a function of determining whether at least one object in the virtual space is reachable by the hand 52 of the robot R (an example of a component of the robot).
  • This function corresponds to the area check described above.
  • The control unit 21 of the information processing device 2 transmits coordinate data based on the three-dimensional models of the robot and the objects, and data on the TCP coordinates of the objects, to the robot control device 3 via the communication interface unit 25.
  • The control unit 31 of the robot control device 3 determines, based on the received data, whether the joint angles of the robot's arm can be obtained by inverse kinematics for a specific object. If the joint angles can be determined, the robot's hand can reach the object; if they cannot, the hand cannot reach it. Because the determination unit 111 can determine whether the robot's hand can reach a specific object before the robot program is created, the arrangement of the objects can be optimized in advance, improving the efficiency of the teaching work.
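  • The decision rule described here is simply "reachable if and only if an inverse-kinematics solution exists". A minimal sketch, assuming a solver solve_ik() that returns joint angles or None (the solver itself is outside the scope of this text):

```python
def is_reachable(solve_ik, object_tcp, posture=None):
    """Area check for one object: reachable iff an IK solution exists.

    solve_ik   -- inverse-kinematics solver; returns the arm's joint angles
                  for a target TCP position, or None when no solution exists
    object_tcp -- TCP coordinates of the object in world coordinates
    posture    -- optional robot posture to which the solution is constrained
    """
    return solve_ik(object_tcp, posture=posture) is not None

def area_check(solve_ik, objects, posture=None):
    """Return the names of unreachable objects, to be marked "NG"."""
    return {name for name, tcp in objects.items()
            if not is_reachable(solve_ik, tcp, posture)}
```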
  • The display control unit 101 may display the hierarchical data so that an unreachable object, that is, an object determined to be unreachable by the determination unit 111 among the plurality of objects, is distinguishable from the objects other than the unreachable object.
  • In the area check, as shown in FIG. 13, the hierarchical list is preferably displayed so that objects determined to be unreachable can be distinguished from the other objects. Although in FIG. 13 unreachable objects are identified by the presence or absence of the mark M2, various identification methods can be adopted; for example, display processing such as coloring the unreachable pen (Pen) red or reducing its luminance may be performed instead.
  • the posture selection unit 112 has a function of selecting any one of a plurality of predetermined postures for the robot based on the operation input of the operator. In that case, on the condition that the robot takes the posture selected by the posture selection unit 112, the determination unit 111 determines whether or not at least one object can be reached by the component of the robot.
  • The control unit 31 of the robot control device 3 determines whether the joint angles of the robot's arm can be obtained by inverse kinematics for a specific object in a state where the robot is fixed in the selected posture.
  • This allows the operator to study the arrangement of the objects and the like in advance while considering various postures of the robot.
  • The reference point setting unit 113 has a function of calculating the TCP (an example of a reference point) in the virtual space, which serves as the reference for the robot's work on an object, based on the position of the object's three-dimensional model in the virtual space, and associating it with the object.
  • The control unit 21 of the information processing device 2 sets the TCP of the target object based on the data of the object's three-dimensional model, and also sets the local coordinate system of the object with the TCP as its origin.
  • Because the teaching software and the CAD software are linked, the TCP of the object can be calculated automatically and accurately.
  • The reference point setting unit 113 preferably calculates the barycentric position of the object as the TCP.
  • The trajectory of the robot's hand is determined based on the TCP set for the hand corresponding to the operation of gripping the object, and on the TCP of the target object and its local coordinate system; therefore, work by the hand can be taught with high accuracy. Further, by calculating the barycentric position of the object as the TCP, a position that easily avoids interference with things other than the target object (other objects, equipment, and so on) can be designated as the TCP.
  • the model placement unit 114 has a function of placing and displaying a three-dimensional model of an object, a TCP, and a local coordinate system of the three-dimensional model with the TCP as an origin in a virtual space.
  • The control unit 21 of the information processing device 2 records the TCP and the data of the local coordinate system of the three-dimensional model whose origin is the TCP in the three-dimensional model database 223 of the storage 22, associated with each object.
  • The control unit 21 refers to the three-dimensional model database 223 to display the three-dimensional model of the object, the TCP of the object, and the local coordinate system on the display device 24.
  • The association of the TCP with the object need not be automatic; it may be performed based on the operator's operation input. That is, the display control unit 101 may display the three-dimensional model in the virtual space, and the reference point setting unit 113 may associate a predetermined point in the virtual space with an object of the hierarchical data as its TCP, based on an operation input of the operator making that association.
  • the reference point setting unit 113 may notify that the TCP is not associated with the object.
  • the input device 23 of the information processing apparatus 2 sets the TCP model (see FIG.
  • the control unit 21 updates the three-dimensional model database 223 of the storage 22 so as to associate the predetermined point with the target as the TCP of the target.
  • The control unit 21 can prompt the operator to associate a TCP with the object, for example by outputting a warning display on the display device 24.
  • The display control unit 101 may display the TCP associated with an object hierarchically below that object in the hierarchical data.
  • Specifically, the control unit 21 of the information processing device 2 reads out the TCP associated with each object from the hierarchical list database 222 and displays the TCP coordinates in the layer below each object in the hierarchical list. As a result, the operator can recognize the TCP coordinates of each object at a glance.
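  • As a rough sketch of that hierarchical display, the dictionary below stands in for the unpublished schema of the hierarchical list database 222: objects one level up, TCP coordinates one level down, with a warning for an object whose TCP is missing:

```python
# Hypothetical entries read from the hierarchical list database:
hierarchy = {
    "workpiece_A": {"tcp": (0.120, 0.045, 0.030)},
    "workpiece_B": {"tcp": None},   # no TCP yet -> triggers the warning
}

for name, entry in hierarchy.items():
    print(name)
    tcp = entry["tcp"]
    if tcp is None:
        print("    [warning] no TCP associated with this object")
    else:
        print("    TCP: x=%.3f, y=%.3f, z=%.3f" % tcp)
```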
  • The interlock setting unit 115 has a function of setting, based on the operation input of the operator, a timeout value for the standby time during which the robot's operation waits at at least one of the target point, the approach point, and the departure point set in a task.
  • Specifically, based on the operation input of the operator accepted by the input device 23 (for example, an operation input in the teaching window W3 of FIG. 6), the control unit 21 of the information processing device 2 accesses the storage 22 and rewrites the timeout value of the interlock waiting time set at a specific approach point or departure point of a task in the task database 221.
  • With the interlock setting unit 115, the operator can easily set the timeout value of the standby time due to an interlock for each work element when performing off-line teaching of the robot.
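  • As a data-model sketch only (the schema of the task database 221 is not published), the per-point timeout could be carried on each waypoint of a task, so that the rewrite performed by the interlock setting unit becomes a one-field update; all names below are invented:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class WayPoint:
    position: tuple                               # (x, y, z) in the work frame
    interlock_timeout_s: Optional[float] = None   # None = no interlock wait

@dataclass
class Task:
    work_element: str                             # e.g. "grip", "place"
    points: dict = field(default_factory=dict)    # "approach"/"target"/"departure"

def set_interlock_timeout(task: Task, point_name: str, timeout_s: float) -> None:
    """What the interlock setting unit does in essence: rewrite the timeout
    stored at a specific approach or departure point of a task."""
    task.points[point_name].interlock_timeout_s = timeout_s

grip = Task("grip", {
    "approach":  WayPoint((0.30, 0.10, 0.25)),
    "target":    WayPoint((0.30, 0.10, 0.12)),
    "departure": WayPoint((0.30, 0.10, 0.25)),
})
set_interlock_timeout(grip, "approach", 5.0)   # operator sets 5 s at the approach point
```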
  • The simulation unit 110 may operate the three-dimensional model in the virtual space with the standby time due to the interlock invalidated. That is, when the simulation program is executed, the three-dimensional model is operated without waiting for the interlock. As a result, the operator can check the execution result of the program while paying attention to the movement of the robot, without the robot stopping in the virtual space.
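  • One plausible way to realize invalidating the wait in simulation is a guard flag in the wait routine itself. In the sketch below, the boolean-returning signal_ready callable is an assumption, not a disclosed interface:

```python
import time

class InterlockTimeout(Exception):
    """Raised when an interlock is not released within its timeout."""

def wait_for_interlock(signal_ready, timeout_s, simulation=False, poll_s=0.01):
    """Block until signal_ready() returns True. In simulation mode (or when
    no timeout is configured) the wait is skipped entirely."""
    if simulation or timeout_s is None:
        return
    deadline = time.monotonic() + timeout_s
    while not signal_ready():
        if time.monotonic() >= deadline:
            raise InterlockTimeout("interlock not released within %.1f s" % timeout_s)
        time.sleep(poll_s)
```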
  • FIG. 15 is an example of a sequence chart showing an area check process of the robot teaching device according to the present embodiment.
  • FIG. 16 is an example of a sequence chart showing job execution processing of the robot teaching device according to the present embodiment.
  • The determination unit 111 determines whether or not the target specified when "area check" is selected can be reached by the hand 52 of the robot R in the virtual space, and displays the reachability result on the display device 24. An area check can also be performed by designating two or more objects. Specifically, the processing of the determination unit 111 proceeds as follows.
  • The information processing device 2 transmits the coordinate data of the three-dimensional models of the robot R and each object, together with the TCP coordinate data of each object, to the robot control device 3 via the communication interface unit 25 (step S14).
  • The robot control device 3 calculates, by inverse kinematics, the joint angles of the arm 51 of the robot R for each designated object based on the received data (step S16).
  • The robot control device 3 determines whether the arm 51 can reach each object based on whether or not a joint-angle solution was obtained in step S16 (step S18).
  • The processes of steps S16 and S18 are performed one by one for all the postures of the robot R selected by the posture selection unit 112.
  • When the processing for all the postures is completed (step S20: YES), the robot control device 3 returns the area check result (the reachability result) for each posture of the robot R to the information processing device 2 (step S22).
  • The information processing device 2 displays the received area check result on the display device 24, for example as shown in FIG. 13 (step S24).
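  • The essence of steps S16 to S18 is that an object is reachable if and only if the inverse-kinematics solver returns a joint-angle solution. The sketch below uses a stub solver, since the controller's actual inverse kinematics is not published:

```python
def area_check(objects, postures, solve):
    """Steps S16/S18 in miniature: for each selected posture, an object
    counts as reachable iff a joint-angle solution is obtained."""
    return {
        posture: {name: solve(pose, posture) is not None
                  for name, pose in objects.items()}
        for posture in postures
    }

# Stub solver for illustration: pretend any TCP within 0.5 m of the base
# is solvable; a real solver would return the joint angles or None.
def fake_solve(pose, posture):
    x, y, z = pose
    return (0.0,) * 6 if (x * x + y * y + z * z) ** 0.5 < 0.5 else None

objects = {"workpiece_A": (0.3, 0.1, 0.1), "shelf_far": (1.2, 0.0, 0.4)}
print(area_check(objects, ["posture_1"], fake_solve))
# {'posture_1': {'workpiece_A': True, 'shelf_far': False}}
```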
  • When execution of the job is instructed (step S30: YES), the program creation unit 107 creates a robot program for causing the robot R to execute the job.
  • The robot program includes a plurality of functions, each of which describes a program for causing the robot R to execute the work element corresponding to a task in the task list of the job.
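  • Because the robot program is described as a set of per-task functions, program creation can be pictured as wrapping each task of the task list in a callable. The sketch reuses the Task shape from the earlier sketch; plain Python stands in for the undisclosed robot language, and the robot.move_to and robot.do methods are assumptions:

```python
def make_task_function(task):
    """Wrap one work element as a function, the way the program creation
    unit turns each task in the task list into a function of the program."""
    def run(robot):
        robot.move_to(task.points["approach"].position)
        robot.move_to(task.points["target"].position)
        robot.do(task.work_element)        # e.g. close the gripper for "grip"
        robot.move_to(task.points["departure"].position)
    run.__name__ = "task_" + task.work_element
    return run

def create_robot_program(task_list):
    """The whole program is just the ordered list of per-task functions."""
    return [make_task_function(t) for t in task_list]
```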
  • The created robot program is transmitted from the information processing device 2 to the robot control device 3 (step S34) and executed there. That is, the program execution unit 108 executes the robot program in task units (step S36).
  • The state information calculation unit 109 calculates robot state data, which is information indicating the state of the robot R over time, based on the execution result of the program execution unit 108, and records execution log data including the robot state data in the storage 32 (step S38).
  • The processes of steps S36 and S38 are repeated until all tasks included in the task list are completed.
  • The robot control device 3 transmits the execution log data recorded in the storage 32 to the information processing device 2 (step S42).
  • The information processing device 2 records the received execution log data in the execution log database 224.
  • The simulation unit 110 operates the three-dimensional model in the virtual space based on the robot state data (that is, the robot state data obtained by the state information calculation unit 109) included in the execution log data received in step S42, and displays it on the display 24 (step S44).
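  • Steps S36 to S44 reduce to: run the program task by task, record time-stamped state data, and later drive the virtual model from the log. A compact sketch follows; the log fields and the robot/model accessors are assumptions, not disclosed interfaces:

```python
import json
import time

def execute_job(program, robot, log_path="execution_log.json"):
    """Steps S36/S38: execute the program task by task and record
    time-stamped robot state data after each task."""
    log, t0 = [], time.monotonic()
    for task_fn in program:
        task_fn(robot)
        log.append({
            "t": time.monotonic() - t0,
            "task": task_fn.__name__,
            "joints": list(robot.joint_angles),   # assumed accessor
        })
    with open(log_path, "w") as f:
        json.dump(log, f)
    return log

def replay(log, model):
    """Step S44: drive the three-dimensional model from the recorded states."""
    for entry in log:
        model.set_joint_angles(entry["joints"])   # assumed model API
```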
  • (5-1) Modification 1: In the first modification, the determination unit 111 specifies the degree of proximity of the hand 52 of the robot R (an example of a component of the robot) to an unreachable object, that is, an object determined not to be reachable by the hand 52.
  • The display control unit 101 then displays the degree of proximity of the hand 52 of the robot R to the unreachable object in a recognizable form.
  • A display example of an index indicating the degree of proximity of the hand 52 is shown in FIG.
  • The mark M3 is displayed for the unreachable object.
  • The mark M3 has a display format of "NG (*)" (*: 1 to 5), and the number in the parentheses indicates the degree of proximity (for example, the smaller the number, the closer the hand is to reaching the object).
  • The degree of proximity can be calculated from the difference between the joint angle obtained for the arm 51 and the limit angle of that joint.
  • (5-2) Modification 2: In the second modification, information is provided on how much margin the hand 52 has with respect to an object determined to be reachable by the hand 52 of the robot R. Providing such information can also support off-line teaching; for example, useful information for relocating objects can be obtained.
  • The determination unit 111 specifies the margin with which the hand 52 of the robot R reaches a reachable object, that is, an object determined to be reachable by the hand 52, and the display control unit 101 displays that margin in a recognizable form. For example, as shown in FIG. 17, the mark M4 is displayed for the reachable object.
  • The mark M4 has a display format of "OK (*)" (*: 1 to 5), and the number in the parentheses indicates the degree of margin (for example, the larger the number, the more margin).
  • The margin can likewise be calculated from the difference between the joint angle obtained for the arm 51 and the limit angle of that joint.
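  • Both marks can be derived from the same quantity, namely how far the solved joint angles sit from their limits. The sketch below buckets that distance into the 1-to-5 scale of the marks; the bucket thresholds are invented for illustration:

```python
def limit_distance_deg(joints, limits):
    """Smallest angular distance (degrees) from any joint to its nearer limit."""
    return min(min(angle - lo, hi - angle) for angle, (lo, hi) in zip(joints, limits))

def grade(distance_deg, reachable):
    """Map the distance onto the 1-5 scale of the marks: NG(1) = closest to
    becoming reachable, OK(5) = largest margin. Bucket edges are illustrative."""
    level = 1 + sum(distance_deg > edge for edge in (5, 10, 20, 40))
    return ("OK(%d)" if reachable else "NG(%d)") % level

limits = [(-170.0, 170.0)] * 6
print(grade(limit_distance_deg([0, 30, -45, 10, 80, 160], limits), True))  # OK(2)
```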
  • (5-3) Modification 3: The determination unit 111 may determine whether or not each of a plurality of objects can be reached by the hand 52 of the robot R. That is, based on an operation input that selects the robot R in the hierarchical list, the reachability of all objects may be determined. This saves the operator the trouble of selecting each object to be subjected to the area check. In addition, determining the reachability of all objects provides useful information for considering the overall arrangement of the objects.
  • (5-4) Modification 4: The program creation unit 107 may create the program so that an error is determined when the standby time due to an interlock reaches the timeout value. If the standby time reaches the timeout value while the robot program is executed, judging this as an error lets the operator recognize, for example, that there is a defect in an input/output signal to the robot R.
  • (5-5) Modification 5: The program creation unit 107 may create the program so as to return the robot R to a predetermined reference posture when an error due to interlock timeout is determined. When an error is determined as in the fourth modification, it may be preferable to stop the operation of the robot R and return it to the reference posture (for example, the initial posture) rather than continuing the operation after the error.
  • (5-6) Modification 6: When an error due to interlock timeout is determined, the program creation unit 107 may refer to the task database 221 of the storage 22 (an example of the first storage unit) and create the program so as to cause the robot R to execute the work element corresponding to one of the tasks selected based on the operator's operation input.
  • Alternatively, the robot R may be made to execute a specific work element; an example of such a work element is placing the currently grasped object on an error shelf.
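  • Modifications 4 to 6 combine naturally into the error path of the generated program: treat the timeout as an error, return the robot to its reference posture, and optionally run an operator-selected recovery task such as placing the grasped object on the error shelf. The sketch reuses InterlockTimeout from the earlier sketch; the controller calls are assumptions:

```python
def run_with_recovery(program, robot, recovery_task=None):
    """On interlock timeout: report the error, return the robot to its
    reference posture, then run the selected recovery task, if any."""
    try:
        for task_fn in program:
            task_fn(robot)
    except InterlockTimeout as err:
        print("error: %s - check the I/O signals to the robot" % err)
        robot.move_to_reference_posture()   # assumed controller call
        if recovery_task is not None:
            recovery_task(robot)            # e.g. place the part on the error shelf
```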
  • The present invention is not limited to the above-described embodiment; various modifications and changes can be made without departing from the spirit of the present invention.
  • The robot teaching device according to the above-described embodiment does not have to have all the functions described in the functional block diagram of FIG. 14; it may have at least a part of those functions.
  • The robot teaching device has been illustrated as comprising two devices, the information processing device and the robot control device, but the invention is not limited thereto and may be configured as an integrated device.
  • The input device of the information processing apparatus has been described as including a pointing device, but the present invention is not limited thereto, and another device may be used.
  • For example, a display panel provided with a touch input function may be used to receive touch input from the operator.
  • 107 ... program creation unit, 108 ... program execution unit, 109 ... state information calculation unit, 110 ... simulation unit, 111 ... determination unit, 112 ... posture selection unit, 113 ... reference point setting unit, 114 ... model placement unit, 115 ... interlock setting unit, 221 ... task database, 222 ... hierarchical list database, 223 ... three-dimensional model database, 224 ... execution log database, RA ... robot area, PA ... object area, W1 to W6 ... window, b1 to b4 ... button, AP ... approach point, TP ... target point, DP ... departure point, T ... table

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The present invention makes it possible for an operator, when performing off-line teaching of a robot, to easily set a timeout value for a standby time triggered by an interlock, for each work element. One embodiment of the present invention relates to a robot teaching device comprising: a task creation unit (102) that creates tasks, each task being information relating to a work element performed by a robot on a target object, based on an operation input by an operator, and that sets in each task information relating to at least one of a first point, which is a passing point before a target point of the work element is reached, and a second point, which is a passing point after the target point has been reached; and an interlock setting unit (115) that sets, based on an operation input by the operator, a timeout value for a standby time for making the robot's operation wait at at least one of the first point and the second point set in the task.
PCT/JP2018/028996 2017-09-26 2018-08-02 Dispositif d'apprentissage de robot Ceased WO2019064919A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2019544371A JPWO2019064919A1 (ja) 2017-09-26 2018-08-02 Robot teaching device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-184653 2017-09-26
JP2017184653 2017-09-26

Publications (1)

Publication Number Publication Date
WO2019064919A1 true WO2019064919A1 (fr) 2019-04-04

Family

ID=65903572

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/028996 Ceased WO2019064919A1 (fr) 2017-09-26 2018-08-02 Dispositif d'apprentissage de robot

Country Status (2)

Country Link
JP (1) JPWO2019064919A1 (fr)
WO (1) WO2019064919A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JP2021091055A (ja) * 2019-12-12 2021-06-17 Keyence Corporation Measuring device
JPWO2022009921A1 (fr) * 2020-07-10 2022-01-13
  • WO2023203635A1 (fr) * 2022-04-19 2023-10-26 Fanuc Corporation Simulation device for calculating the operating state of a robot device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JPH08292809A (ja) * 1995-04-20 1996-11-05 Mitsubishi Electric Corp Numerical control method and numerical control device
  • WO1997011416A1 (fr) * 1995-09-19 1997-03-27 Kabushiki Kaisha Yaskawa Denki Robot language processor
  • JPH09212219A (ja) * 1996-01-31 1997-08-15 Fuji Facom Corp Three-dimensional virtual model creation device and monitoring/control device for a controlled object
  • WO2016103307A1 (fr) * 2014-12-26 2016-06-30 Kawasaki Heavy Industries, Ltd. Method for generating robot operation program, and device for generating robot operation program
  • JP2016209969A (ja) * 2015-05-12 2016-12-15 Canon Inc. Information processing method and information processing apparatus
  • JP2017024097A (ja) * 2015-07-17 2017-02-02 Fanuc Corporation Automatic assembly system using robots


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
  • JP2021091055A (ja) * 2019-12-12 2021-06-17 Keyence Corporation Measuring device
  • JP7538595B2 (ja) 2019-12-12 2024-08-22 Keyence Corporation Measuring device
JPWO2022009921A1 (fr) * 2020-07-10 2022-01-13
  • WO2022009921A1 (fr) * 2020-07-10 2022-01-13 Fanuc Corporation Trajectory generation device and automatic position control device
  • CN115769157A (zh) * 2020-07-10 2023-03-07 Fanuc Corporation Trajectory generation device and automatic position control device
US20230256600A1 (en) * 2020-07-10 2023-08-17 Fanuc Corporation Trajectory generation device and automatic position control device
  • JP7509880B2 (ja) 2024-07-02 Fanuc Corporation Trajectory generation device and automatic position control device
US12290932B2 (en) * 2020-07-10 2025-05-06 Fanuc Corporation Trajectory generation device and automatic position control device
  • CN115769157B (zh) 2025-06-13 Fanuc Corporation Trajectory generation device and automatic position control device
  • WO2023203635A1 (fr) * 2022-04-19 2023-10-26 Fanuc Corporation Simulation device for calculating the operating state of a robot device
JPWO2023203635A1 (fr) * 2022-04-19 2023-10-26

Also Published As

Publication number Publication date
JPWO2019064919A1 (ja) 2020-10-15

Similar Documents

Publication Publication Date Title
US11958190B2 (en) Information processing method and information processing apparatus
WO2019064916A1 (fr) Simulateur de robot
EP1310844B1 (fr) Dispositif de simulation
JP2019171501A (ja) ロボットの干渉判定装置、ロボットの干渉判定方法、プログラム
JP7259860B2 (ja) ロボットの経路決定装置、ロボットの経路決定方法、プログラム
JP2019171498A (ja) ロボットプログラム実行装置、ロボットプログラム実行方法、プログラム
JP7151713B2 (ja) ロボットシミュレータ
KR20160002329A (ko) 로봇 시뮬레이터 및 로봇 시뮬레이터의 파일 생성 방법
JP2019018272A (ja) モーション生成方法、モーション生成装置、システム及びコンピュータプログラム
WO2019064919A1 (fr) Dispositif d'apprentissage de robot
US20220281103A1 (en) Information processing apparatus, robot system, method of manufacturing products, information processing method, and recording medium
JP7167925B2 (ja) ロボット教示装置
WO2020059342A1 (fr) Simulateur de robot
US20240165811A1 (en) Device for setting safety parameters, teaching device and method
JP7099470B2 (ja) ロボット教示装置
JP7024795B2 (ja) ロボット教示装置
JP2021151696A (ja) 情報処理方法、情報処理装置、ロボット装置、情報処理プログラム、およびコンピュータ読み取り可能な記録媒体
JP2019171500A (ja) ロボットの干渉判定装置、ロボットの干渉判定方法、プログラム
JP2019171499A (ja) ロボットの干渉判定装置、ロボットの干渉判定方法、プログラム
JPH06134684A (ja) ロボット軌道を教示するための方法
WO2020066947A1 (fr) Dispositif de détermination d'itinéraire de robot, procédé de détermination d'itinéraire de robot, et programme
TW202506358A (zh) 資料提供裝置、機器人系統、資料提供方法、機器人的控制方法、資料提供程式及控制程式
CN120603684A (zh) 示教数据编辑装置、机器人示教系统以及示教数据编辑方法
CN115735165A (zh) 数值控制系统

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18863160

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2019544371

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18863160

Country of ref document: EP

Kind code of ref document: A1