Detailed Description
Next, modes for carrying out the present disclosure will be described with reference to the drawings.
Fig. 1 is an external perspective view of a robot. Fig. 2 is a block diagram showing an electrical connection relationship between the robot and the control device.
The robot system 1 is a work robot system that, for example, picks up a workpiece W (component) supplied from a workpiece supply unit P and mounts the workpiece W on a substrate S. As shown in figs. 1 and 2, the robot system 1 includes a work table 2, a robot 10, a hand-held camera 60 attached to a robot arm 20 of the robot 10, a work camera 70 as an external camera, and a control device 90 for controlling the robot 10. The work table 2 is provided with a main lever 3a extending in the vertical direction and a sub lever 3b extending in the vertical direction so as to be separated from the main lever 3a.
In the present embodiment, the robot 10 is a SCARA robot, and includes a base 11 and a robot arm 20.
The base 11 is fixed to the table 2, and supports the base end side of the robot arm 20. The robot arm 20 includes a first arm 21, a first arm driving unit 30, a second arm 22, a second arm driving unit 40, a shaft 23, and a shaft driving unit 50. The base end portion of the first arm 21 is coupled to the base 11 via a first joint axis J1, and the first arm 21 is configured to be rotatable (horizontally rotatable) in a horizontal plane with respect to the base 11 by rotation of the first joint axis J1. The base end portion of the second arm 22 is coupled to the tip end portion of the first arm 21 via a second joint axis J2, and the second arm 22 is configured to be rotatable (horizontally rotatable) in a horizontal plane with respect to the first arm 21 by rotation of the second joint axis J2. The shaft 23 is coupled to the distal end portion of the second arm 22 via the third joint shaft J3, and is rotatable about the axis of the third joint shaft J3 with respect to the second arm 22 and is configured to be movable up and down in the axial direction of the third joint shaft J3. In the robot system 1 of the present embodiment, a workpiece holding portion 24 that picks up and holds the workpiece W is provided as an end effector at the tip of the shaft 23. Examples of the workpiece holding portion 24 include a suction nozzle that sucks the workpiece W by negative pressure, a mechanical chuck that holds the workpiece W by a pair of claws, and an electromagnetic chuck that sucks the workpiece W by an electromagnet.
The first arm driving unit 30 includes a motor 32 and an encoder 34. The rotation shaft of the motor 32 is coupled to the first joint shaft J1 via a reduction gear, not shown. The first arm driving unit 30 drives the motor 32 to transmit torque to the first joint shaft J1 via the speed reducer, thereby rotating the first arm 21 about the first joint shaft J1 as a fulcrum. The encoder 34 is mounted on the rotation shaft of the motor 32, and is configured as a rotary encoder for detecting the rotational displacement amount of the motor 32.
The second arm driving unit 40 includes a motor 42 and an encoder 44, similar to the first arm driving unit 30. The rotation shaft of the motor 42 is coupled to the second joint shaft J2 via a reduction gear, not shown. The second arm driving unit 40 drives the motor 42 to transmit torque to the second joint shaft J2 via the speed reducer, thereby rotating the second arm 22 about the second joint shaft J2 as a fulcrum. The encoder 44 is mounted on the rotation shaft of the motor 42, and is configured as a rotary encoder for detecting the rotational displacement amount of the motor 42.
The shaft driving section 50 includes motors 52a and 52b and encoders 54a and 54b. The rotation shaft of the motor 52a is connected to the shaft 23 via a belt, not shown. The shaft driving unit 50 drives the motor 52a to rotate the shaft 23 about its axis. The rotation shaft of the motor 52b is connected to the shaft 23 via a ball screw mechanism, not shown. The shaft driving unit 50 drives the motor 52b and converts the rotational motion of the motor 52b into linear motion with the ball screw mechanism, thereby raising and lowering the shaft 23. The encoder 54a is configured as a rotary encoder for detecting the rotational displacement amount of the shaft 23. The encoder 54b is a linear encoder for detecting the lifting position of the shaft 23.
As shown in fig. 2, the control device 90 includes a CPU91, a ROM92 storing processing programs, a RAM93 serving as a work memory, a storage device 94 such as an HDD or an SSD, and an input/output interface (not shown). Position signals from the encoders 34, 44, 54a, and 54b, image signals from the hand-held camera 60, and the like are input to the control device 90 via the input/output interface. Drive signals for the motors 32, 42, 52a, and 52b and the like are output from the control device 90 via the input/output interface.
The hand-held camera 60 is mounted to the front end portion of the second arm 22. The hand-held camera 60 photographs the workpiece W in the workpiece supply section P from above and outputs the captured image to the control device 90. The control device 90 recognizes the position of the workpiece W by processing the captured image. Further, since the hand-held camera 60 is attached to the second arm 22, the robot arm 20 cannot move the hand-held camera 60 in the up-down direction in the present embodiment.
The workpiece camera 70 is provided on the work table 2 between the workpiece supply portion P and the substrate S. The workpiece camera 70 is accommodated in a housing box 71 having a rectangular opening 71a in its upper portion. The workpiece camera 70 photographs the workpiece W held by the workpiece holding unit 24 of the robot 10 from below and outputs the captured image to the control device 90. The control device 90 determines whether the workpiece W is normally held by the workpiece holding portion 24 by processing the captured image.
Next, the operation of the robot system 1 configured as described above will be described. The operations of the robot system 1 include a work position recognition operation, a pickup operation, a pickup confirmation operation, and a mounting operation. In the work position recognition operation, the hand-held camera 60 photographs the workpiece W to recognize its position. In the pickup operation, the fingertip (workpiece holding section 24) of the robot system 1 is moved to the recognized position of the workpiece W to pick up the workpiece W. In the pickup confirmation operation, the workpiece W picked up by the fingertip is moved above the workpiece camera 70 and photographed by the workpiece camera 70 to confirm whether the pickup state of the workpiece W is good. In the mounting operation, the picked-up workpiece W is mounted on the substrate S.
Here, the coordinate systems of the robot system 1 will be described. In the present embodiment, as shown in figs. 2 and 3, a world coordinate system Σw, a base coordinate system Σb, a camera coordinate system Σc, and a mechanical interface coordinate system Σm are provided as the coordinate systems of the robot system 1. In the world coordinate system Σw, the origin is set at the front end of the main lever 3a, the X axis is set in the direction passing through the front ends of the main lever 3a and the sub lever 3b, the Z axis is set in the vertical direction, and the Y axis is set in the direction orthogonal to the X axis and the Z axis. In the base coordinate system Σb, the origin is set on the bottom surface of the base 11 of the robot 10, and the X, Y, and Z axes are set so that their directions coincide with the corresponding axes of the world coordinate system Σw. In the mechanical interface coordinate system Σm, the origin is set at the fingertip of the robot arm 20, and the X, Y, and Z axes are set so that their directions coincide with the corresponding axes of the world coordinate system Σw. In the camera coordinate system Σc, the origin is set on the bottom surface of the second arm 22 at the center of the third joint axis J3, and the X, Y, and Z axes are set so that their directions coincide with the corresponding axes of the world coordinate system Σw. The world coordinate system Σw, the base coordinate system Σb, the camera coordinate system Σc, and the mechanical interface coordinate system Σm are converted into one another using conversion matrices.
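The frame relationships above can be illustrated with planar rigid transforms. The following Python sketch (illustrative only; the pose and point values are assumptions, not values from the embodiment) converts a point between a frame such as Σb and the world frame Σw, given the frame's origin position and yaw in the world:

```python
import math

def world_from_frame(pose, p):
    """Map a point p = (x, y) expressed in a frame whose pose = (x0, y0, theta)
    gives the frame origin and yaw in the world, into world coordinates."""
    x0, y0, th = pose
    px, py = p
    return (x0 + px * math.cos(th) - py * math.sin(th),
            y0 + px * math.sin(th) + py * math.cos(th))

def frame_from_world(pose, p):
    """Inverse mapping: a world point into the frame's coordinates."""
    x0, y0, th = pose
    dx, dy = p[0] - x0, p[1] - y0
    return (dx * math.cos(th) + dy * math.sin(th),
            -dx * math.sin(th) + dy * math.cos(th))
```

Chaining such transforms (world to base, base to camera, and so on) plays the role of the conversion matrices mentioned above.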
In the work position recognition operation, the CPU91 of the control device 90 first sets the target position Xbtag, Ybtag, Zbtag of the hand-held camera 60, as observed from the base 11 (the base coordinate system Σb shown in fig. 1), for photographing the workpiece W supplied from the workpiece supply portion P. Next, the CPU91 solves inverse kinematics for the target position Xbtag, Ybtag, Zbtag of the hand-held camera 60, thereby calculating the angle θJ1 of the first joint axis J1 and the angle θJ2 of the second joint axis J2 that bring the position of the hand-held camera 60 into agreement with the target position Xbtag, Ybtag, Zbtag.
Next, the CPU91 solves forward kinematics based on the angle θJ1 of the first joint axis J1 and the angle θJ2 of the second joint axis J2, taking into account error factors such as torsion and deflection of the first arm 21 and deflection of the second arm 22, thereby calculating the estimated position Xbest, Ybest, Zbest of the hand-held camera 60. That is, assuming that the angles θJ1, θJ2 of the first and second joint axes J1, J2 calculated by inverse kinematics are used as angle command values to bring the position of the hand-held camera 60 into coincidence with the target position Xbtag, Ybtag, Zbtag, the CPU91 calculates the estimated position Xbest, Ybest, Zbest, which deviates from the target position Xbtag, Ybtag, Zbtag owing to errors such as torsion and deflection of the first arm 21 and deflection of the second arm 22.
After calculating the estimated position Xbest, Ybest, Zbest, the CPU91 calculates the positional offsets ΔXb, ΔYb, ΔZb by taking the differences between the calculated estimated position Xbest, Ybest, Zbest and the target position Xbtag, Ybtag, Zbtag. Next, the CPU91 solves inverse kinematics for a position obtained by shifting the target position Xbtag, Ybtag, Zbtag by the positional offsets ΔXb, ΔYb, ΔZb, thereby calculating the angle command value θJ1* of the first joint axis J1 and the angle command value θJ2* of the second joint axis J2 for moving the hand-held camera 60 to the target position Xbtag, Ybtag, Zbtag. Then, the CPU91 controls the motors 32 and 42 by feedback control so that the angle of the first joint axis J1 detected by the encoder 34 coincides with the calculated angle command value θJ1* and the angle of the second joint axis J2 detected by the encoder 44 coincides with the calculated angle command value θJ2*.
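The correction described in the last two paragraphs — inverse kinematics to the target, forward kinematics through an error model, then inverse kinematics to the offset-shifted target — can be sketched for a two-link planar (SCARA) arm as follows. The link lengths and the deflection model are illustrative assumptions, not values from the present embodiment:

```python
import math

L1, L2 = 0.30, 0.25  # assumed link lengths in metres

def fk(t1, t2):
    """Forward kinematics of a two-link planar arm."""
    return (L1 * math.cos(t1) + L2 * math.cos(t1 + t2),
            L1 * math.sin(t1) + L2 * math.sin(t1 + t2))

def ik(x, y):
    """Inverse kinematics, elbow-down solution."""
    c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2 * L1 * L2)
    t2 = math.acos(max(-1.0, min(1.0, c2)))
    t1 = math.atan2(y, x) - math.atan2(L2 * math.sin(t2), L1 + L2 * math.cos(t2))
    return t1, t2

def compensated_ik(xt, yt, deflection):
    """One correction pass: solve IK for the target, predict the deflected
    position via FK plus an error model, then re-solve IK for the target
    shifted by the predicted deviation (as in the offsets dXb, dYb)."""
    t1, t2 = ik(xt, yt)
    xe, ye = deflection(*fk(t1, t2))   # estimated (deflected) position
    dx, dy = xe - xt, ye - yt          # positional offset
    return ik(xt - dx, yt - dy)        # command angles for the shifted target
```

With a slowly varying deflection, the arm driven by the compensated command angles lands on the original target despite the modeled error.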
Next, the CPU91 photographs the workpiece W supplied from the workpiece supply section P with the hand-held camera 60 and performs image processing on the captured image, thereby recognizing the position of the workpiece W. Then, the CPU91 shifts to the pickup operation, in which the target position Xbtag, Ybtag of the fingertip (workpiece holding portion 24) for picking up the workpiece W is set to the recognized position of the workpiece W and the fingertip is moved to the target position Xbtag, Ybtag, Zbtag to pick up the workpiece W. The target position Xbtag, Ybtag is set based on the recognized position of the workpiece W. The target position Zbtag is set based on height information of the workpiece W input in advance.
In the pickup operation, the CPU91 first calculates, by inverse kinematics, the angle θJ1 of the first joint axis J1, the angle θJ2 of the second joint axis J2, the angle (shaft angle) θJ4 of the shaft 23, and the lifting position Zs of the shaft 23 for moving the fingertip to the set target position (Xbtag, Ybtag, Zbtag). Next, the CPU91 calculates the estimated position Xbest, Ybest, Zbest of the fingertip by forward kinematics based on the calculated θJ1, θJ2, θJ4, Zs, taking into account error factors such as torsion and deflection of the first arm 21 and deflection of the second arm 22. Next, the CPU91 calculates the positional offsets ΔXb, ΔYb, ΔZb by taking the differences between the estimated position Xbest, Ybest, Zbest of the fingertip and the target position Xbtag, Ybtag, Zbtag. Next, the CPU91 solves inverse kinematics for a position obtained by shifting the target position Xbtag, Ybtag, Zbtag by the positional offsets ΔXb, ΔYb, ΔZb, thereby calculating the angle command value θJ1* of the first joint axis J1, the angle command value θJ2* of the second joint axis J2, the angle command value θJ4* of the shaft 23, and the lifting position command value Zs* of the shaft 23 for moving the fingertip to the target position Xbtag, Ybtag, Zbtag. Then, the CPU91 controls the corresponding motors 32, 42, 52a, and 52b by feedback control based on the respective command values.
In this way, the robot system 1 picks up the workpiece W by recognizing its position from an image captured by the hand-held camera 60 attached to the robot arm 20 and moving the fingertip of the robot arm 20 to the recognized position. Therefore, if the positional relationship between the robot system 1 and the hand-held camera 60 is not properly grasped, the position of the workpiece W cannot be accurately recognized with the hand-held camera 60, and the workpiece W cannot be picked up appropriately by the fingertip of the robot system 1. For this reason, in the present embodiment, the positional deviation of the hand-held camera 60 from its design position is measured in advance, and the measured positional deviation is reflected in the control of the work position recognition operation so that the position of the workpiece W can be recognized accurately. The robot system 1 is thereby calibrated.
Fig. 5 is an explanatory diagram showing the procedure for measuring the position of the hand-held camera. In step S10, the marked end effector EE (marker pin) is mounted to the front end of the shaft 23. Fig. 6 is an external view of the marked end effector EE. The marked end effector EE is a pin-shaped member that extends in the vertical direction in a state of being attached to the shaft 23. A mark M is provided at the front end of the pin of the marked end effector EE.
In step S20, the robot system 1 is caused to execute a work camera position measurement process for measuring the position of the work camera 70 in the world coordinate system Σw (the work camera position Xwwc, Ywwc, θwwc for hand-held camera position measurement). Fig. 7 is a flowchart showing an example of the work camera position measurement processing performed by the CPU91 of the control device 90. In this processing, the CPU91 first receives an input of the control point Zm (a unique value) of the marked end effector EE from the operator (step S100). The control point Zm is the distance between the front end of the shaft 23 and the front end (mark M) of the marked end effector EE. The coordinates of the front end of the marked end effector EE are thus set to a point separated downward by Zm in the Z-axis direction from the origin of the mechanical interface coordinate system Σm. Next, the CPU91 stands by until the operator presses the work camera position measurement button (step S110). When determining that the work camera position measurement button has been pressed, the CPU91 moves the marked end effector EE above the work camera 70 (step S120), and photographs the mark M provided on the distal end surface of the marked end effector EE with the work camera 70 to measure the position of the work camera 70 (the work camera position Xwwc, Ywwc, θwwc) (step S130). Fig. 8 is an explanatory diagram showing how the position of the work camera for hand-held camera position measurement is measured. As shown in the figure, the CPU91 photographs the mark M with the work camera 70 while moving the tip of the marked end effector EE in the X-axis direction and the Y-axis direction, respectively, so as to pass through the designed center position of the work camera 70.
The CPU91 acquires the movement tracks, in the X-axis direction and the Y-axis direction, of the mark M appearing in the captured images. Then, as shown in fig. 9, the CPU91 calculates the positional offsets ΔX, ΔY of the work camera 70 from the differences between the image center of the captured image and the intersection of the acquired movement tracks in the X-axis and Y-axis directions. Further, as shown in fig. 9, the CPU91 calculates the rotational offset Δθ of the work camera 70 from the angular difference between the X-axis direction of the captured image and the X-axis direction of the movement track. The CPU91 then obtains the position obtained by shifting the designed center position of the work camera 70 by ΔX and ΔY as the work camera position Xwwc, Ywwc, and obtains the angle obtained by shifting it by the rotational offset Δθ in the rotation direction as the work camera position θwwc. Having determined the work camera position Xwwc, Ywwc, θwwc, the CPU91 registers the measured work camera position Xwwc, Ywwc, θwwc in the storage device 94 (step S140) and ends the present process.
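The offset computation above can be sketched as follows. Each scan trajectory is reduced to the straight line through its first and last image points (a simplifying assumption; the embodiment does not specify the fitting method), the intersection of the two lines gives ΔX, ΔY relative to the image center, and the tilt of the X-scan gives Δθ:

```python
import math

def camera_offsets(x_scan, y_scan):
    """x_scan / y_scan: lists of (u, v) image points recorded while the marker
    moved along the world X and Y axes. Returns (dx, dy, dtheta): the offset of
    the trajectory intersection from the image centre (taken as the origin)
    and the rotation of the X-scan relative to the image X axis."""
    (x1, y1), (x2, y2) = x_scan[0], x_scan[-1]
    (x3, y3), (x4, y4) = y_scan[0], y_scan[-1]
    den = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)  # zero if lines parallel
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    dx = (a * (x3 - x4) - (x1 - x2) * b) / den   # intersection, x component
    dy = (a * (y3 - y4) - (y1 - y2) * b) / den   # intersection, y component
    dtheta = math.atan2(y2 - y1, x2 - x1)        # tilt of the X-scan
    return dx, dy, dtheta
```

The designed camera center shifted by (dx, dy) and rotated by dtheta then plays the role of Xwwc, Ywwc, θwwc.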
In step S30, the hole gauge 80 for hand-held camera position measurement is attached to the opening 71a provided in the upper portion of the housing box 71. Fig. 10 is an explanatory view showing how the hole gauge is mounted. As shown in the figure, the hole gauge 80 is fixed by a fixing member 82 in a state of being fitted in the opening 71a of the housing box 71. The hole gauge 80 has a center hole 81a, which is located directly above the workpiece camera 70 and penetrates vertically, and three-point holes 81b, which are located around the center hole 81a at intervals in the circumferential direction and each penetrate vertically. The number of the three-point holes 81b is not limited to three, and may be two or four or more.
In step S40, the robot system 1 is caused to execute a hole position measurement process for hand-held camera position measurement, in which the position of the center hole 81a of the hole gauge 80 in the world coordinate system Σw (the center hole position Xwwv, Ywwv, Zwwv) is measured using the work camera 70. Fig. 11 is a flowchart showing an example of this hole position measurement processing executed by the CPU91 of the control device 90. In the hole position measurement processing, the CPU91 first waits until the hand-held camera position measurement button is pressed by the operator (step S200). When determining that the button has been pressed, the CPU91 photographs the hole gauge 80 with the work camera 70 and measures the distance am between the three-point holes 81b of the hole gauge 80 based on the obtained captured image (step S210). Fig. 12 is an explanatory view showing how the three-point holes of the hole gauge are photographed by the work camera. The distance am of the three-point holes 81b is measured as the distance between the three-point holes 81b appearing in the captured image. Then, the CPU91 calculates the center hole position Zwwv for hand-held camera position measurement by the following formula (1) based on the measured distance am of the three-point holes 81b (step S220), and registers the calculated center hole position Zwwv in the storage device 94 (step S230). In formula (1), a0 represents the designed distance of the three-point holes 81b, WD represents the working distance of the work camera 70, and Z represents the photographing height of the work camera 70.
[Formula 1]
Next, the CPU91 measures the center hole position Xwwv, Ywwv of the hole gauge 80 in the world coordinate system Σw based on the captured image obtained in step S210 (step S240). Then, the CPU91 registers the measured center hole position Xwwv, Ywwv in the storage device 94 (step S250) and ends the present process.
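Formula (1) itself is not reproduced in this text, but the variables named above suggest a similar-triangles (pinhole) relationship: holes of known design pitch appear larger or smaller depending on their height above the camera. The following sketch is therefore an assumed model, not the patent's formula:

```python
def hole_plane_height(a_m, a_0, wd):
    """Assumed pinhole model: if holes with design pitch a_0 appear with
    measured pitch a_m (expressed in world units at the camera's reference
    plane at working distance wd), the hole plane lies at height
    z = wd * a_0 / a_m above the lens. Formula (1) in the source may differ."""
    return wd * a_0 / a_m
```

For example, holes that appear at twice their design pitch would lie at half the working distance under this model.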
In step S50, the robot system 1 is caused to execute a hand-held camera angle measurement process for measuring the angle of the hand-held camera 60 (the hand-held camera position θci). Fig. 13 is a flowchart showing an example of the hand-held camera angle measurement processing executed by the CPU91 of the control device 90. In the hand-held camera angle measurement processing, the CPU91 first photographs the hole gauge 80 for hand-held camera position measurement with the work camera 70 to measure the phase θwwv of the three-point holes 81b of the hole gauge 80 in the world coordinate system Σw (step S300), and registers the measured three-point hole phase θwwv in the storage device 94 (step S310). The three-point hole phase θwwv is measured by obtaining the angle, with respect to the X axis, of the triangle formed by connecting the three-point holes 81b appearing in the captured image. Next, as shown in fig. 14, the CPU91 moves the control point of the hand-held camera 60 to the work camera position Xwwc, Ywwc registered in step S20 and the center hole position Zwwv registered in step S40 (step S320). Next, the CPU91 photographs the hole gauge 80 with the hand-held camera 60 and, in the same manner as step S300, measures the phase θwhv of the three-point holes 81b of the hole gauge 80 in the world coordinate system Σw based on the captured image (step S330). Then, the CPU91 calculates the hand-held camera position θci based on the three-point hole phase θwwv measured using the work camera 70 in steps S300 and S310 and the three-point hole phase θwhv measured using the hand-held camera 60 in step S330 (step S340), registers the calculated hand-held camera position θci in the storage device 94 (step S350), and ends the present process. Fig. 15 is an explanatory diagram showing the three-point hole phases θ measured by the work camera and the hand-held camera, respectively. As shown, the hand-held camera position θci is calculated from the difference θe (= θwwv − θwhv) between the three-point hole phase θwwv measured from below by the work camera 70 and the three-point hole phase θwhv measured from above by the hand-held camera 60, by offsetting the designed hand-held camera position (a unique value) by the difference θe in the rotation direction.
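The phase comparison can be sketched as follows. The phase of the three-hole triangle is taken here as the angle of the vector from the first hole to the second with respect to the X axis (a simplifying assumption; the embodiment only says the triangle's angle with respect to the X axis is used), and θe is the difference between the two cameras' measurements:

```python
import math

def triangle_phase(holes):
    """Phase of the three-point-hole triangle: angle of the vector from the
    first listed hole to the second, measured from the X axis."""
    (x1, y1), (x2, y2) = holes[0], holes[1]
    return math.atan2(y2 - y1, x2 - x1)

def camera_phase_offset(holes_work, holes_hand):
    """theta_e = theta_wwv - theta_whv: the rotation by which the designed
    hand-held camera angle is offset (sign convention is an assumption)."""
    return triangle_phase(holes_work) - triangle_phase(holes_hand)
```

Adding θe to the designed hand-held camera angle then yields θci.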
In step S60, the robot system 1 is caused to execute a hand-held camera position measurement process for measuring the position of the hand-held camera 60 (the hand-held camera position Xci, Yci). Fig. 16 is a flowchart showing an example of the hand-held camera position measurement processing executed by the control device. In the hand-held camera position measurement processing, the CPU91 first waits until the hand-held camera position measurement button is pressed by the operator (step S400). When determining that the hand-held camera position measurement button has been pressed, the CPU91 moves the control point of the hand-held camera 60 to the work camera position Xwwc, Ywwc registered in step S20 and the center hole position Zwwv registered in step S40 (step S410). Next, the CPU91 photographs the hole gauge 80 with the hand-held camera 60 and determines the center hole position Xwhv, Ywhv from the obtained captured image (step S420). Next, as shown in fig. 18, the CPU91 calculates the distance L between the image center of the captured image and the measured center hole position Xwhv, Ywhv (step S430). When, with the image center as the origin, the center hole position Xwhv is denoted a and the center hole position Ywhv is denoted b, the distance L can be calculated by the following equation (2). Then, the CPU91 determines whether the calculated distance L is equal to or less than a predetermined allowable value Lref (step S440). When determining that the distance L is not equal to or less than the allowable value Lref, the CPU91 returns to step S410 and repeats the process until the distance L becomes equal to or less than the allowable value Lref, proceeding to step S450 when it does.
Further, the process may proceed to step S450 regardless of whether or not the distance L is equal to or smaller than the allowable value Lref.
[Formula 2]
L = √(a² + b²) … (2)
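The repeat-until-centered loop of steps S410 to S440 can be sketched as follows. The callbacks `measure_hole` and `move_to` are hypothetical stand-ins for the camera measurement and robot motion, and the corrective nudge by the measured offset is an assumption (the embodiment only states that the move and measurement are repeated):

```python
import math

def centre_camera(measure_hole, move_to, target, l_ref=0.05, max_iter=10):
    """Repeat: move the hand-held camera to the target, measure the centre-hole
    position (a, b) relative to the image centre, and stop once
    L = sqrt(a**2 + b**2) <= l_ref. measure_hole() returns (a, b);
    move_to(x, y) repositions the camera. Returns True on convergence."""
    x, y = target
    for _ in range(max_iter):
        move_to(x, y)
        a, b = measure_hole()
        if math.hypot(a, b) <= l_ref:   # equation (2)
            return True
        x, y = x + a, y + b             # nudge toward the hole (sign assumed)
    return False
```

A simulated camera with a fixed offset converges in one correction step under this model.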
When determining that the distance L is equal to or less than the allowable value Lref, the CPU91 moves the control point of the marked end effector EE (marker pin) to the work camera position Xwwc, Ywwc registered in step S20 and the center hole position Zwwv registered in step S40 (step S450). Next, the CPU91 rotates the marked end effector EE to angles of 0°, 90°, 180°, and 270°, and photographs the mark M with the work camera 70 at each angle to measure the marker pin positions (X0, Y0), (X90, Y90), (X180, Y180), (X270, Y270) in the world coordinate system Σw (step S460). Next, the CPU91 calculates the average Xmc, Ymc of the marker pin positions (X0, Y0), (X90, Y90), (X180, Y180), (X270, Y270) by the following formula (3). Then, the CPU91 calculates, by the following equation (4), the difference between the center hole position Xwwv, Ywwv measured by the work camera 70 in step S40 and the marker pin position (average) Xmc, Ymc, thereby calculating the positional offset Xme, Yme of the end effector from the center hole position Xwwv, Ywwv for hand-held camera position measurement (step S190). Fig. 19 shows the relationship among the marker pin position (average) Xmc, Ymc, the center hole position Xwwv, Ywwv for hand-held camera position measurement, and the positional offset Xme, Yme.
[Formula 3]
(Xmc, Ymc) = ((X0 + X90 + X180 + X270)/4, (Y0 + Y90 + Y180 + Y270)/4) … (3)
(Xme, Yme) = (Xmc − Xwwv, Ymc − Ywwv) … (4)
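Equations (3) and (4) can be sketched directly. Averaging the four rotated measurements cancels the marker pin's own eccentricity about the shaft axis, leaving the fingertip's true center:

```python
def marker_centre(positions):
    """Equation (3): average the marker positions measured at shaft angles of
    0, 90, 180, and 270 degrees to cancel the pin's eccentricity."""
    xs = [p[0] for p in positions]
    ys = [p[1] for p in positions]
    return sum(xs) / len(xs), sum(ys) / len(ys)

def fingertip_offset(marker_avg, hole_centre):
    """Equation (4): offset of the fingertip from the measured hole centre."""
    return (marker_avg[0] - hole_centre[0], marker_avg[1] - hole_centre[1])
```

An eccentric pin traced around a true center of (2, 3), compared against a hole center of (1.5, 2.5), yields an offset of (0.5, 0.5).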
Then, the CPU91 converts the positional offset Xme, Yme from the world coordinate system Σw to the camera coordinate system Σc, and calculates a new hand-held camera position Xci, Yci obtained by offsetting the designed hand-held camera position Xmo, Ymo by the converted positional offset Xrme, Yrme (step S470). To convert the positional offset Xme, Yme from the world coordinate system Σw to the camera coordinate system Σc, it suffices to rotate the positional offset Xme, Yme by the phase difference θrm between the world coordinate system Σw and the camera coordinate system Σc. When the phase of the origin of the base of the robot 10 in the world coordinate system Σw is denoted θrw, and the rotation angles of the first joint axis J1 and the second joint axis J2 at the time the center hole 81a of the hole gauge 80 is imaged by the hand-held camera 60 are denoted θJ1 and θJ2, the phase difference θrm is calculated by the following equation (5). The converted positional offset Xrme, Yrme can therefore be calculated with the rotation matrix of the following equations (6) and (7). The new hand-held camera position Xmm, Ymm can be calculated from the hand-held camera position Xmo, Ymo, which is a unique value, by the following equation (8), and becomes the hand-held camera position Xci, Yci. After calculating the new hand-held camera position Xci, Yci in this way, the CPU91 registers the calculated hand-held camera position Xci, Yci in the storage device 94 (step S480).
[Formula 4]
θrm = θrw + θJ1 + θJ2 … (5)
Xrme = Xme cos(−θrm) − Yme sin(−θrm) … (6)
Yrme = Xme sin(−θrm) + Yme cos(−θrm) … (7)
(Xmm, Ymm) = (Xmo − Xrme, Ymo − Yrme) … (8)
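Equations (5) to (8) can be sketched as one function. The offset measured in the world frame is rotated by −θrm into the camera frame and subtracted from the designed camera position (the numeric values in the usage note are illustrative assumptions):

```python
import math

def corrected_camera_position(x_mo, y_mo, x_me, y_me, th_rw, th_j1, th_j2):
    """Equations (5)-(8): rotate the world-frame offset (x_me, y_me) by
    -theta_rm into the camera frame, then subtract it from the designed
    hand-held camera position (x_mo, y_mo)."""
    th_rm = th_rw + th_j1 + th_j2                               # (5)
    x_rme = x_me * math.cos(-th_rm) - y_me * math.sin(-th_rm)   # (6)
    y_rme = x_me * math.sin(-th_rm) + y_me * math.cos(-th_rm)   # (7)
    return x_mo - x_rme, y_mo - y_rme                           # (8)
```

With θrm = 0 the frames coincide and the offset is subtracted directly; with θrm = 90° the offset's components swap roles, as the rotation matrix prescribes.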
Then, the CPU91 calculates the hand-held camera position change amount E by the following expression (9) (step S490), and determines whether the change amount E is equal to or less than a predetermined allowable value Eref (step S500). When determining that the change amount E is not equal to or less than the allowable value Eref, the CPU91 returns to step S400; when determining that it is, the CPU91 ends the present process. The storage device 94 thereby stores the exact hand-held camera position Xci, Yci, θci. Accordingly, by controlling the position of the hand-held camera 60 using the hand-held camera position Xci, Yci, θci stored in the storage device 94, the CPU91 can accurately grasp the position of the workpiece W (object) imaged by the hand-held camera 60, so that work on the workpiece W can be performed more accurately.
[Formula 5]
In step S70, the marked end effector EE and the hand-held camera position measuring hole gauge 80 are removed. Thereby, the hand-held camera position determination process is completed.
Here, the correspondence between the main elements of the embodiment and the main elements of the present disclosure described in the claims will be described. That is, in the present embodiment, the robot 10 corresponds to a robot, the hand-held camera 60 corresponds to a hand-held camera, the work camera 70 corresponds to an external camera, the hole gauge 80 corresponds to a gauge member, and the center hole 81a corresponds to a first measurement hole. The CPU91 of the control device 90 that executes steps S240 and S250 of the hole position measurement processing for hand-held camera position measurement corresponds to a first position acquisition unit, the CPU91 of the control device 90 that executes steps S410 and S420 of the hand-held camera position measurement processing corresponds to a second position acquisition unit, and the CPU91 of the control device 90 that executes steps S450 to S480 of the hand-held camera position measurement processing corresponds to a positional deviation measuring unit. Further, the CPU91 of the control device 90 that executes the work camera position measurement processing corresponds to an external camera position acquisition unit, the CPU91 of the control device 90 that executes steps S300 and S310 of the hand-held camera angle measurement processing corresponds to a first phase acquisition unit, the CPU91 of the control device 90 that executes steps S320 and S330 of the hand-held camera angle measurement processing corresponds to a second phase acquisition unit, and the CPU91 of the control device 90 that executes step S340 of the hand-held camera angle measurement processing corresponds to a phase deviation measuring unit.
The CPU91 of the control device 90 that executes the processing of steps S210 and S220 of the hand-held camera position measurement hole position measurement processing corresponds to the height acquisition unit.
The present disclosure is not limited to the above embodiment, and may be implemented in various forms without departing from the technical scope of the present disclosure.
For example, in the above embodiment, the robot 10 is configured as a horizontal articulated robot (SCARA robot), but the present disclosure is not limited thereto, and the robot may have any other configuration, such as a vertical articulated robot.
As described above, the camera positional deviation measuring device of the present disclosure measures the positional deviation of a hand-held camera in a robot system including: a robot having a robot arm and the hand-held camera attached to the robot arm; and an external camera provided outside the robot. The camera positional deviation measuring device comprises: a gauge member provided above the external camera and having a first measurement hole penetrating vertically; a first position acquisition unit that photographs the gauge member with the external camera and acquires the position of the first measurement hole from the obtained photographed image of the first measurement hole; a second position acquisition unit that moves the hand-held camera to the position acquired by the first position acquisition unit, photographs the gauge member with the hand-held camera, and acquires the position of the first measurement hole from the obtained photographed image of the first measurement hole; and a positional deviation measurement unit that moves the fingertip of the robot to the position of the first measurement hole acquired by the second position acquisition unit, photographs the fingertip of the robot with the external camera, and measures the positional deviation of the hand-held camera from the obtained photographed image.
The camera positional deviation measuring device of the present disclosure thus includes a gauge member that is provided above the external camera and has a first measurement hole penetrating vertically. The measuring device acquires the position of the first measurement hole by photographing the gauge member with the external camera, moves the hand-held camera to the acquired position, and again acquires the position of the first measurement hole by photographing the gauge member with the hand-held camera. It then moves the fingertip of the robot to the position of the first measurement hole acquired through the hand-held camera, photographs the fingertip with the external camera, and measures the positional deviation of the hand-held camera. This enables the positional deviation of the camera attached to the robot arm to be measured appropriately.
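For illustration only, the flow of the positional deviation measurement described above can be sketched in code. The function names, the millimeter units, and the simple scale-based image-to-machine conversion are assumptions of this sketch, not part of the disclosure:

```python
import math

def hole_position_in_machine(aimed_position_mm, hole_pixel, mm_per_pixel):
    """Convert a hole's pixel offset in a captured image into a machine
    position, given the position the camera was aimed at (assumed model)."""
    cx, cy = aimed_position_mm
    px, py = hole_pixel
    return (cx + px * mm_per_pixel, cy + py * mm_per_pixel)

def positional_deviation(pos_from_external, pos_from_handheld):
    """Deviation of the hand-held camera: the offset between the first
    measurement hole position obtained through the external camera and
    the one obtained through the hand-held camera after moving there.
    Returns (dx, dy, magnitude)."""
    dx = pos_from_handheld[0] - pos_from_external[0]
    dy = pos_from_handheld[1] - pos_from_external[1]
    return dx, dy, math.hypot(dx, dy)
```

For example, if the external camera locates the center hole at (100.0, 50.0) and the hand-held camera, after being moved to that position, locates it at (100.12, 49.95), the deviation is (0.12, -0.05), a magnitude of 0.13 mm.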
The camera positional deviation measuring device of the present disclosure may further include: a gauge member provided above the external camera and having a plurality of second measurement holes that penetrate vertically and are distinct from the first measurement hole; an external camera position acquisition unit that moves the fingertip of the robot to a position above the external camera, photographs the fingertip with the external camera, and acquires the position of the external camera from the obtained photographed image of the fingertip; a first phase acquisition unit that photographs the gauge member with the external camera and acquires the phase of the second measurement holes from the obtained photographed image of the second measurement holes; a second phase acquisition unit that moves the hand-held camera to the position of the external camera acquired by the external camera position acquisition unit, photographs the gauge member with the hand-held camera, and acquires the phase of the second measurement holes from the obtained photographed image of the second measurement holes; and a phase shift measurement unit that measures the phase shift of the hand-held camera based on the phases of the second measurement holes acquired by the first phase acquisition unit and the second phase acquisition unit. In this way, in addition to the positional deviation of the hand-held camera, the phase shift can be appropriately measured using the gauge member and the external camera. The second measurement holes may be formed around the first measurement hole of the gauge member at intervals in the circumferential direction.
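For illustration only, the phase of a second measurement hole can be understood as its angle about the center (first measurement) hole, and the phase shift as the wrapped difference between the angles obtained through the two cameras. The function names below are assumptions of this sketch, not part of the disclosure:

```python
import math

def hole_phase(center, hole):
    """Phase of a second measurement hole: its angle about the first
    (center) measurement hole, in degrees."""
    return math.degrees(math.atan2(hole[1] - center[1], hole[0] - center[0]))

def phase_shift(phase_external, phase_handheld):
    """Phase shift of the hand-held camera relative to the external
    camera, wrapped into the range [-180, 180) degrees."""
    d = phase_handheld - phase_external
    return (d + 180.0) % 360.0 - 180.0
```

Wrapping the difference avoids a spurious near-360-degree result when the measured angles straddle the +/-180-degree boundary.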
The camera positional deviation measuring device of the present disclosure may further include: a gauge member provided above the external camera and having a plurality of second measurement holes that penetrate vertically and are distinct from the first measurement hole; and a height acquisition unit that photographs the gauge member with the external camera and acquires the height of the first measurement hole based on the photographed image of the second measurement holes, wherein the second position acquisition unit moves the hand-held camera to the position acquired by the first position acquisition unit and the height acquired by the height acquisition unit. Thus, even if the installation height of the gauge member includes an error, the positional deviation of the hand-held camera can be measured well.
Alternatively, the camera positional deviation measuring device of the present disclosure may include: a gauge member provided above the external camera and having a plurality of second measurement holes that penetrate vertically and are distinct from the first measurement hole; and a height acquisition unit that photographs the gauge member with the external camera and acquires the height of the first measurement hole based on the photographed image of the second measurement holes, wherein the positional deviation measurement unit moves the fingertip of the robot to the position acquired by the second position acquisition unit and the height acquired by the height acquisition unit. Thus, even if the installation height of the gauge member includes an error, the positional deviation of the hand-held camera can be measured well.
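For illustration only, one plausible way to realize the height acquisition is a pinhole camera model: the apparent pixel spacing of the second measurement holes shrinks in proportion to the distance between the external camera and the gauge member, so the height can be recovered from the known physical hole spacing. The disclosure does not specify the optical model; the function and parameter names below are assumptions of this sketch:

```python
def gauge_height(focal_length_px, real_spacing_mm, imaged_spacing_px):
    """Distance from the external camera to the gauge member (mm),
    assuming a pinhole model: imaged_px = focal_px * real_mm / height_mm.
    A larger imaged spacing means the gauge member sits closer to the
    camera than its nominal installation height."""
    return focal_length_px * real_spacing_mm / imaged_spacing_px
```

For example, with an assumed focal length of 1000 px and holes physically 20 mm apart that appear 200 px apart in the image, the gauge member is 100 mm from the camera.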
The present disclosure has been described above in the form of a camera positional deviation measuring device, but it may also take the form of a camera positional deviation measuring method.
Industrial Applicability
The present disclosure can be used in the manufacturing industry of robot systems, and the like.
Description of the Reference Numerals
1 robot system, 2 work table, 3a main lever, 3b sub lever, 10 robot, 11 base, 20 robot arm, 21 first arm, 22 second arm, 23 shaft, 24 workpiece holding section, 30 first arm driving section, 31 motor, 32 motor, 33a, 33b motors, 34 encoder, 40 second arm driving section, 42 motor, 44 encoder, 50 shaft driving section, 52a, 52b motors, 54a, 54b encoders, 60 hand-held camera, 70 work camera, 71 housing box, 71a opening, 80 hole gauge, 81a center hole, 81b three-point hole, 82 fixture, 90 control device, 91 CPU, 92 ROM, 93 RAM, 94, J1 first joint axis, J2 second joint axis, J3 third joint axis, EE end effector, M mark, S substrate, W workpiece