TWI891549B - A calibration method for robotic arm tool center point without a tool apex - Google Patents
- Publication number
- TWI891549B (application TW113140452A)
- Authority
- TW
- Taiwan
- Prior art keywords
- tool
- center
- sensor
- robot arm
- sphere
- Prior art date
Landscapes
- Manipulator (AREA)
Abstract
Description
A tool calibration method for a robotic arm, and in particular a calibration method for a robotic-arm tool that has no tool tip point.
Traditional manufacturing emphasized economies of scale: only with sufficiently large production volumes could costs be reduced and profits increased. With changing consumption patterns, however, consumers have gradually shifted from "buying what manufacturers sell" to "buying what I need and want," so a trend toward small-volume, high-variety production has spread across industries. The abundance of online information has also heightened consumers' appetite for new products, progressively shortening product life cycles. In the automotive industry, for example, the sharply reduced life cycle of new vehicle models reflects manufacturing's shift from traditional single-line mass production toward a model that emphasizes variety and small batches, a shift that forces manufacturers to look for new strategies.
Meanwhile, facing an aging population and declining birth rates, labor-intensive manufacturing has encountered labor shortages and rising costs. As experienced master craftsmen retire and the labor shortage persists, passing on experience and skills has also become increasingly difficult. These changes in production have driven advances in robotic-arm technology, which has expanded from simple tasks such as loading and unloading to more complex machining stages.
To deploy a robotic arm in machining applications with precision requirements, the arm itself must be calibrated, and the relative positions of the objects in the machining system must also be established, such as the exact coordinates of the tool center point (TCP) and of the work object coordinate system; these coordinates can usually be obtained only after calibration. Traditional tool-center calibration mostly relies on visually guided jogging of the robot so that the tool center reaches the same point in space in four different tool postures, but such manual calibration is time-consuming and subject to human error.
Most existing automatic calibration methods rely on identifying a physical tool tip point. If the tool has no tip point, it must be replaced by a fixture with a sharp point (a pointed rod) whose tool center point coincides with that of the actual tool in order to satisfy the constraints of the calibration method. After calibration, however, the actual tool must be reinstalled, and the disassembly and reassembly introduce additional mounting errors that make the precision requirements hard to meet. For tools without a tip point, only a few calibration methods exist, and they have their own limitations: they replace the tool with a sensor, measure the three-dimensional coordinates of specified feature points with a coordinate measuring machine, and then estimate the relationship between the sensing point and the tool center point geometrically. Although this approach does not require a pointed rod, the sensor still has to be removed and the actual tool remounted after calibration, which again introduces assembly errors.
To solve the calibration problem for robotic-arm tools without a tool tip point, the present invention provides a method for calibrating the tool center of such a tool, comprising the following steps. Step 1: mount a sphere of known radius on a flange surface of a robotic arm, and use a vision sensor to acquire information about the sphere and calibrate the coordinates of the sphere center relative to the flange surface of the robotic arm. Step 2: use a contour sensor and the vision sensor to acquire information about the sphere simultaneously, and combine the information from both sensors to calibrate the relative position between the robotic arm and the contour sensor. Step 3: finally, mount the tool required for the actual application on the flange surface, use the vision sensor and the contour sensor to acquire information about the tool, and generate corresponding control commands from that information to move the tool to specified positions, thereby calibrating the tool axis direction and the tool center point and completing the calibration of the tool without a tool tip point.
In step 1, image information from the vision sensor at two different viewing angles is used to control the motion of the robotic arm so that the sphere center finally reaches the same position in space in four different postures, from which the coordinates of the sphere center relative to the flange surface are obtained.
In the sphere-center position calibration, a rotary disk is used to change the viewing angle of the vision sensor; the robotic arm is controlled from two different viewing angles so that the sphere center reaches the same position in space in different postures, and the coordinates of the sphere center are finally obtained from the kinematic equations.
In step 2, motion control is performed from the information acquired by the contour sensor, so the relative relationship between the contour-sensor coordinate frame and the reference coordinate frame of the robotic arm is obtained in order to convert commands into commands expressed in the robotic arm's reference frame.
Before the relative position between the contour sensor and the robotic arm is calibrated, the coordinates of the center of the sphere mounted on the flange surface are obtained relative to the flange coordinate frame.
After the position of the sphere center relative to the flange coordinate frame has been calibrated and its coordinates obtained, the contour sensor and the vision sensor acquire information about the sphere simultaneously to calibrate the positional relationship between the robotic arm and the contour sensor.
Calibrating the contour sensor against the robotic arm's reference frame comprises controlling the arm to move the sphere to positions where the contour sensor can capture its profile, while simultaneously obtaining the coordinates of the sphere center relative to the robotic arm's reference frame and relative to the contour-sensor frame; the calibration is completed after several different positions have been acquired.
After the transformation between the contour sensor and the robotic arm's reference frame has been determined, the tool coordinate frame is calibrated.
From the description above, the present invention provides the following benefits and advantages:
1. The calibration method for tools without a tool tip point mainly calibrates the tool-axis orientation and the tool center point position. The proposed method performs the calibration with a 2D profile sensor and a 2D vision sensor, both mounted on a rotary disk. Before calibration, the accurate relative relationship between the sensors and the robotic arm must first be obtained: a sphere of known radius is mounted on the robot flange, the sensors acquire information about the sphere, and the sensor-to-robot relationship is calibrated. The tool required for the actual machining application (such as a grinding wheel) can then be mounted on the robot, and the sensors are used to obtain the tool-axis orientation and tool center point coordinates of the tipless tool.
2. The present invention proposes an innovative method for calibrating the tool center of a robotic-arm tool without a tool tip point, overcoming the limitation of existing techniques that depend on identifying a physical tip. By integrating a 2D vision sensor with a contour sensor, a technique is developed that automatically calibrates the tool center of a tipless tool. The proposed method overcomes the challenges faced by traditional methods, such as the need for a substitute pointed rod when no tool tip can be identified.
The present invention is described in detail below through several preferred embodiments. The attached figures are merely exemplary representations or embodiments; persons of ordinary skill in the art can, without inventive effort, apply the invention to other similar situations based on these figures.
The terms "system," "device," "unit," and/or "module" used below are a way of distinguishing components, elements, parts, or assemblies at different levels; they may be replaced by other expressions that serve the same purpose. Unless the context clearly indicates otherwise, the words "a," "an," and/or "the" do not specifically denote the singular and may also include the plural. In general, the terms "include" and "comprise" indicate only that the explicitly identified steps and elements are included; these do not constitute an exclusive list, and a method or device may also contain other steps or elements.
Flowcharts are used in this disclosure to illustrate the operations performed by systems according to embodiments of the invention. It should be understood that the preceding or following operations are not necessarily executed exactly in order; the steps may instead be processed in reverse order or simultaneously. At the same time, other operations may be added to these processes, or one or more operations may be removed from them.
Please refer to Figures 1, 2 and 3, which are a flowchart and schematic diagrams of the method of the present invention for calibrating the tool center of a robotic-arm tool without a tool tip point. The steps comprise:
Step S1: mount a sphere 10 of known radius on a flange surface 21 of a robotic arm 20, and use a vision sensor 30 to acquire information about the sphere 10 and thereby calibrate the coordinates of a center 11 of the sphere 10 relative to the flange surface 21 of the robotic arm 20;
Step S2: use a contour sensor 40 and the vision sensor 30 to acquire information about the sphere 10 simultaneously, and combine the information from the contour sensor 40 and the vision sensor 30 to calibrate the relative position between the robotic arm 20 and the contour sensor 40; and
Step S3: finally, mount a tool T required for the actual application on the flange surface 21 of the robotic arm 20, use the vision sensor 30 and the contour sensor 40 to acquire information about the tool T, and generate corresponding control commands from that information to move the tool T to specified positions so as to calibrate the axis direction of the tool T and the position of its tool center point, completing the calibration of the tool T without a tool tip point.
In step S2, motion control is performed from the contour-sensing information, so the relative relationship between the coordinate frame of the contour sensor 40 and the reference coordinate frame of the robotic arm 20 must be obtained in order to convert commands into commands expressed in the robot 20 reference frame. The method provided by the present invention for obtaining this transformation replaces the multiple sets of distance sensors used in the prior art with a single vision sensor 30, reducing the errors caused by difficult hardware assembly.
As shown in Figures 2, 3A, 3B and 3C, before the relative relationship between the contour sensor 40 and the robotic arm 20 is calibrated, the method first obtains the coordinates, relative to the coordinate frame of the flange surface 21, of the center 11 (S0) of the sphere 10 mounted on the flange surface 21, denoted S0F (not shown in the figures). The calibration flow is as follows: (1) first obtain the orientation transformation between the coordinate frame of the 2D vision sensor 30 and the reference coordinate frame of the robotic arm 20, so that motion commands expressed relative to the vision sensor 30 can be converted into commands relative to the robot 20 reference frame; the sensor frame in question is the reference frame of the vision sensor 30 when a rotary disk 50 is at a given angle; (2) control the robotic arm 20 so that the sphere 10 reaches the same position in space in four different postures, and record the joint-coordinate sets of the robotic arm 20 at that common position to obtain the coordinates of the center 11 of the sphere 10 relative to the flange surface 21.
For the position calibration of the center 11 of the sphere 10, the present invention uses the rotary disk 50 to change the viewing angle of the vision sensor 30 and controls the robotic arm 20 from two different viewing angles so that the center 11 of the sphere 10 reaches the same position in space in different postures; the coordinates of the center 11 are then obtained from the kinematic equations. The transformation between the reference frame of the robotic arm 20 and the reference frame of the vision sensor 30 can be obtained by moving the sphere 10 along the axis directions of the robot reference frame, acquiring the corresponding displacement vectors relative to the image-sensor frame, and finally applying the condition that these space vectors are mutually perpendicular. The steps for obtaining the transformation between the robot 20 frame and the vision sensor 30 frame are as follows:
Mount the vision sensor 30 on the rotary disk 50 to form a sensing module, then install the sensing module within the motion range of the robotic arm 20 so that the sphere 10 can move within the field of view of the vision sensor 30, as shown in Figure 3A.
Rotate the rotary disk 50 to an arbitrary angle and record that angle as θ_r1. Move the sphere 10 to an arbitrary position within the field of view of the vision sensor 30 as a starting point, move it an arbitrary length along the first axis of the robot reference frame, and record the displacement projected onto the vision sensor 30 as the image vector u_x; the corresponding space vector is denoted v_x.
Move the sphere 10 to an arbitrary position within the field of view as a starting point, move it an arbitrary length along the second axis, and record the projected image vector u_y; the corresponding space vector is v_y.
Move the sphere 10 to an arbitrary position within the field of view as a starting point, move it an arbitrary length along the third axis, and record the projected image vector u_z; the corresponding space vector is v_z.
Using the property that v_x, v_y and v_z are mutually perpendicular in space, and writing each space vector as its measured image components plus an unknown out-of-plane component, v_x = (u_x, w_x), v_y = (u_y, w_y), v_z = (u_z, w_z), Equations 1 to 3 are obtained, as shown in Figure 3B, where I in Figure 3B indicates that the vision sensor 30 is at the first position.
v_x · v_y = u_x · u_y + w_x w_y = 0 … (Equation 1)
v_y · v_z = u_y · u_z + w_y w_z = 0 … (Equation 2)
v_x · v_z = u_x · u_z + w_x w_z = 0 … (Equation 3)
Solving Equations 1 to 3 yields:
w_x = ±√( −(u_x · u_y)(u_x · u_z) / (u_y · u_z) ) … (Equation 4)
w_y = −(u_x · u_y) / w_x … (Equation 5)
w_z = −(u_x · u_z) / w_x … (Equation 6)
As Equations 4 to 6 show, the two sets of solutions differ only by a sign. The vision sensor can therefore be used to resolve the ambiguity: while the sphere 10 is moved along each axis, the change in the radius of its image indicates whether the center 11 moves toward or away from the vision sensor, and the correct solution is selected accordingly.
The transformation between the reference frame of the robotic arm 20 and the vision-sensor frame is given by Equation 7.
d_R = R_CR · d_C … (Equation 7)
where d_C is a direction expressed in the vision-sensor frame, d_R is the same direction expressed in the reference frame of the robotic arm 20, and R_CR is the rotation matrix that converts a vector from the vision-sensor frame into the robot 20 reference frame.
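As an illustration only, and not part of the claimed method, the orientation step above can be sketched numerically as follows. The sketch assumes an approximately orthographic projection with uniform scale, a camera z axis pointing from the sensor into the scene, and that the image-plane displacements u_x, u_y and u_z are already measured; all function and variable names are illustrative rather than taken from the patent.

```python
import numpy as np

def camera_to_robot_rotation(ux, uy, uz, moving_away_along_x):
    """ux, uy, uz: 2D image displacements of the sphere centre observed while the
    sphere is moved along the robot base x, y and z axes. The unknown out-of-plane
    components are recovered from the pairwise perpendicularity of the three space
    vectors (Equations 1-6); the remaining sign ambiguity is resolved by whether the
    sphere's image radius shrank during the x move (sphere moving away from the camera)."""
    A, B, C = ux @ uy, uy @ uz, ux @ uz      # in-plane parts of the three dot products
    wx = np.sqrt(-A * C / B)                 # Equation 4, positive sign choice
    wy, wz = -A / wx, -C / wx                # Equations 5 and 6
    if not moving_away_along_x:              # positive wx means "away" under the assumed convention
        wx, wy, wz = -wx, -wy, -wz
    vx = np.append(ux, wx)
    vy = np.append(uy, wy)
    vz = np.append(uz, wz)
    # Rows of R are the robot base axes expressed in the camera frame, so R maps a
    # camera-frame vector into the robot reference frame, as in Equation 7.
    return np.vstack([v / np.linalg.norm(v) for v in (vx, vy, vz)])
```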
Rotate the rotary disk 50 to any other angle:
If the rotation direction of the rotary disk 50 relative to the robot reference frame is known, the coordinate transformation between the vision-sensor frame at disk angle θ_ri and the reference frame of the robotic arm 20 is given by Equation 8; the relationship between θ_ri and the rotary disk is shown in Figure 3C, where I in Figure 3C indicates that the vision sensor 30 is at the first position and II indicates that it is at the second position:
… (Equation 8)
where the matrix in Equation 8 is the rotation matrix that converts a direction vector from the vision-sensor frame at disk angle θ_ri into the reference frame of the robotic arm 20.
If the rotation direction of the rotary disk 50 relative to the robot reference frame is unknown, the coordinate transformation between the sensor frame at the second disk angle and the robot 20 reference frame is obtained by repeating steps 2 to 6. The rotation direction of the rotary disk 50 can then be obtained from the relationship of Equation 8:
… (Equation 9)
… (Equation 10)
… (Equation 11)
Once this is obtained, Equation 8 gives the rotation matrix that converts vectors from the vision frame into the reference frame of the robotic arm 20 for any position of the rotary disk 50.
Please refer to Figure 3A. In detail, in step S1 it is preferable to control the motion of the robotic arm 20 using image information from the vision sensor 30 at two different viewing angles so that the center 11 of the sphere 10 finally reaches the same position in space in four different postures, from which the coordinates of the center 11 of the sphere 10 relative to the flange surface 21 are obtained. The four postures should cover as many directions of the tool working range as possible to ensure the accuracy of the calibration result. The flow is as follows:
Move the robotic arm 20 so that the center 11 of the sphere 10 lies within the field of view and as far from the image boundary as possible, to prevent the calibration process from drifting outside the field of view, where no image would be available to generate the corresponding motion commands.
Record the image coordinates of the center 11 of the sphere 10 in the vision sensor 30 at the rotary-disk 50 angles θ_r1 and θ_r2, and take these as the target image coordinates O_s1 and O_s2.
Generate an azimuth increment with a random-number generator and control the robotic arm 20 to change the posture of the sphere 10 to the new azimuth. If the new azimuth exceeds the motion-range limits, if the center 11 of the sphere 10 leaves the field of view, or if the posture is too close to an already recorded tool posture to cover all directions of the tool working range, regenerate the azimuth increment. Rotate the rotary disk 50 to θ_r1, use the vision sensor to obtain the current image coordinates of the center 11 of the sphere 10, compute the movement direction S_c1 toward O_s1, convert S_c1 into the robot reference frame with Equation 7, and move the robotic arm 20 along that direction until the image coordinates of the center 11 reach O_s1, as shown in Figure 4A.
Rotate the rotary disk 50 to θ_r2, use the vision sensor to obtain the current image coordinates O_E2 of the center 11 of the sphere 10, compute the movement direction S_c2 toward O_s2, convert S_c2 into the robot reference frame with Equation 8, and move the robotic arm 20 along that direction until the image coordinates of the center 11 reach O_s2, as shown in Figure 4A.
Rotate the rotary disk 50 back to θ_r1 and use the vision sensor 30 to check the position of the center 11 of the sphere 10. If it has drifted away from the target coordinates, return to step 4 and control the robotic arm 20 so that the center 11 moves between the A and B axes and finally converges to their intersection, so that at disk angles θ_r1 and θ_r2 the image coordinates of the center 11 in the vision sensor 30 satisfy O_s1 and O_s2 respectively (at that moment the center 11 lies at the common target point relative to the robot 20 reference frame), as shown in Figure 4B, where I in Figure 4B indicates that the vision sensor 30 is at the first position and II at the second position, A is the axis passing through O_s1 and perpendicular to the vision sensor 30, and B is the axis passing through O_s2 and perpendicular to the vision sensor 30.
If fewer than four postures have been recorded, return to step 3 above and continue collecting joint-coordinate sets for different postures whose center 11 converges to the common point, so as to obtain sufficient calibration information. If four postures have been recorded, the kinematic equations can be used to obtain the coordinates of the center 11 of the sphere 10 relative to the flange-surface 21 coordinate frame.
T_j · S0F = S0R, j = 1, …, 4 … (Equation 12)
where T_j is the coordinate-transformation matrix obtained by substituting the j-th joint-coordinate set and the remaining D-H parameters; this matrix converts coordinates from the flange-surface 21 frame into the reference frame of the robotic arm 20. S0F denotes the coordinates of the center 11 of the sphere 10 relative to the flange-surface 21 frame, and S0R denotes its coordinates relative to the robot 20 reference frame when the rotary-disk 50 angles are θ_r1 and θ_r2 and the image coordinates of the center 11 in the vision sensor 30 are O_s1 and O_s2 respectively. Substituting the four joint-coordinate sets into Equation 12 and rearranging gives Equations 13 and 14.
… (Equation 13)
… (Equation 14)
where the solution uses the pseudo-inverse of the stacked coefficient matrix, which yields the coordinates S0F of the center 11 of the sphere 10 relative to the flange surface 21, as well as its coordinates S0R relative to the robot 20 reference frame when the image coordinates of the center 11 in the vision sensor 30 at the two viewing angles are O_s1 and O_s2.
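A minimal numpy sketch of the pseudo-inverse step described above, assuming the four (or more) recorded flange poses are available as 4 x 4 homogeneous transforms from the flange frame to the robot base frame; the names are illustrative and not taken from the patent.

```python
import numpy as np

def sphere_center_from_poses(flange_poses):
    """Least-squares estimate of the sphere-centre offset in the flange frame from
    N >= 4 flange poses that all place the sphere centre at the same unknown point."""
    Rs = [T[:3, :3] for T in flange_poses]
    ts = [T[:3, 3] for T in flange_poses]
    # (R_j - R_0) s = t_0 - t_j for every pose j: stack the rows and solve A s = b
    A = np.vstack([R - Rs[0] for R in Rs[1:]])
    b = np.hstack([ts[0] - t for t in ts[1:]])
    s_flange, *_ = np.linalg.lstsq(A, b, rcond=None)        # pseudo-inverse solution
    # Sphere centre in the robot reference frame, averaged over all poses
    s_base = np.mean([R @ s_flange + t for R, t in zip(Rs, ts)], axis=0)
    return s_flange, s_base
```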
After the position of the center 11 of the sphere 10 relative to the flange-surface 21 frame has been calibrated and the coordinates S0F obtained, the contour sensor 40 and the vision sensor 30 can acquire information about the sphere 10 simultaneously to calibrate the positional relationship between the robotic arm 20 and the contour sensor 40. To calibrate the contour sensor 40 against the robot 20 reference frame, the robotic arm 20 is first controlled to move the sphere 10 to a position where the contour sensor can capture its profile, and the coordinates of the center 11 of the sphere 10 are obtained simultaneously relative to the robot 20 reference frame and relative to the contour-sensor 40 frame. The calibration is completed after several different positions have been acquired, where S_k^R and S_k^P denote the coordinates of the center 11 at the k-th position relative to the robot 20 reference frame and to the contour-sensor 40 frame respectively. The flow is as follows:
Set the position index k to its initial value and control the robotic arm 20 to move the sphere 10 to a position where the contour sensor and the vision sensor 30 can both read information about the sphere 10 at the same time.
Record the coordinates S_k^R of the center 11 of the sphere 10 relative to the reference frame of the robotic arm 20.
Use the contour sensor 40 to capture the profile of the sphere 10 and obtain the set of contour points expressed in the contour-sensor 40 frame; fit a circle equation to these points by minimum-error fitting with random sample consensus (RANSAC) to obtain the center coordinates and the radius r_k of the cross-section circle, as shown in Figure 5A.
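The RANSAC circle fit mentioned above can be sketched as follows; this is an assumption-laden illustration (sample count, inlier tolerance and the absence of a final least-squares refinement are arbitrary choices), not the patent's implementation.

```python
import numpy as np

def fit_circle_ransac(points, n_iters=200, inlier_tol=0.05, seed=None):
    """RANSAC circle fit to the 2D contour points of one profile-sensor scan.
    Returns (centre, radius) of the circle with the largest consensus set."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_centre, best_radius, best_count = None, None, -1
    for _ in range(n_iters):
        p1, p2, p3 = pts[rng.choice(len(pts), 3, replace=False)]
        # Circumcircle of the three samples from the linearised circle equation
        A = 2.0 * np.array([p2 - p1, p3 - p1])
        b = np.array([p2 @ p2 - p1 @ p1, p3 @ p3 - p1 @ p1])
        try:
            centre = np.linalg.solve(A, b)
        except np.linalg.LinAlgError:
            continue                                   # collinear sample, draw again
        radius = np.linalg.norm(p1 - centre)
        count = np.sum(np.abs(np.linalg.norm(pts - centre, axis=1) - radius) < inlier_tol)
        if count > best_count:
            best_centre, best_radius, best_count = centre, radius, count
    return best_centre, best_radius
```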
The distance between the center 11 and a contour-sensing plane P is then obtained from the Pythagorean theorem, as in Equation 15:
h = ±√( R_s² − r_k² ) … (Equation 15)
where R_s is the radius of the sphere 10, r_k is the radius of the fitted cross-section circle, and h is the height of the center 11 of the sphere 10 relative to the sensing plane. Equation 15 gives two solutions, corresponding to the center 11 lying above or below the contour-sensing plane P, so the sphere 10 information obtained by the vision sensor 30 is used to select the correct one. As shown in Figure 5B, the image of the sphere 10 from the vision sensor 30 is processed to locate the circle center, which is compared with the laser line that the contour-sensing plane P projects onto the sphere 10: if the circle center lies above the laser line, h is positive; otherwise h is negative.
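The height computation of Equation 15, including the sign resolution from the camera image, reduces to a few lines; the snippet below is a sketch under the assumption that the above/below decision has already been made from the image.

```python
import numpy as np

def sphere_center_height(sphere_radius, section_radius, center_above_laser_line):
    """Equation 15: height of the sphere centre relative to the profile-sensing plane.
    The sign is taken from the camera image, i.e. whether the sphere's image centre
    lies above or below the laser line projected on the sphere."""
    h = np.sqrt(sphere_radius**2 - section_radius**2)
    return h if center_above_laser_line else -h
```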
Record these coordinates, relative to the contour-sensor frame, as S_k^P.
Increment the position index k and, depending on its value, collect the next position as follows:
a. In the first case, move the robotic arm 20 to the specified point, translate it an arbitrary length along the single specified direction, and return to step 2.
b. In the second case, move the robotic arm 20 to the specified point, translate it an arbitrary length in any direction that lies in the specified plane of the robot 20 coordinate frame and satisfies the additional constraint, and return to step 2.
c. In the third case, move the robotic arm 20 to the specified point, translate it an arbitrary length in any direction, and return to step 2.
If all the required positions have been collected, compute the unit vector between the recorded sphere-center positions, together with two vectors perpendicular to it.
… (Equation 16)
… (Equation 17)
… (Equation 18)
Express the recorded positions in terms of these directions and obtain the relationship between the origin of the contour-sensor coordinate frame and the origin of the robot reference frame, as shown in Figure 5C.
… (Equation 19)
This yields the transformation matrix that converts coordinates from the contour-sensor 40 frame into the reference frame of the robotic arm 20.
… (Equation 20)
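Once the sensor origin and axis directions are known in the robot reference frame, the transformation of Equation 20 can be assembled as a homogeneous matrix; the sketch below assumes the three axis vectors are already mutually orthogonal and is not taken from the patent.

```python
import numpy as np

def profile_to_robot_transform(x_axis, y_axis, z_axis, origin):
    """4 x 4 homogeneous transform mapping points in the profile-sensor frame into the
    robot reference frame, given the sensor axes and origin expressed in the robot frame."""
    T = np.eye(4)
    T[:3, 0] = x_axis / np.linalg.norm(x_axis)
    T[:3, 1] = y_axis / np.linalg.norm(y_axis)
    T[:3, 2] = z_axis / np.linalg.norm(z_axis)
    T[:3, 3] = origin
    return T

# Usage: p_robot = (profile_to_robot_transform(x, y, z, o) @ np.append(p_sensor, 1.0))[:3]
```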
In step S3, once the transformation between the contour sensor 40 and the reference frame of the robotic arm 20 has been determined, calibration of the tool T coordinate frame can begin. The method proposed here describes how to calibrate a tool T whose shape is a cylinder with a flat axial end, so as to obtain the tool-axis direction when the tool T is mounted on the robotic arm 20 and the coordinates of the tool T center point (the intersection of the tool axis with the end plane).
In the proposed calibration method, the tool T is first moved into the scanning range of the contour sensor 40 and the captured profile and its fitted equation are analysed with RANSAC. If the fitted profile is a straight line, the profile lies on face A (the bottom face) of the tool T or the sensing plane is parallel to the tool axis; in that case the external vision sensor 30 (the 2D vision sensor 30 with the same viewing angle as in Figure 2) is used to determine the current relationship between the sensing plane of the contour sensor 40 and the tool T. The faces of the tool T are defined in Figure 6: (1) if the sensing plane lies on face A of the tool T (marked A in Figure 6), the robot is rotated about the appropriate axes until the fitted profile becomes an ellipse or a circle; (2) if the sensing plane is parallel to the tool axis, the robotic arm 20 rotates the tool T about the corresponding axis until the fitted profile becomes an ellipse or a circle, meaning that the profile scan now lies on face B (marked B in Figure 6).
Once the captured profile lies on face B, the tool T axis direction can be calibrated by controlling the motion of the robotic arm 20 until the tool T axis is aligned with the normal direction of the sensing plane of the contour sensor 40. The flow is as follows:
Fit a minimum-error ellipse to the sensing points obtained by the contour sensor 40 with RANSAC to obtain the fitted equation, as shown in Figure 7A and Equation 21, where M in Figure 7A denotes the major axis of the ellipse and m its minor axis.
a x² + b x y + c y² + d x + e y + f = 0 … (Equation 21)
where a through f are the coefficients of the ellipse equation. From these coefficients the major-axis length, the minor-axis length, the angle between the major axis and the contour-sensor 40 coordinate axis, and the ellipse center coordinates can be obtained, as shown in Equations 22 to 26.
… (Equation 22)
… (Equation 23)
… (Equation 24)
… (Equation 25)
… (Equation 26)
where, under the stated conditions on the coefficients, the angle denotes the angle between the ellipse major axis and the corresponding coordinate axis of the contour-sensor 40 frame.
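Equations 22 to 26 convert the conic coefficients of Equation 21 into the ellipse center, axis lengths and major-axis angle. A compact eigen-decomposition route that yields the same quantities is sketched below; it assumes the fitted coefficients really describe an ellipse and is not claimed to reproduce the patent's closed-form expressions.

```python
import numpy as np

def conic_to_ellipse(a, b, c, d, e, f):
    """Ellipse centre, semi-axis lengths and major-axis angle from the coefficients of
    a x^2 + b x y + c y^2 + d x + e y + f = 0 (Equation 21)."""
    M = np.array([[a, b / 2.0], [b / 2.0, c]])
    center = np.linalg.solve(-2.0 * M, np.array([d, e]))   # stationary point of the conic
    f_c = f - center @ M @ center                          # constant term after recentring
    eigval, eigvec = np.linalg.eigh(M)                     # principal axes of the quadratic form
    semi_axes = np.sqrt(-f_c / eigval)                     # u^T M u = -f_c gives the axis lengths
    major, minor = semi_axes.max(), semi_axes.min()
    major_dir = eigvec[:, int(np.argmax(semi_axes))]
    angle = np.arctan2(major_dir[1], major_dir[0])         # angle of the major axis
    return center, major, minor, angle
```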
Convert the ellipse center coordinates expressed in the profile-scanner frame and the minor-axis direction into the corresponding position and direction relative to the reference frame of the robotic arm 20, as in Equations 27 and 28, where the required transformation is obtained from Equation 20.
… (Equation 27)
… (Equation 28)
Control the robotic arm 20 so that the ellipse center point remains fixed while the tool rotates about the ellipse minor-axis direction, as shown in Figure 7B. On the contour-sensing plane P, if the major-axis length after rotation is larger than before, or if the sensed profile changes from an ellipse to a straight line, reverse the rotation direction; continue until the major and minor axes of the ellipse are equal in length, which means that the tool T axis is parallel to the normal of the sensing plane of the contour sensor 40. The direction of the tool T axis relative to the flange-surface 21 frame of the robotic arm 20 can then be obtained, as shown in Equations 29 and 30:
… (Equation 29)
… (Equation 30)
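The alignment loop described above (rotate about the sensed minor axis, reverse when the major axis grows, stop when the two axes agree) can be illustrated with a small simulation in which the tilted cylinder's cross-section is modelled analytically instead of being measured; this is purely an illustration of the control logic, not the patent's controller.

```python
import numpy as np

def align_tool_axis(radius, tilt_deg, step_deg=1.0, tol=1e-6, max_iter=1000):
    """Simulated alignment: a cylinder tilted by tilt_deg from the sensing-plane normal
    produces an elliptical cross-section with minor semi-axis = radius and major
    semi-axis = radius / cos(tilt). Rotate about the minor axis, reversing and halving
    the step whenever the major axis grows, until the two axes agree."""
    major_of = lambda t: radius / np.cos(np.radians(t))
    tilt, step = float(tilt_deg), float(step_deg)
    for _ in range(max_iter):
        if abs(major_of(tilt) - radius) < tol:
            break                                    # axes equal: tool axis parallel to the normal
        if major_of(tilt - step) > major_of(tilt):   # this rotation would make the ellipse longer
            step = -step / 2.0                       # reverse the rotation and refine the step
        else:
            tilt -= step                             # accept the rotation
    return tilt                                      # residual tilt, ideally close to zero
```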
Since the tool T axis direction is parallel to the corresponding direction of the contour sensor 40, the tool T axis direction can also be re-expressed in the contour-sensor 40 coordinate frame, with the related directions likewise expressed in that frame.
Once the tool axis direction has been obtained, only the remaining two axes of the tool T coordinate frame need to be defined to fix its orientation. Because the tool T is cylindrical, these two axes only need to be perpendicular to the tool axis.
* If the tool axis is not nearly parallel to the chosen reference axis, the remaining two axes of the tool T frame can be defined as in Equations 31 and 32:
… (Equation 31)
… (Equation 32)
* If the tool axis is nearly parallel to the chosen reference axis, the remaining two axes of the tool T frame are instead defined as in Equations 33 and 34:
… (Equation 33)
… (Equation 34)
After the tool T axis direction has been calibrated, calibration of the tool T center point position can begin. First, control the robotic arm 20 so that one of the tool-frame axes is parallel to the corresponding sensor direction while the tool axis makes an arbitrary angle with it within the stated range; then, using the information from the vision sensor 30, translate the tool T into the sensing range of the contour sensor 40 so that the sensed profile lies on face B, in order to obtain profile information of the tool T.
Use the vision sensor 30 to determine the relationship between the contour-sensing plane P and the tool T. If the contour-sensing plane P passes through face A or face C of the tool T (marked C in Figure 6), the intersection curve of the contour-sensing plane P with the tool T profile is not a complete elliptical curve, corresponding to the system state shown on the left of Figure 7C.
If the contour-sensing plane P passes through face A or face C of the tool T, translate the robotic arm 20 along the appropriate direction so that the intersection curve of the contour-sensing plane P with the tool T profile lies on face B of the tool T, as in Figure 7D. If translation in either direction still produces an intersection curve containing the face A or face C profile of the tool T, change the tilt angle so that the intersection curve lies on face B of the tool T.
After the tool T has been moved to a position and angle at which a complete elliptical profile can be obtained, control the robotic arm 20 to translate the tool T by a small step along the specified direction, then keep the intersection of the tool T axis with the sensing plane fixed while rotating the tool T about the contour sensor 40 direction to obtain a complete profile cross-section. Repeat this action until a straight-line segment appears in the profile, as in the profile state shown on the right of Figure 7C and in Figure 7E.
Rotate the rotary disk 50 through 360 degrees so that the contour sensor 40 can acquire the complete intersection curve between the sensing plane and the tool T profile. Then apply RANSAC to the face B portion of the tool T profile (excluding the straight-line segment, which belongs to face A) to obtain the ellipse equation, and determine the relationship between the straight line and the ellipse.
If the sensed elliptical profile is smaller than a half ellipse (left of Figure 7F), translate along the specified direction to enlarge the sensed elliptical region and return to step 9; if the sensed elliptical profile is larger than a half ellipse (right of Figure 7F), translate in the opposite direction to reduce the sensed elliptical region and return to step 9; if the sensed elliptical profile equals a half ellipse, go to step 11.
When the sensed elliptical profile equals a half ellipse, the ellipse center coordinates coincide with the intersection of the tool T axis and face A of the tool T, so the coordinates of the tool T center point relative to the robot flange surface 21 are given by Equation 35, which completes the calibration of the tool T axis and the tool T center.
… (Equation 35)
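Equation 35 amounts to expressing the detected ellipse center in the flange frame by chaining the transforms already obtained; a sketch under the assumption that the center is given as a 3D point in the profile-sensor frame (out-of-plane component zero) is shown below, with illustrative names.

```python
import numpy as np

def tcp_in_flange_frame(T_flange_to_base, T_profile_to_base, ellipse_center_p):
    """Tool centre point in the flange frame: map the ellipse centre from the
    profile-sensor frame into the robot base frame, then into the flange frame."""
    p = np.append(ellipse_center_p, 1.0)                 # homogeneous point in the sensor frame
    p_base = T_profile_to_base @ p                       # sensor frame -> robot base frame
    p_flange = np.linalg.inv(T_flange_to_base) @ p_base  # robot base frame -> flange frame
    return p_flange[:3]
```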
Furthermore, unless expressly stated otherwise, the components, elements, sequence order, use of numbers or letters, or other names described in this disclosure are not intended to limit the order of its processes and methods. Although the discussion above presents, through various examples, some embodiments of the invention currently considered useful, it should be understood that such details serve an illustrative purpose only, and the invention is not limited to the disclosed embodiments but also covers equivalent variations and modifications.
Some embodiments use numbers to describe quantities of components and attributes. It should be understood that such numbers used in describing embodiments are, in some examples, qualified by the modifiers "about," "approximately," or "substantially." Unless stated otherwise, "about," "approximately," or "substantially" indicates that the stated number may vary by ±20%. Accordingly, in some embodiments the numerical parameters used in the description and claims are approximations that may change according to the characteristics required by individual embodiments. In some embodiments, numerical parameters should take the specified significant digits into account and apply ordinary rounding. Although the numerical ranges and parameters used to establish the breadth of some embodiments are approximations, in specific embodiments such values are set as precisely as practicable.
Finally, it should be understood that the embodiments described herein serve only to illustrate the principles of the embodiments of the invention. Other variations may also fall within the scope of the invention. Accordingly, by way of example and not limitation, alternative configurations of the embodiments may be regarded as consistent with the teachings of the invention, and the embodiments of the invention are not limited to those explicitly introduced and described herein.
10 sphere; 11 sphere center; 20 robotic arm; 21 flange surface; 30 vision sensor; 40 contour sensor; 50 rotary disk; I, II vision-sensor positions; M, m ellipse major and minor axes; P contour-sensing plane; T tool; S1-S3 steps
To illustrate the technical solutions of the embodiments more clearly, the figures required for the description of the embodiments are briefly introduced below. The figures described below are merely some examples or embodiments of the invention and do not limit its technical scope. Unless otherwise evident from the context or stated otherwise, the same reference numerals in the figures denote the same structures or operations. In the figures:
Figure 1 is a flowchart of the steps of the calibration method for a tool without a tool tip point according to the present invention.
Figure 2 is a schematic diagram of the apparatus used in the calibration method.
Figure 3A is a system architecture diagram of the calibration method.
Figure 3B shows the projected movement vectors used in the calibration method.
Figure 3C is a schematic diagram of the relationship between the rotation angle θ_ri and the rotary disk.
Figure 4A is a schematic diagram of the movement directions relative to the vision sensor in the calibration method.
Figure 4B shows the sphere center being controlled to move and converge to the vision-sensor coordinates O_s1 and O_s2.
Figure 5A is a schematic diagram of the sphere information captured by the contour sensor and the fit to the sphere profile.
Figure 5B is a schematic diagram of the relationship between the sphere captured by the vision sensor and the laser scan line of the contour sensor.
Figure 5C is a schematic diagram of the relationship between the origin of the contour-sensor coordinate frame and the origin of the robot reference frame.
Figure 6 shows the tool definition of the tool-coordinate-frame calibration system.
Figure 7A shows the profile obtained by the contour sensor (circle points) and the fitting result (solid points).
Figure 7B shows the robotic arm being controlled so that the sensed ellipse center remains fixed while rotating about the ellipse minor axis.
Figure 7C is a schematic diagram of the system state with an incomplete elliptical profile.
Figure 7D is a schematic diagram of the profile state with an incomplete elliptical profile.
Figure 7E is a schematic diagram of the system state in which a complete elliptical profile is obtained.
Figure 7F is a schematic diagram of the system state in which the robotic arm is translated along the +z_p direction until a straight-line profile appears.
S1-S3: steps
Claims (9)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| TW113140452A TWI891549B (en) | 2024-10-24 | 2024-10-24 | A calibration method for robotic arm tool center point without a tool apex |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| TWI891549B true TWI891549B (en) | 2025-07-21 |
Family
ID=97228377
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| TW113140452A TWI891549B (en) | 2024-10-24 | 2024-10-24 | A calibration method for robotic arm tool center point without a tool apex |
Country Status (1)
| Country | Link |
|---|---|
| TW (1) | TWI891549B (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115507720A (en) * | 2018-02-26 | 2022-12-23 | 瑞尼斯豪公司 | Coordinate positioning machine |
| CN115533893A (en) * | 2022-08-25 | 2022-12-30 | 浙江工业大学 | A TCP Calibration Method for Robot Using Floatable Standard Sphere |
| TW202411802A (en) * | 2022-09-12 | 2024-03-16 | 創博股份有限公司 | Visual calibrating method for virtual tcp of robotic arm |
| CN117754592A (en) * | 2024-01-05 | 2024-03-26 | 湖南视比特机器人有限公司 | A vision-based robotic arm calibration method, device, equipment and medium |
- 2024-10-24: Application TW113140452A filed in Taiwan; granted as patent TWI891549B (status: active)
Similar Documents
| Publication | Title |
|---|---|
| CN112105484B (en) | Robot kinematics parameter self-calibration method, system and storage device | |
| CN107421442B (en) | An Online Compensation Method for Robot Positioning Error Aided by External Measurement | |
| CN109048876B (en) | Robot calibration method based on laser tracker | |
| CN108748159B (en) | Self-calibration method for tool coordinate system of mechanical arm | |
| KR101200961B1 (en) | Parallel kinematic machine, calibration method of parallel kinematic machine, and calibration program product | |
| CN110193829A (en) | A kind of robot precision's control method of coupled motions and stiffness parameters identification | |
| CN114782513B (en) | Point laser sensor mounting pose calibration method based on plane | |
| EP1040393A1 (en) | Method for calibration of a robot inspection system | |
| CN105318838B (en) | A single-plane calibration method for the relationship between the laser range finder and the end of the manipulator | |
| CN114029982B (en) | A hand-eye calibration device and calibration method for a camera outside a robotic arm | |
| TWI762371B (en) | Automated calibration system and method for the relation between a profile scanner coordinate frame and a robot arm coordinate frame | |
| CN105444672A (en) | Orthogonal plane calibrating method and orthogonal plane calibrating system of relation between laser distance measuring device and end of mechanical arm | |
| CN107053216A (en) | The automatic calibration method and system of robot and end effector | |
| JP2015089575A (en) | Robot, control device, robot system and control method | |
| CN112361958A (en) | Line laser and mechanical arm calibration method | |
| CN113319855B (en) | Gravity compensation method under compliance control mode of multi-joint diagnosis and treatment robot | |
| CN114076581A (en) | Rotary table compensation | |
| TWI891549B (en) | A calibration method for robotic arm tool center point without a tool apex | |
| JPH052168B2 (en) | ||
| Huang et al. | A novel calibration method for robotic arm tool center point without a physical tool apex | |
| JP2000055664A (en) | Articulated robot system with function of measuring attitude, method and system for certifying measuring precision of gyro by use of turntable for calibration reference, and device and method for calibrating turntable formed of n-axes | |
| CN113043264B (en) | Zero calibration method for integrated joint seven-axis robot | |
| CN110954022B (en) | A circular object rotation scanning structure and calibration method | |
| CN113091670A (en) | Calibration device and calibration method for robot joint stiffness | |
| WO2025022496A1 (en) | Control device, machine tool system, and machining method |