
TWI879332B - Method for calculating actual distance and angle of target object of tracking device - Google Patents


Info

Publication number
TWI879332B
TWI879332B
Authority
TW
Taiwan
Prior art keywords
coordinate
image
target object
positioning
angle
Prior art date
Application number
TW112149699A
Other languages
Chinese (zh)
Other versions
TW202526266A (en)
Inventor
林邦傑
龔皇光
林聖傑
李順良
Original Assignee
正修學校財團法人正修科技大學
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 正修學校財團法人正修科技大學 filed Critical 正修學校財團法人正修科技大學
Priority to TW112149699A priority Critical patent/TWI879332B/en
Application granted granted Critical
Publication of TWI879332B publication Critical patent/TWI879332B/en
Publication of TW202526266A publication Critical patent/TW202526266A/en

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention discloses a method by which a following device calculates the actual distance and angle of a target object. The method comprises an image capture step, an image segmentation step, a coordinate conversion step, a coordinate positioning step, a coordinate correspondence step, and a coordinate system transformation step. The device first captures a tracking target image containing the target object; the image then undergoes segmentation, coordinate conversion, coordinate positioning, coordinate correspondence, and coordinate system transformation. Through these steps, images captured by an imaging device can be converted between coordinate systems without complex computation, yielding the relative position coordinates between the actual target and the device, from which the actual distance and angle are calculated. This allows the device to automatically follow a person or a movable target object, for example to assist in carrying heavy loads.

Description

Method for calculating the actual distance and angle of a target object by a following device

The present invention relates to a calculation method for a following device, and more particularly to a method by which a following device calculates the actual distance and angle of a target object.

In some automation fields, such as automatic following systems, automatic traffic-signal management systems, and automatic watering systems, it is often necessary to compute, automatically and in real time, the actual distance and angle between the device carrying the camera and a target object. An automatic following system needs this to drive a vehicle that follows the target; an automatic traffic-signal management system needs it so a signal can judge the distance of cars on the road and switch to the appropriate phase safely; and an automatic watering system needs it to determine the distance and angle of plants within range so they can be watered.

Many techniques exist for calculating the actual distance and angle of a target object. For example, Taiwan Patent No. I407081 discloses a distance-measuring device that removes the portions of the light-sensing signal produced by an image sensor that correspond to background light and flicker light, thereby reducing their influence. From the imaging position of light reflected by a calibration object at a known distance, it computes a correction parameter that compensates for the device's assembly-error angle, and then uses that parameter to calculate the distance under measurement correctly. However, this prior art infers distance by emitting detection light toward the object and measuring the energy of the reflected light it receives; it can therefore obtain only the relative distance, not the relative angle.

Another known method, Taiwan Patent No. I777791, discloses a method for verifying the dynamic virtual-image display distance of a human-machine interface. Its steps are: establish a test base-map database; display a first test base map from the database on a display element; project a first image under test onto a superimposing element via the human-machine-interface module, where the first image under test corresponds to a first virtual-image display distance under test, equal to the first reference virtual-image display distance of the first test base map; capture the first test base map and the first image under test with an image-capture module; evaluate the reliability of the two with a recognition module; and compute their overlap ratio with a processing module to verify the accuracy of the display distance under test. Because this prior art determines distance by capturing the base map and the image under test and computing their overlap ratio, it too can only calculate the distance of the object under test and cannot obtain the relative angle between the two.

Another known method, Taiwan Patent No. I746234, discloses a method for an unmanned helicopter to range and position a target on the sea surface. It integrates an altitude (barometric) sensing module, a GPS module, an image module, a signal-processing module, a flight-control module, and a transmission module on the drone. When a target is found, the altitude sensor gives the drone's height above sea level; the image module computes the horizontal distance to the target from the camera's viewing angle; and the rotation-angle difference between the GPS module and the camera lens gives the target's bearing. From the drone's own GPS coordinates, the horizontal distance, and the target's bearing, the target's actual GPS position is calculated, and hence the relative position and distance between target and drone. However, this prior art relies on the camera's viewing angle and GPS positioning, with coordinates sent back to the vehicle for relative-position computation; the required equipment is relatively complex, and the method cannot be used where the GPS signal is absent or poor.

In view of the above and other problems, the prior art needs improvement.

An objective of the present invention is to provide a method for a following device to calculate the actual distance and angle of a target object, improving on the shortcomings of the prior art.

Another objective of the present invention is to provide such a method in which image coordinate conversion applied to frames obtained from a camera yields the relative position coordinates between the actual target and the device body, from which the actual distance and angle are calculated.

Another objective of the present invention is to provide such a method that calculates the actual distance and angle accurately without complex computation, making it easy for the following device to move and keep following.

Another objective of the present invention is to provide such a method applicable to unmanned smart devices such as load-carrying robots, golf-ball pickers, golf-equipment carriers, or automatic mowers, allowing the device to follow a person or a movable target automatically through image recognition.

To achieve the above and other objectives, an embodiment of the present invention provides a method for a following device to calculate the actual distance and angle of a target object, comprising: an image capture step, in which a camera unit of a following device captures a tracking target image, wherein the following device has a device coordinate system and the tracking target image is a two-dimensional planar image containing a target object and has a first origin coordinate; an image segmentation step, which identifies the position of the target object in the tracking target image and segments the image to obtain a comparison image, with the target object's image located in the middle of the comparison image; a coordinate conversion step, which uses the image sizes of the tracking target image and the comparison image to transform the first origin coordinate linearly and obtain a second origin coordinate in the comparison image; a coordinate positioning step, which uses the coordinate value of the second origin coordinate to set a positioning coordinate in the comparison image; a coordinate correspondence step, which calculates the coordinate correspondence between the device coordinates and the positioning coordinates from the curve-fitting relationship between each axis distance of the device coordinates and the positioning coordinates; and a coordinate system transformation step, which calculates the actual distance and actual angle between the following device and the target object from that coordinate correspondence.

In one embodiment, the method further comprises a following step: from the correspondence between the device coordinates and the positioning coordinates, the positioning coordinate value corresponding to the device-coordinate origin is computed and converted into an actual distance and angle, which are supplied to an autonomous-driving function of the following device to correct its driving direction and distance so that the device automatically follows the target object.

In some embodiments, the device coordinates, the first origin coordinate, the second origin coordinate, and the positioning coordinates are rectangular (Cartesian) coordinate systems.

In one embodiment, the origin of the positioning coordinates set in the coordinate positioning step is located at the bottom-center of the comparison image.

In one embodiment, the target object is a movable object.

In one embodiment, the coordinate correspondence step includes measuring in advance the actual distances between a lens of the camera unit and several positioning coordinates at different positions.

Directional terms used throughout this disclosure, such as front, back, left, right, top, bottom, inside, outside, and side, refer mainly to the orientations in the drawings; they serve only to aid the description and understanding of the embodiments and do not limit the invention.

Articles such as "a" or "the" used with elements and components herein are merely for convenience of description and should be read as including one or at least one; a singular form also includes the plural unless a different meaning is evident.

To make the above and other objectives, effects, and features more apparent, preferred embodiments are described in detail below with reference to the accompanying drawings. Without departing from its spirit, the invention admits many implementations and is not limited to the details described below; the drawings are not necessarily to scale and are provided only to illustrate the embodiments.

FIG. 1 is a flowchart of an embodiment of the method by which the following device calculates the actual distance and angle of a target object. Referring to FIG. 1, the method of this embodiment comprises an image capture step S10, an image segmentation step S11, a coordinate conversion step S12, a coordinate positioning step S13, a coordinate correspondence step S14, and a coordinate system transformation step S15. In the image capture step S10, a camera unit 11 (for example a lens) mounted on the following device 10 captures in real time a tracking target image 20 of the target object the device is to follow, wherein the following device 10 defines a device coordinate system and the tracking target image 20 is a two-dimensional planar image containing the target object, with its upper-left corner as a first origin coordinate C1. The image segmentation step S11 identifies the position of the target object in the tracking target image 20 and segments the image to obtain a comparison image 30 containing the target object's image, with that image located in the middle of the comparison image 30. The coordinate conversion step S12 uses the linear relationship between the image sizes of the tracking target image 20 and the comparison image 30 to transform the first origin coordinate C1 and obtain a second origin coordinate C2 in the comparison image 30. The coordinate positioning step S13 uses the coordinate value of the second origin coordinate C2 to set a positioning coordinate C3 in the comparison image 30. The coordinate correspondence step S14 measures in advance the actual distances between the lens of the camera unit 11 and several positioning coordinates at different positions, and computes the correspondence between the device coordinates and the positioning coordinates from the curve-fitting relationship between the X- and Y-axis distances of the device coordinates and the positioning coordinates. The coordinate system transformation step S15 then calculates the actual distance and actual angle between the following device 10 and the target object from that correspondence.

FIG. 2 is a flowchart of an embodiment in which the following device uses the calculated actual distance and angle to follow automatically. Referring to FIG. 2, this embodiment applies the result of the coordinate system transformation step S15: after S15, a following step S16 uses an autonomous-driving function of the following device. From the correspondence between the device coordinates and the positioning coordinates, it computes the positioning coordinate value corresponding to the device-coordinate origin and converts that value from the rectangular coordinate system into an actual distance and angle in a polar coordinate system, which the autonomous-driving system uses to correct its driving direction and distance, keeping a preset distance while following the movable target object.
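The conversion at the end of the following step, from device-frame rectangular coordinates to a distance and angle, is a standard rectangular-to-polar transformation. A minimal sketch follows; the function name and the convention that 0 degrees means straight ahead along the device's forward axis are illustrative assumptions, since the patent's own polar symbols appear only as images:

```python
import math

def to_polar(x: float, y: float) -> tuple[float, float]:
    """Convert a device-frame position (x, y) in cm into a distance r (cm)
    and heading angle theta (degrees). Convention assumed here: +y is the
    device's forward direction, +x is to its right, theta = 0 means the
    target is straight ahead, and positive theta means it is to the right."""
    r = math.hypot(x, y)                    # straight-line distance
    theta = math.degrees(math.atan2(x, y))  # signed heading angle
    return r, theta
```

For example, a target 30 cm to the right and 40 cm ahead gives a distance of 50 cm at roughly 36.9 degrees to the right.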

In other words, in the tracking target image captured by the camera unit 11, the position of the target object is correlated with the target's actual angle and distance from the lens of the camera unit 11. Once this correlation is found, the actual angle and distance can be recovered from the target's position in the captured image. Because lens specifications and resolutions differ from vendor to vendor, a curve-fitting formula must be derived for the particular lens of the camera unit 11 by combining its captured images with actual distance measurements.

FIG. 3 is a schematic diagram of the relative positions of the device coordinates, the first origin coordinate, the second origin coordinate, and the positioning coordinates. As shown in FIG. 3, the pixel origin of the tracking target image 20, i.e. the first origin coordinate C1, is the upper-left corner of the image, and the coordinate system of the tracking target image 20 is referred to as the image-pixel large coordinate system.

In practice, however, the space of interest does not require the whole image; generally only the pixels near the middle of the following device's view are used, i.e. the smaller box in the lower half of FIG. 3, whose upper-left corner is the second origin coordinate C2. The coordinate system of this comparison image is referred to as the image-pixel small coordinate system.

The image-pixel large coordinate system and the image-pixel small coordinate system are related by a linear transformation: once the sizes of the large and small images are decided, the transformation between them is determined.
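A linear crop-and-scale mapping of this kind can be sketched as follows; the offset and scale parameters are illustrative assumptions, since the patent states only that the two systems are linearly related once the image sizes are fixed:

```python
def large_to_small(u: float, v: float,
                   offset_u: float, offset_v: float,
                   scale_u: float = 1.0, scale_v: float = 1.0) -> tuple[float, float]:
    """Map a pixel (u, v) in the full image ('large' system, origin C1 at
    the top-left corner) into the comparison image ('small' system, origin
    C2 at its top-left corner). offset_* locates C2 inside the large image;
    scale_* covers the case where the crop is also resized."""
    return (u - offset_u) * scale_u, (v - offset_v) * scale_v
```

With a crop whose top-left corner sits at (100, 240) in the large image, the large-image pixel (300, 400) maps to (200, 160) in the small image.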

Since the following device 10 is defined in the device coordinate system (x, y), it is preferable to convert the image-pixel small coordinate system further into the positioning coordinate system. Between the two, it suffices to choose the bottom-center of the image as the positioning-coordinate origin C3 to obtain the conversion relationship, as shown in FIG. 3.
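Relocating the origin to the bottom-center is a simple shift; a sketch follows, where the assumption that y increases away from the device is ours, as the patent fixes only the origin position:

```python
def small_to_positioning(us: float, vs: float,
                         width: int, height: int) -> tuple[float, float]:
    """Re-express a comparison-image pixel (us, vs), whose origin C2 is the
    top-left corner, in the positioning coordinate system whose origin C3
    is the bottom-center of the comparison image."""
    xp = us - width / 2.0   # shift the origin to the horizontal center
    yp = height - vs        # flip so y increases away from the device
    return xp, yp
```

In a 640x480 comparison image, for example, the bottom-center pixel (320, 480) becomes the positioning origin (0, 0).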

Next, by determining the relationship between the device coordinate system (x, y) applicable to the following device and the positioning coordinate system, the relationship between the device coordinate system (x, y) and the image-pixel large coordinate system can be determined.

FIG. 4 is a schematic diagram for verifying the image position of the target object against the actual distance, and FIG. 5 shows the curve-fitting relationship between the measured X distance of the device coordinate system and the positioning coordinate system in one embodiment. Referring to FIGS. 4 and 5, to find the relationship between the device coordinate system (x, y) and the image-pixel large coordinate system, an image position and distance verification chart such as FIG. 4 can be used: from the known relationship between the camera unit 11 and distances on the field F, the relationship between the device coordinate system (x, y) and the image coordinate system can be found.

Table 1 lists the actual measured values of the device-coordinate-system X distance against the image coordinate system.

Table 1
X (cm)   =0    =80    =160   =240   =320   =400
=360      0    30.2   60.5   90.7   121    151.2
=300      0    21.2   42.5   63.7   84.9   106.2
=240      0    15.6   31.3   46.9   62.6   78.2
=180      0    12.3   24.6   36.9   49.2   61.5
=120      0    10.2   20.4   30.6   40.9   51.1
=60       0    8.8    17.6   26.5   35.3   44
=0        0    7.75   15.5   23.5   31     38.75

From the relationships in Table 1, the X distance can be assumed to relate to the image coordinate system as in Formula 1.

Table 2 lists the measured distances when the value is 80 cm, and Table 3 the curve-fitting values at 80 cm.

Table 2
       Measured distance (cm)
360    30.2
300    21.2
240    15.6
180    12.3
120    10.2
60     8.8
0      7.75

Table 3
             Curve-fitting value
80    360    28.214
80    300    21.32
80    240    15.866
80    180    11.852
80    120    9.278
80    60     8.144
80    0      8.45

From Tables 2 and 3 and the curve-fitting relationship in FIG. 5 between the device-coordinate-system X distance and the image coordinate system, Formula 2 and Formula 3 can be found.
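Although the fitted formulas themselves appear only as images in this text, the fitted values in Table 3 are exactly consistent with a quadratic in the vertical image coordinate. The coefficients below are inferred here from Table 3 and are therefore an illustration, not the patent's own stated formula:

```python
def fitted_x_component(v: float) -> float:
    """Quadratic consistent with the Table 3 curve-fit values at the 80 cm
    column (coefficients inferred from the table, illustrative only)."""
    return 0.0002 * v**2 - 0.0171 * v + 8.45

# Reproduce every Table 3 value from the quadratic.
table3 = {360: 28.214, 300: 21.32, 240: 15.866, 180: 11.852,
          120: 9.278, 60: 8.144, 0: 8.45}
for v, expected in table3.items():
    assert abs(fitted_x_component(v) - expected) < 1e-9
```

Table 1 also shows the X distance scaling almost linearly across its columns (each column is very nearly an integer multiple of the 80-column), which suggests the full X relationship is this quadratic scaled linearly by the horizontal coordinate.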

Table 4 lists the measured values of the device-coordinate-system Y distance against the image coordinate system.

Table 4
(=95) (cm)   =0    =80   =160  =240  =320  =400
=0            0     0     0     0     0     0
=60          13    13    13    13    13    13
=120         30    30    30    30    30    30
=180         55    55    55    55    55    55
=240         95.5  95.5  95.5  95.5  95.5  95.5
=300         163   163   163   163   163   163

From the relationships in Table 4, the Y distance can be assumed to relate to the image coordinate system as in Formula 4.

Formula 4 indicates that the device-coordinate-system Y distance is nearly independent of the image coordinate varied across the columns of Table 4, so that coordinate can simply be discarded from consideration.

FIG. 6 shows the curve-fitting result between the device-coordinate-system Y distance and the image coordinate system. Table 5 lists the measured distances, and Table 6 the corresponding fitted-curve values.

Table 5
(cm)   Measured distance (cm)
0       0
60      13
120     30
180     55
240     95.5
300     163

Table 6
Curve-fitting value
3.55
8.38
26.17
56.92
100.63
157.3

From the data in Tables 4, 5, and 6, Formula 5 can be fitted.
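As with the X direction, the Table 6 values are exactly consistent with a quadratic in the vertical image coordinate. The coefficients below are inferred here from Table 6 and stand in for the image-only Formula 5 as an illustration:

```python
def fitted_y_distance(v: float) -> float:
    """Quadratic consistent with the Table 6 curve-fit values
    (coefficients inferred from the table, illustrative only)."""
    return 0.0018 * v**2 - 0.0275 * v + 3.55

# Reproduce every Table 6 value from the quadratic.
table6 = {0: 3.55, 60: 8.38, 120: 26.17, 180: 56.92, 240: 100.63, 300: 157.3}
for v, expected in table6.items():
    assert abs(fitted_y_distance(v) - expected) < 1e-9
```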

From Formulas 2, 3, and 5, the relationship between the device coordinate system (x, y) and the image coordinate system is obtained. That is, given the image coordinates of the target, the (x, y) coordinates in the device coordinate system can be determined. The polar coordinates needed to control the following device can then be computed directly from the rectangular coordinates, after which commands are issued to the following device 10 for the required driving direction and distance.
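Chaining the pieces together gives a hypothetical end-to-end sketch. It combines quadratics inferred here from the fitted values in Tables 3 and 6 with the linear column scaling visible in Table 1; none of these are the patent's literal formulas, which appear only as images:

```python
import math

def pixel_to_drive_command(u: float, v: float) -> tuple[float, float]:
    """From an image position of the target (u horizontal, v vertical),
    compute device-frame (x, y) in cm and convert to the polar distance
    and angle used to drive the following device. The quadratics and the
    u/80 scaling are illustrative reconstructions from Tables 1, 3 and 6."""
    x = (u / 80.0) * (0.0002 * v**2 - 0.0171 * v + 8.45)  # lateral offset (cm)
    y = 0.0018 * v**2 - 0.0275 * v + 3.55                 # forward distance (cm)
    r = math.hypot(x, y)                    # actual distance to the target
    theta = math.degrees(math.atan2(x, y))  # actual angle, 0 = straight ahead
    return r, theta
```

A target imaged on the vertical center line (u = 0) at v = 300, for example, yields an angle of 0 degrees at the Table 6 forward distance of about 157 cm.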

Some features of embodiments of this invention: in some applications, an automatically following device or carrier can be provided. For example, when a heavy object must be carried from site A to site B, the person in charge need not operate the carrier; once the following target (e.g. that person) is set and the object is placed on the carrier, the carrier locks onto the person's image and tracks the movement as the person walks from A to B. The same approach also suits the delivery and placement of office documents, or of documents and components in large factories.

The invention has been disclosed through the preferred embodiments above, which are not intended to limit it. Those of ordinary skill in the art will appreciate that the invention is not limited to the details of the illustrative embodiments; the embodiments merely explain, rather than restrict, the invention. The scope of the invention is defined by the claims, not by the description above; the literal meaning of the claims and their range of equivalents all fall within the scope of the patent.

(S10–S16): Steps; (10): following device; (11): camera unit; (20): tracking target image; (30): comparison image; (C1): first origin coordinate; (C2): second origin coordinate; (C3): positioning coordinate origin; (F): field

FIG. 1 is a flowchart of an embodiment of the method by which the following device calculates the actual distance and angle of a target object. FIG. 2 is a flowchart of an embodiment in which the following device uses the calculated distance and angle to follow automatically. FIG. 3 is a schematic diagram of the relative positions of the device coordinates, the first origin coordinate, the second origin coordinate, and the positioning coordinates. FIG. 4 is a schematic diagram verifying the image position of the target object against the actual distance. FIG. 5 shows the curve-fitting relationship between the measured X distance of the device coordinate system and the positioning coordinate system in one embodiment. FIG. 6 shows the curve-fitting relationship between the measured Y distance of the device coordinate system and the positioning coordinate system in one embodiment.

(S10~S15): Steps

Claims (5)

1. A method for calculating the actual distance and angle of a target object with a tracking device, comprising:
an image capturing step (S10), in which a camera unit (11) of a tracking device (10) captures a following-target image (20), wherein the tracking device (10) has device coordinates (x, y), the following-target image (20) is a two-dimensional planar image containing a target object, and the following-target image (20) has a first origin coordinate (C1);
an image segmentation step (S11), identifying the position of the target object in the following-target image (20) and segmenting the following-target image (20) to obtain a comparison image (30), wherein the image of the target object is located at the center of the comparison image (30);
a coordinate conversion step (S12), using the image sizes of the following-target image (20) and the comparison image (30) to linearly transform the first origin coordinate (C1) and obtain a second origin coordinate (C2) in the comparison image;
a coordinate positioning step (S13), using the coordinate values of the second origin coordinate (C2) to set a positioning coordinate (C3) in the comparison image (30);
a coordinate mapping step (S14), calculating the coordinate correspondence between the device coordinates (x, y) and the positioning coordinate (C3) through the curve-fitting relationship between each axis distance of the device coordinates (x, y) and the positioning coordinate (C3); and
a coordinate system transformation step (S15), calculating the actual distance and actual angle between the tracking device (10) and the target object according to the coordinate correspondence between the positioning coordinate (C3) and the device coordinates (x, y),
wherein the aforementioned calculation is performed according to mathematical formulas 1 to 5.
2. The method for calculating the actual distance and angle of a target object with a tracking device of claim 1, further comprising: a following step (S16), which, according to the coordinate correspondence between the device coordinates (x, y) and the positioning coordinate (C3), computes the positioning coordinate value corresponding to the origin of the device coordinates, converts that positioning coordinate value into an actual distance and angle, and supplies them to an autonomous-driving function of the tracking device to correct the driving direction and distance accordingly, so that the tracking device automatically follows the target object.
3. The method of claim 1, wherein the device coordinates (x, y), the first origin coordinate (C1), the second origin coordinate (C2), and the positioning coordinate (C3) are rectangular coordinate systems.
4. The method of claim 1, wherein the origin of the positioning coordinate in the coordinate positioning step (S13) is set at the bottom-center of the comparison image.
5. The method of claim 1, wherein the target object is a movable object.
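The claimed pipeline (steps S10~S15) can be sketched in code as follows. The patent's mathematical formulas 1~5 are not reproduced in this text, so the crop geometry, the linear origin shift, the bottom-center positioning origin (claim 4), the per-axis polynomial curve fit, and the atan2-based distance/angle computation below are a hedged reconstruction from the claim language; every function name, the polynomial degree, and the angle convention are illustrative assumptions.

```python
import math
import numpy as np

def crop_centered(image, cx, cy, half_w, half_h):
    """Segment the following-target image into a comparison image
    centered on the detected target (step S11)."""
    h, w = image.shape[:2]
    x0, y0 = max(cx - half_w, 0), max(cy - half_h, 0)
    x1, y1 = min(cx + half_w, w), min(cy + half_h, h)
    return image[y0:y1, x0:x1], (x0, y0)

def to_second_origin(first_origin, crop_offset):
    """Linearly shift the first origin coordinate (C1) into the cropped
    comparison image, yielding the second origin coordinate (C2, step S12)."""
    ox, oy = first_origin
    dx, dy = crop_offset
    return ox - dx, oy - dy

def to_positioning_coords(point, img_w, img_h):
    """Re-express an image point in positioning coordinates (C3) whose
    origin sits at the bottom-center of the comparison image (step S13)."""
    px, py = point
    return px - img_w / 2.0, img_h - py

def fit_axis_mapping(pixel_vals, measured_m, degree=2):
    """Curve-fit one axis: positioning-coordinate pixels -> measured metres
    (step S14). The polynomial degree is an assumption."""
    return np.polynomial.Polynomial.fit(pixel_vals, measured_m, degree)

def distance_and_angle(px, py, fit_x, fit_y):
    """Map positioning coordinates to real-world metres and derive the
    actual distance and heading angle of the target (step S15)."""
    X, Y = float(fit_x(px)), float(fit_y(py))
    return math.hypot(X, Y), math.degrees(math.atan2(X, Y))
```

In use, one fit per axis would be built from calibration measurements, after which each detected target position is pushed through the chain crop → origin shift → positioning coordinates → fitted metres → distance and angle, which the following step (S16, claim 2) then feeds to the drive controller.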
TW112149699A 2023-12-20 2023-12-20 Method for calculating actual distance and angle of target object of tracking device TWI879332B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW112149699A TWI879332B (en) 2023-12-20 2023-12-20 Method for calculating actual distance and angle of target object of tracking device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW112149699A TWI879332B (en) 2023-12-20 2023-12-20 Method for calculating actual distance and angle of target object of tracking device

Publications (2)

Publication Number Publication Date
TWI879332B true TWI879332B (en) 2025-04-01
TW202526266A TW202526266A (en) 2025-07-01

Family

ID=96142176

Family Applications (1)

Application Number Title Priority Date Filing Date
TW112149699A TWI879332B (en) 2023-12-20 2023-12-20 Method for calculating actual distance and angle of target object of tracking device

Country Status (1)

Country Link
TW (1) TWI879332B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090213219A1 (en) * 2007-12-11 2009-08-27 Honda Research Institute Europe Gmbh Visually tracking an object in real world using 2d appearance and multicue depth estimations
US8515130B2 (en) * 2010-10-15 2013-08-20 Dai Nippon Printing Co., Ltd Conference system, monitoring system, image processing apparatus, image processing method and a non-transitory computer-readable storage medium
US8810648B2 (en) * 2008-10-09 2014-08-19 Isis Innovation Limited Visual tracking of objects in images, and segmentation of images
US9043146B2 (en) * 2013-06-19 2015-05-26 The Boeing Company Systems and methods for tracking location of movable target object
US11263761B2 (en) * 2016-02-26 2022-03-01 SZ DJI Technology Co., Ltd. Systems and methods for visual target tracking
CN114578849A (en) * 2022-01-18 2022-06-03 中国人民解放军海军工程大学 Unmanned aerial vehicle target detection system, target detection method and computer readable storage medium


Also Published As

Publication number Publication date
TW202526266A (en) 2025-07-01
