
TW201235932A - Interactive device and operating method thereof - Google Patents

Interactive device and operating method thereof

Info

Publication number
TW201235932A
TW201235932A
Authority
TW
Taiwan
Prior art keywords
response
display
unit
coordinate
image
Prior art date
Application number
TW100106523A
Other languages
Chinese (zh)
Other versions
TWI423114B (en)
Inventor
Chin-Lun Lai
Hai-Chou Tien
Original Assignee
Liao Li Shih
Chin-Lun Lai
Lai Chin Ding
Hai-Chou Tien
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Liao Li Shih, Chin-Lun Lai, Lai Chin Ding, Hai-Chou Tien
Priority to TW100106523A priority Critical patent/TWI423114B/en
Publication of TW201235932A publication Critical patent/TW201235932A/en
Application granted granted Critical
Publication of TWI423114B publication Critical patent/TWI423114B/en

Links

Landscapes

  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interactive entertainment device including an image capturing module, a processing module, and a graphic processing unit is illustrated. The image capturing module captures images of a response object. The processing module generates a first movement track for a displayed object, identifies a second movement track of the response object from the images captured by the image capturing module, computes the intersection point of the response object and the displayed object, and calculates a rebound movement track for the displayed object once an intersection is found. The graphic processing unit generates images of the displayed object in accordance with the rebound movement track and outputs them to a display, so that the displayed object appears to move along the rebound movement track. Users may thus interact with the displayed object through the interactive device.

Description

201235932

VI. Description of the Invention

[Technical Field of the Invention]

The present invention relates to an entertainment device and an operating method thereof, and more particularly to an interactive entertainment device and an operating method thereof.

[Prior Art]

Ball-sport facilities, such as baseball batting cages, are available to let sports enthusiasts practice. However, such venues generally require a large site, both to accommodate many users at once and to leave room for balls to travel after being hit, so the cost of building a venue is high. Balls and other equipment also wear out easily and must be replaced at considerable cost. Because these costs are passed on to the consumer, practicing at such venues is expensive. Moreover, whether practice can take place at all often depends on the weather.

[Summary of the Invention]

The present invention relates to an interactive device and an operating method thereof, which allow a user to interact with stereoscopic images of a displayed object output by a display device, for example to play ball-sport style interactive games.

An interactive device according to one embodiment of the invention allows a user to carry out interactive activities, such as batting practice, against the stereoscopic images shown on a display. In this embodiment, the interactive device includes an image capturing module, a processing module, and a drawing (graphics) unit. The image capturing module captures a plurality of images of a response object, which are used to identify the second movement track of the response object.

The processing module includes a trajectory generation unit, a trajectory identification unit, an intersection calculation unit, and a trajectory calculation unit. The trajectory generation unit is coupled to the display device and generates the first movement track of the displayed object. The trajectory identification unit identifies the second movement track of the response object from the captured images. The intersection calculation unit is coupled to the trajectory generation unit and the trajectory identification unit, and calculates the intersection coordinates at which the displayed object, moving along the first movement track, meets the response object, moving along the second movement track. The trajectory calculation unit is coupled to the intersection calculation unit and the display device; according to the first movement track, the second movement track, and the intersection coordinates, it computes the reaction movement track of the displayed object, and the drawing unit generates and outputs the corresponding stereoscopic images of the displayed object.

An operating method according to one embodiment of the invention can be used to operate the interactive device described above. The method includes: controlling the drawing unit to generate a plurality of stereoscopic images of the displayed object according to the first movement track and outputting them to the display device; continuously capturing, from different viewing angles, a plurality of images of the response object as it moves toward the displayed object, and computing the second movement track of the response object from those images; determining whether the displayed object, moving along the first movement track, and the response object, moving along the second movement track, meet; and, when they do meet, calculating the reaction movement track of the displayed object and having the drawing unit generate and output a plurality of stereoscopic images of the displayed object according to that reaction movement track.

With the interactive device and operating method provided by the invention, the user can wield any response object to interact with the stereoscopic image of the displayed object projected by the display device. The interactive device computes the reaction of the displayed object from the individual movement tracks of the displayed object and the response object, and visualizes that reaction, achieving a realistic interactive effect.

[Embodiments]

The following embodiments describe an interactive device and its operating method: how a virtual displayed object is shown so that the user can strike at it with a response object, how image analysis is used to decide whether the response object and the displayed object meet, and how the corresponding reaction is then simulated. In this way the user can practice batting for baseball, tennis, and similar sports without an actual sports field, or engage in other activities built on the same concept, communicating and interacting with the content shown on a stereoscopic screen directly with the body or with an implement. Compared with a real venue, this arrangement is simpler to set up and offers a more flexible and effective way for enthusiasts to practice.

[Embodiment of the Interactive Device]

Fig. 1 illustrates the usage environment of the interactive device. A user 5 holding a response object 2, such as a baseball bat, stands in front of a display device 1 and swings at a displayed object 4, such as a baseball or tennis ball, which the interactive device 3 controls the display device 1 to show as a stereoscopic image. This embodiment is described using a baseball bat and a baseball, so the displayed object 4 appears to fly toward the user 5. The interactive device 3 is further provided with an image capturing module 30, which includes at least one image capturing device and continuously captures images of the response object 2 as it is swung, so that the moving distance and direction of the response object 2 and the force of the user's swing can be computed. The interactive device 3 thus receives the images captured by the image capturing module 30, computes the movement track of the response object 2, and, together with the movement track of the displayed object 4 output under the control of the processing module, determines whether and where the displayed object 4 and the response object 2 meet in three-dimensional space.

Although the user 5 swings at a virtual image, the computation performed by the interactive device 3 and the picture shown by the display device 1 provide feedback as if a real object had been struck. The display device 1 may be an autostereoscopic (glasses-free) display using, for example, lenticular-lens or parallax-barrier technology, which presents stereoscopic images with a sense of depth (distance). It may also be an ordinary display screen, in which case the user 5 views the display device 1 with a suitable auxiliary device, such as red-blue anaglyph glasses, to perceive depth. The interactive device 3 may be placed near the display device 1 or integrated into it.

Fig. 2 is a block diagram of the interactive device provided by this embodiment of the invention. The interactive device 3 includes the image capturing module 30, a calibration module 32, a processing module 34, a position identification unit 35, a drawing unit 36, and a signal transmission unit 38. The calibration module 32, the processing module 34, and the position identification unit 35 are each coupled to the image capturing module 30; the drawing unit 36 is connected to the calibration module 32 and the processing module 34. The signal transmission unit 38 is coupled to the processing module 34 and transmits signals to the response object 2. The image capturing module 30 includes at least two image capturing devices (cameras) spaced a fixed distance apart, so that it can capture images of the swung response object 2 from different viewing angles at the same time. The interactive device 3 operates in two different modes, a calibration mode and an interactive mode, which are described in turn below.

[Calibration Mode]

The calibration module 32 includes an object coordinate output unit 320, a response coordinate calculation unit 322, and a coordinate correspondence unit 324. In the calibration mode, the calibration module 32 learns the response object 2 being used and the user's standing position relative to the display device 1, so that the device can adapt to different user habits and different response objects 2 and accurately judge whether the displayed object meets the response object 2.

The object coordinate output unit 320 provides object coordinates of the displayed object, read from data pre-stored in a memory unit of the interactive device 3, and the drawing unit 36 draws the stereoscopic image of the displayed object on the display device 1 according to those object coordinates. The object coordinates are the three-dimensional coordinates (x, y, z) of the displayed object referenced to the display plane of the display device 1.

Referring to Figs. 2 and 3, Fig. 3 illustrates the object coordinates of a displayed object. The planar image output by the display device 1 lies in a display plane 10, with each pixel addressed starting from the top-left corner of the display plane 10. To present a stereoscopic effect, the display device 1 outputs at least two planar images of the object from slightly different viewpoints; the disparity between them produces the sense of depth. For example, for a displayed object 4 with object coordinates (xD1, yD1, zD1), the drawing unit 36 produces a first image and a second image: (xD1, yD1) is the two-dimensional coordinate of the sphere in the display plane 10, while zD1 is the disparity of the sphere between the first and second images, its positive or negative sign indicating whether the sphere appears in front of or behind the display plane 10. The stereoscopic image composed of the first and second images therefore appears on the display device 1 as a sphere floating at the coordinates (xD1, yD1, zD1).

The response coordinate calculation unit 322 is coupled to the image capturing module 30 and detects the response coordinates of the response object 2 from the disparity between the captured images; the response coordinates are referenced to a capture plane defined by the image capturing module 30. Referring to Figs. 2 and 4, Fig. 4 illustrates the response coordinates of a response object. The image capturing module 30 captures images of the response object 2 with two or more cameras from different viewing angles to obtain its stereoscopic information. In this embodiment, the image capturing module 30 includes two cameras mounted in a common vertical plane at the same height.
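The depth-recovery step implied by this two-camera arrangement can be sketched as follows. This is an illustrative reconstruction using the textbook rectified-stereo relation Z = f·B/d; the focal length `focal_px` and camera spacing `baseline` are assumed parameters, not values given in the patent.

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline):
    """Depth of a point seen by two rectified cameras mounted at the
    same height: Z = f * B / d, where d is the horizontal disparity in
    pixels, f the focal length in pixels, and B the camera baseline.
    """
    disparity = float(x_left - x_right)
    if disparity <= 0.0:
        raise ValueError("expected positive disparity for a point in front of the rig")
    return focal_px * baseline / disparity


def response_coordinate(left_px, right_px, focal_px, baseline):
    """Build an (xH, yH, zH) response coordinate for a marked contact
    point: planar position taken from one image, depth from disparity.
    """
    (xl, yl), (xr, _) = left_px, right_px
    return (xl, yl, depth_from_disparity(xl, xr, focal_px, baseline))
```

With a hypothetical 800-pixel focal length and 0.1 m baseline, a 20-pixel disparity places the contact point 4 m from the cameras.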

The plane in which the two cameras lie serves as the capture plane, spanned by the xH and yH axes. The response coordinate calculation unit 322 obtains the planar position of the response object 2 from the two captured images, and obtains its depth, that is, its distance from the cameras, from the disparity of the response object 2 between the two images, thereby recovering the three-dimensional coordinates of the response object 2. To make the response coordinates easy to identify, a specific part of the response object 2 may be marked in advance as a contact point 22; the response coordinate calculation unit 322 then computes, from the position of the contact point 22, a response coordinate representing the whole response object. For ease of illustration, the following embodiments use the contact point 22 as the example. The coordinate correspondence unit 324 relates the object coordinates to the response coordinates, so that the correspondence between the display coordinates, referenced to the display plane 10, and the response coordinates, referenced to the capture plane, becomes known.

In the calibration mode, the user places the response object 2, judging by eye from the stereoscopic image shown by the display device 1, at the position where the displayed object appears to be, so that the image capturing module 30 can capture images of the response object 2 there. Referring to Figs. 2 and 5, Fig. 5 illustrates the displayed object and the response object meeting. Taking baseball as the example, when the displayed object 4 (the ball) shown by the display device 1 is touched by the response object 2 (the bat), the object coordinate (xD1, yD1, zD1) and the response coordinate (xH1, yH1, zH1) actually correspond to the same point in three-dimensional space. On the premise that the relationship between object coordinates and response coordinates is linear, the correspondence f between the two coordinate systems can be assumed to take the form:

xH = c1·xD + c2·yD + c3·zD + c4
yH = c5·xD + c6·yD + c7·zD + c8
zH = c9·xD + c10·yD + c11·zD + c12

Since twelve variables c1 to c12 must be solved to obtain this linear correspondence, the object coordinate output unit 320 sequentially outputs three further sets of object coordinates, (xD2, yD2, zD2) through (xD4, yD4, zD4), so that the display device 1 shows the stereoscopic image of the displayed object 4 at four different positions in turn. The user then touches the contact point 22 of the response object 2 to each displayed position, and the other three response coordinates are likewise calculated from the images captured by the image capturing module 30.
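A minimal sketch of the four-touch calibration under the linear model above. The patent states only the linear form and that four touch correspondences determine c1 through c12; the small Gaussian-elimination solver and the helper names here are illustrative assumptions. Each axis of the response coordinate yields one 4x4 linear system in four of the twelve coefficients.

```python
def solve_affine(display_pts, response_pts):
    """Solve the twelve coefficients c1..c12 from four calibration touches.

    display_pts:  four object coordinates (xD, yD, zD) shown on screen.
    response_pts: the four measured response coordinates (xH, yH, zH).
    Returns three rows of coefficients: (c1..c4), (c5..c8), (c9..c12).
    The four display points must not be coplanar, or the system is singular.
    """
    def solve4(A, b):
        # Gaussian elimination with partial pivoting on a 4x4 system.
        n = 4
        M = [row[:] + [bi] for row, bi in zip(A, b)]
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(M[r][col]))
            M[col], M[piv] = M[piv], M[col]
            for r in range(col + 1, n):
                factor = M[r][col] / M[col][col]
                for k in range(col, n + 1):
                    M[r][k] -= factor * M[col][k]
        x = [0.0] * n
        for r in range(n - 1, -1, -1):
            x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
        return x

    A = [[xd, yd, zd, 1.0] for (xd, yd, zd) in display_pts]
    return [solve4(A, [p[axis] for p in response_pts]) for axis in range(3)]


def display_to_response(coeffs, pt):
    """Map an object coordinate into the response-coordinate system."""
    xd, yd, zd = pt
    return tuple(c[0] * xd + c[1] * yd + c[2] * zd + c[3] for c in coeffs)
```

Once the coefficients are solved, `display_to_response` gives, for any object coordinate the drawing unit outputs, the response coordinate at which a touch should be registered.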

Cl到C丨2的數值,建立起球 °十开出文數 的座標對應關係f。上述所例示的2在三度空間中 性對應關係,但本技術領域具通常==為線 複雜的計算_,找出 _ ^可^其他更 請參照圖6A及圖6BtI標對應關係。 ,同位置時,回應物件與顯:二用者定位 用者站立在爾置1前不同位置時;察 11/37 201235932 物件與顯示物件的交會點在三度空間中Η 的:=|會與使用者所觀看的角度與距中間 :吏是由物件座標輸出單元32G讀取並輸出2個^; :物件2與顯示物件4 一交二 —位,先參閱圖6A,當使用者面對顯示裝置1並 ^ =測點L](未綠於圖6A)而觀察到顯示物件4後, 像的2物件2移到使接聰22接_脸物件4之立體影 像的位置,以供回應座標蚪管 回應練(X V 早兀322算出回應物件2的 J下HU,yHL1,ZHL1)(對應圖6A的接觸點22)。再皋昭 f广當使用者移動到量測點L!後方的量測點L2時’α' 未較圖剛’顯示物件4的物件座標並未改變, 了月匕感覺’定位在量測叫時’顯示物件4與自身的距離 與定位在量測點Ll時顯示物件4與自身的距離—樣。 用者^動回應物件2使接觸點22,接觸到顯示物件4時的回 應座標(XHL2,yHL2,ZHL2)則比對應於量測點L的回應座標 (XHLl,yHLl,ZHU)更遠離顯示裝置!。 不 換έ之’使用者位於量測點Li及量測點y寺,就使用 者目視的角度所觀看到的顯示物件4固定顯示在同—個位 置(例如使用者定位在量測點Li所觀看到的顯示物件4距 離使用者半公尺m在量測點L2所觀看到的顯示物件 4仍然距離使用者半公尺遠),然而,個者以回應物件2 分別在制點L!及L2接顧相同龄物件4的回應座標則 有差異⑼上述的〜^命…及以動加細^此時 若未能根據使用者所在的量測點不同而改變回應座標與物 12/37 201235932 二應關係’則可能發生使用者認為已碰觸到顯示 4,/動式裝置3判斷為未交會的錯誤結果。因此可 的脖料ΐ不同量測點所產生的回應物件2與顯示物件4 :=對!關係應該相對應調整。特別說明的是,上述例 不以^丨1後不同位置的量測點Li&L2為例說明’但並 不以此例為限,者 關係不同的情况^里測點左、右不同時,亦具有座標對應 物株:4诗讓使用者在顯不裝置1前的任一個位置時,顯示 •,在於^回應物件2之間的座標對應關係都能正確被判斷 量測i Π下及\可”像操取模組30操取使用者在不同 收影像%的位置’並由位置辨識單元35接 掏取的影像來判斷使用者在每一量測點 測點八別:軸標’經由物件座標輸出單元320在每個量 如刀1”同的多組物件座標顯示顯示物件4數次( 桩此=:的四次)’經對應接觸後’由座標對應單元324 所麵量咖的三維純及計算tU目關變數( φ 到el2)’如此可分別獲得在不同量測點時的座 f(L)。換言之,取得四組物件座標(XDl,yD】,ZD1) (xD4,yD4,ZD4)與相對應的回應座標(χΗ⑶,%⑶而⑶)到 lXHL14,yHLl4,ZHL〗4),而由座標對應單元324進行運算後,可 獲得使用者定位在量測點L,時,回應物件2與顯;物件4 的座標對應關係f(L〇。同理,取得相同的四組物件座標 (xD],ym,zD】)到(xD4,yD4,ZD4)與相對應的回應座標 (XHUl’yHLH2丨)到(XHL24,yHL24,ZHL24)而進行運算後,則可庐 得使用者定位在量測點L2時,回應物件2與顯示物件^ 座標對應關係f(L2),依此類推。而所述用以在量測點L|進 13/37 201235932 行比對的四組物件座標與用在量測點進行比對的四組物 件座標相同,藉此可分別擷取及記錄使用者在不同位置時 ,觀看顯不在同一物件座標的顯示物件4,並進而根據目則 結果以回應物件2接觸顯示物件4的不同回應座標。用以 判斷使用者定位位置的量測點可為使用者的雙眼中心 幹上一特定位置。 &身區 為便於理解,本實施例中亦假設對應不同位置之息 關係之間屬線性關係,但本技術; 中具通书知識者自亦可知可採用其他更複雜、考 境影響因素的演算方式,以獲得不同量測點(即使2 不同定位點)時的回應物件2與顯示 = 轉)。在本實施例中,當計算出至少兩組不同 標對應關係f(L)後,使用者之後即可在顯示裝置】前2 位置對顯示物件4揮擊,座標對應單元324可根據 取^ 3G擷關的❹者定位位置,以及經過上述運、 戶對f關係之間的線性關係,估算出使:者 3。例如,當使用者站立位置介於 座㈣雜嶋糾難方法,藉由已知的 满叫時,回應物件2與顯示物件4的座標對應關係犯) 综合而言,根據座標對應單元32 ;座標與物件座標間的線性對應關係,可用以:f:: 罝測點時,回應物件2接觸顯示物件 5 。而進一步按不同量測點門的斟痛^ 子應杈式(pattern) 
、間的對應關係所計算出來的座標 14/37 201235932 對應關係,γ m 回應物件2 ^判斷同—使用者在不同线位置時, 互動模式下ί^ΐ件4接_模式。當互動式裝置3在 示出來的。物二右回應物件2欲接觸顯示I置1所顯 物件座標與_卽^式裝置3即可計算同—時間點的 座標對應關仙符與座標對應單元324對計算出的 觸到所述顯:4而判斷回應物件2的啊 [互動模式] 請繼續參閱圖2,王 ^ 、軌跡產生單元342、交合料二包括340 ,以,跡辨識單以8 Γ ^ 344、通知單元346 口月同時參照圖2與圖7, 跡之示意®。當互料 Γ ^應物件與顯示物件軌 模組34的執跡產生單f互動模式下運作時,處理 與圖7)中記錄的次制、-可根據记憶單元(未繪於圖2 並根據第—運動執:控=機,生-第-運動軌跡62 ’ 4的立體影像並輸出^顯^圖=361 會製—速串顯示物件 隨著時間經過,沿著望一、、,使得顯示裝置〗顯示出 體影像。所述的運2動轨跡運_顯示物件4的立 、位移速度⑽顯示物件= 包^物件4驗移方向 件座標。 任弟一運動轨跡的每一點的物 ,使:二出具有立體感的顯示物件" 的影像都會相對於:Γ者如 面對顯示裝置1站立於左侧轉於.位置。例如使用者5 的顯示物件4後,使用去1而看到顯示在使用者5右方 即使面野顯示裝置〗朝右方移 15/37 201235932 動’所觀看到的顯示物件4㈣像減會 顯示物件4始終維持盥使用去曰〃刖相同’ 不物件4顯示在顯示裝置丨上時,純 = f顯示物件4以第—運動軌跡幻運動,則需要^ = Ϊ 5的移動狀態,依時間順序改變顯示物件4的物件'座^ Ϊ !擷:模組30可即時擷取使用者5的影像,並由位置 識制者5蚊錄置(如制者5的 轨跡產生單丨342根據位置辨識單元35所辨識出的定 ^置以及運動轨跡62,即時運算顯示物件4相對應 私用者之疋位位置的物件座標。藉此以使顯示裝置工戶; =ΓΓ物件4觀看起來,會隨著使用者5的移動而接 ^ 者5。以上述使用者5右移為例,執跡產生單元342 所根據的第-運動轨跡62假設為垂直於顯示平面⑺之χ 轴與Υ _執跡,當使用者5右移時,軌跡產生單元342 所產生的㈣物件4的物件絲,實際上俩時間經過逐 向使用者5的方向靠近,換言之,顯示物件4的物 全‘ X軸數值越來越小。藉此,從使用者5的角度觀看 時:才不致於產生顯示物件4始終與使用者5間隔一樣的 距綠而無法使回應物件2接觸到顯示物件4的問題。 〜再以棒球為-具體例示說明,當顯示裝£ 1根據_ 二=36的控制連續顯示出以第一運動軌跡運動的棒球的立 體影像時’使用者即可從顯示裝置丨觀看到像是從顯示裝 置1向使用者投擲而來的虛擬棒球。此時,使用者可握持 回應物件2,即球棒,向虛擬的球體影像揮擊。影像擷取模 、且30可持續擷取球棒的影像,並傳送到執跡辨識單元Mg 16/37 201235932 。由於球棒會根據❹者的揮擊動作,㈣ ,位移’因此,軌跡辨識單元348可依照影㈣取桓^產 在不同時間所榻取到的多個影像判斷球棒的位移,〇 球棒上預定的接觸點22的位移距離及方向, = 物件2運動的第二運動軌跡64。 μ出回應 ^交^計料71 344 _軌跡產生單元如及軌跡辨 4早το 348,亚分別根據執跡產生單元祀產生 運動 =62及軌跡辨識單元州計算出的第二運動軌跡糾計 y頒不物件4及回應物件2交會時,該交會點τ的交會座標 計日士 ί跡產f早兀342和軌跡辨識單元3招可分別包括有 =早兀未緣於圖2),因此執跡產生單元342產生第一 應物^ 早兀348接收影像#|取模組30擷取回 ^回絲影像時,分別可記錄顯示物件4影像輸出 可桐:鬥 影像被揭取的時間。交會點計算單元344即 的的顯示物件4的物件座標位置及賴 件4 ί Γ 合座標對應單元似所計算出顯示物 依日〃 …物件2兩個座標系統的座標對應關係,以判斷 ::二:運是動= 得球體的同—贼會。若是’則同時獲 未在顿算’第—運動轨跡62及第二運動軌跡64並 有交會iL曰時,交會(如兩個路徑完全無交會,或是雖 的通二’:a有時間差),則透過祕在交會點計算單元344 通知346發出未交會通知。繪圖單元36依照未交會 所包括的訊息產生如「打擊失敗」等文字、場景晝面 17/37 201235932 及/或相關的打擊記錄數據,並產生畫面輸出到顯示裝置1 ,以利使用者獲知當次揮擊未成功擊中球體的訊息。 再回到第一運動軌跡62及第二運動執跡64在相同的 時間點產生交會的情況。交會點計算單元344除了根據校 正模組32中所獲得的顯示物件4及回應物件2之座標的座 標對應關係之函式而計算出交會點I的交會座標之外,更進 一步由軌跡計算單元340計算出球體被球棒擊中(也就是 顯示物件4與回應物件2交會)後應產生的反應運動執跡 66。 
顯示物件4與回應物件2交會後所產生的反應運動軌 跡66包括顯示物件4的反應距離及反應方向,此二者的變 化受到回應物件2的回應力道、顯示物件4被擊中時之速 度,以及顯不物件4與回應物件2接觸的入射角度的影響 由於板擬系統並非真實世界的碰撞,故可制將因回應 ί勿件2揮擊而產生的衝擊量完全由顯示物件4接收的簡化 Ϊ ί Ϊ以計算。而由於顯示物件4的第—運動轨跡6 2係由 生,二1置3根ί §己憶單元未示於圖”記制資料而產 的初速示物件4的質量為m。及顯示物件4 另,假設根據轨跡產峰罝;。μ 單元348的計算,判斷出t早二342 #設定以及執跡辨識 間、以及回應物件2 物件4在輸出經過單位時 應物件2在t2單位時f揮麵過單位時間時交會’且回 變動到交會點〗的仇置自起始點Ρ(Χηρ,Υηρ,Ζηρ) 曰·、、、η1,Υηι,ζη1) 〇 回應物件2的移動距The value of Cl to C丨2 establishes the coordinate corresponding relationship f of the ball. The two exemplified above are in the three-dimensional space neutral correspondence, but the technical field has the usual == for the line complex calculation _, find _ ^ can be other, please refer to the corresponding relationship of Figure 6A and Figure 6BtI. In the same position, the response object and the display: the two-user positioning user stands in different positions before the set 1; check 11/37 201235932 The intersection of the object and the display object in the three-dimensional space ::=| The angle and the distance viewed by the user: 吏 is read and output by the object coordinate output unit 32G; 2: the object 2 and the display object 4 are in two positions, refer to FIG. 6A first, when the user faces the display Device 1 and ^ = measuring point L] (not green in Figure 6A) and after viewing the object 4, the image of the object 2 is moved to the position of the stereo image of the connected object 4 for the response coordinate The fistula responds to the training (XV 兀 322 calculates the response to the object 2 under the HU, yHL1, ZHL1) (corresponding to the contact point 22 of Figure 6A). When the user moves to the measurement point L2 behind the measurement point L! 'α', the object coordinate of the object 4 is not changed compared with the figure just shown, and the lunar sensation 'position is measured. The time 'displays the distance between the object 4 and itself and the distance between the object 4 and itself when positioned at the measuring point L1. 
The user responds to the object 2 so that the contact point 22, the response coordinate (XHL2, yHL2, ZHL2) when the object 4 is displayed is farther away from the display device than the response coordinate (XHL1, yHL1, ZHU) corresponding to the measurement point L. ! . The user is located at the measuring point Li and the measuring point y temple, and the display object 4 viewed from the user's visual angle is fixedly displayed at the same position (for example, the user is positioned at the measuring point Li) The displayed display object 4 is still half a meter away from the user when the display object 4 viewed from the measuring point L2 is half a meter away from the user. However, the responding object 2 is at the manufacturing point L! The response coordinates of L2 to the same age object 4 are different. (9) The above-mentioned ~^ life... and the movement plus fine ^ At this time, if the measurement point is not changed according to the user's measurement point, the response coordinates and objects 12/37 201235932 The second response relationship may result in an error that the user thinks that the display 4 has been touched, and that the mobile device 3 determines that it has not been handed over. Therefore, the responding object 2 and the display object 4 generated by different measuring points can be: = right! Relationships should be adjusted accordingly. In particular, the above example does not take the measurement point Li&L2 at different positions after ^丨1 as an example to illustrate 'but not limited to this example. If the relationship is different, the left and right points are different. It also has a coordinate counterpart: 4 poems allow the user to display the position of any position before the device 1 is displayed, and the coordinate correspondence between the responding objects 2 can be correctly judged and measured. 
The image capture module 30 can be used to determine the position of the user at different positions of the received image and the image captured by the position recognition unit 35 can be used to determine that the user is measuring each point at each measurement point: the axis mark The object coordinate output unit 320 displays the display object 4 several times in the same number of object coordinates as the knife 1" (four times of the pile =: "after the corresponding contact" by the coordinate corresponding unit 324 The three-dimensional pure and calculated tU target variables (φ to el2)' can thus obtain the seat f(L) at different measurement points. In other words, obtain four sets of object coordinates (XDl, yD), ZD1) (xD4, yD4, ZD4) and corresponding response coordinates (χΗ(3), %(3) and (3)) to lXHL14, yHLl4, ZHL〗 4), and coordinate by coordinates After the operation of the unit 324, the user can be positioned at the measuring point L, and the correspondence between the object 2 and the coordinate of the object 4 is f(L〇. Similarly, the same four groups of objects (xD) are obtained. Ym, zD]) to (xD4, yD4, ZD4) and the corresponding response coordinates (XHUl'yHLH2丨) to (XHL24, yHL24, ZHL24) to operate, then the user can be positioned at the measurement point L2 The response object 2 corresponds to the object ^ coordinate corresponding to the coordinate f(L2), and so on. The four sets of object coordinates used for comparison at the measurement point L| into 13/37 201235932 are used for measurement. The points are the same as the four sets of object coordinates, so that the user can view and record the display objects 4 that are not in the same object coordinate position when the user is in different positions, and then respond to the object 2 to contact the display object according to the result of the goal. 4 different response coordinates. 
The measurement point used to determine the user's position can be The center of the user's eyes is dry on a specific location. In order to facilitate understanding, in this embodiment, it is assumed that there is a linear relationship between the interest relations corresponding to different positions, but the present technology; It can be seen that other more complicated and influential factors can be used to obtain the response object 2 and display = turn when different measurement points (even 2 different positioning points). In this embodiment, after calculating at least two sets of different standard correspondences f(L), the user can then swipe the display object 4 at the first 2 positions of the display device, and the coordinate corresponding unit 324 can take the ^3G according to The position of the leader of the Shaoguan, as well as the linear relationship between the above-mentioned operations and the relationship between the household and the f, is estimated to be: 3. For example, when the user's standing position is in the seat (four) chowder correction method, by the known full call, the correspondence between the object 2 and the coordinate of the displayed object 4 is made. In general, according to the coordinate corresponding unit 32; The linear correspondence with the coordinates of the object can be used to:f:: When the point is measured, the response object 2 contacts the display object 5. Further, according to the different measures, the coordinates of the door, the coordinates of the pattern, and the correspondence between the coordinates 14/37 201235932, γ m respond to the object 2 ^ judge the same - the user is in a different line When the position is in interactive mode, ί^ΐ4 is connected to _ mode. When the interactive device 3 is shown. 
Object 2 right response object 2 wants to contact display I set 1 object object coordinates and _ 卽 ^ type device 3 can calculate the same - time point coordinate corresponding Guan Xian Fu and coordinate corresponding unit 324 pairs of calculated touch :4 and judge the response object 2 ah [interactive mode] Please continue to refer to Figure 2, Wang ^, track generation unit 342, the intersection material 2 includes 340, to trace identification list to 8 Γ ^ 344, notification unit 346 mouth and month simultaneously Refer to Figure 2 and Figure 7, the schematic of the trace. When the interaction between the object and the display object track module 34 is generated in a single f interaction mode, the processing and the secondary system recorded in FIG. 7) can be processed according to the memory unit (not shown in FIG. According to the first - exercise: control = machine, the raw - the first motion track 62 ' 4 stereo image and output ^ display ^ map = 361 system - speed string display object over time, along the look of one, The display device displays the body image. The movement 2 shows the vertical and displacement speed of the object 4 (10), the object is displayed, and the object 4 is used to check the direction of the object. The object is such that the image of the display object having a three-dimensional appearance is relative to: the position of the display device 1 standing on the left side to the position. For example, after the user 5 displays the object 4, the use 1 And the display is displayed on the right side of the user 5 even if the face display device is moved to the right 15/37 201235932. 
the display object 4 is redrawn so that the relationship between the display object 4 and the user always remains the same. To create the illusion that the display object 4 moves along the first motion trajectory toward the user 5, the object coordinates of the display object 4 must be changed in time order according to the movement state of the user 5. The trajectory generation unit 342 instantaneously calculates the object coordinates of the display object 4 corresponding to the user, according to the positioning position recognized by the position identification unit 35 and the first motion trajectory 62; thereby, viewed through the display device 1, the display object 4 appears to approach the user 5. Suppose, for example, that the first motion trajectory 62 generated by the trajectory generation unit 342 is perpendicular to the X axis and Y axis of the display plane 10. When the user 5 moves to the right, the object coordinates of the display object 4 generated by the trajectory generation unit 342 are corrected so that the object still approaches the user 5; in other words, the X-axis value of the display object 4 becomes smaller and smaller. In this way, seen from the angle of the user 5, the display object 4 does not constantly drift away from the user 5, a situation that would prevent the response object 2 from contacting the display object 4. Taking baseball as a specific illustration, when the display device 1, under the control of the drawing unit 36, continuously displays the stereoscopic images of the baseball moving along the first motion trajectory, the user sees a virtual baseball thrown from the display device 1 toward the user.
At this time, the user can hold the response object 2, that is, the bat, and swing at the virtual ball image. The image capture module 30 continuously captures images of the bat and transmits them to the trajectory identification unit 348. Since the bat is displaced by the user's swinging action, the trajectory identification unit 348 can determine, from the multiple images captured by the image capture module 30 over time, the displacement distance and direction of a predetermined contact point 22 on the bat, and hence the second motion trajectory 64 along which the response object 2 moves. The intersection point calculation unit 344 receives the first motion trajectory 62 generated by the trajectory generation unit 342 and the second motion trajectory 64 identified by the trajectory identification unit 348, and calculates whether and where the display object 4 and the response object 2 meet, that is, the intersection point I. The trajectory generation unit 342 and the trajectory identification unit 348 may each include a timer (not shown in Fig. 2), so that the time at which each image of the display object 4 is output and the time at which each image of the response object 2 is captured can be recorded; the intersection point calculation unit 344 uses these times, together with the coordinate correspondence produced by the coordinate correspondence unit 324, to decide whether the bat meets the ball.
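As an illustration of how a second motion trajectory might be derived from successive captured frames, the following sketch computes the displacement distance, direction, and speed of the tracked contact point from two timestamped three-dimensional positions. The two-sample scheme and all names are assumptions made for illustration, not the patent's implementation.

```python
# Hypothetical sketch of a trajectory-identification step: derive the
# displacement, unit direction, and speed of the tracked contact point
# from positions triangulated out of two timestamped frames.
import math

def second_trajectory(p_start, p_end, t_start, t_end):
    """Return (distance, unit_direction, speed) for a contact point that
    moved from p_start to p_end between timestamps t_start and t_end."""
    delta = [e - s for s, e in zip(p_start, p_end)]
    distance = math.sqrt(sum(d * d for d in delta))
    direction = [d / distance for d in delta] if distance else [0.0] * 3
    speed = distance / (t_end - t_start)
    return distance, direction, speed
```

A contact point moving from (0, 0, 0) to (3, 4, 0) in half a second, for instance, yields a distance of 5.0, a unit direction of [0.6, 0.8, 0.0], and a speed of 10.0.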
If the first motion trajectory 62 and the second motion trajectory 64 do not produce an intersection at the same time point (either the two paths never cross, or they cross but with a time difference), the intersection point calculation unit 344 notifies the notification unit 346, which issues a miss notice. The drawing unit 36 then generates text such as "failure", scenes, and/or related striking record data according to the information included in the miss notice, and outputs the generated screen to the display device 1, so that the user knows that the swing did not successfully hit the ball. Returning to the case where the first motion trajectory 62 and the second motion trajectory 64 intersect at the same time point: besides calculating the intersection coordinates of the intersection point I by means of the correspondence between the object coordinates of the display object 4 and the response coordinates of the response object 2 obtained by the correction module 32, the trajectory calculation unit 340 further calculates the reaction motion trajectory 66 that should be produced after the ball is hit by the bat (that is, after the display object 4 meets the response object 2). The reaction motion trajectory 66 produced after the display object 4 and the response object 2 meet includes the reaction distance and the reaction direction of the display object 4, both of which are influenced by the response force of the response object 2, the speed at which the display object 4 is hit, and the incident angle at which the display object 4 contacts the response object 2. Since this is not a real-world collision, the calculation can be simplified by assuming that the impact received by the display object 4 comes entirely from the response force of the response object 2.
Assume that the trajectory generation unit 342 generates the first motion trajectory 62 of the display object 4 with an initial velocity v0 and an acceleration a0 (not shown in the figures), and that the mass of the display object 4 is m0. Assume further that, according to the calculations of the trajectory identification unit 348, the display object 4 output by the trajectory generation unit 342 reaches the intersection point a time t1 after being output, while the response object 2 takes a time t2 to move from its starting point P(Xp, Yp, Zp) to the intersection point I(X1, Y1, Z1). The moving distance S of the response object 2

is then obtained from the response coordinates corresponding to the starting point P and the intersection point I. After the moving distance S is obtained, the axial acceleration a1 of the response object 2 can further be derived from the acceleration formula:

a1 = 2S / t2²

The response force F exerted when the response object 2 hits the ball at the intersection point I is then calculated from the mass m1 of the response object 2 and the acceleration a1; the mass m1 can be measured in advance and recorded in the memory unit of the interactive device 3. The response force F is:

F = m1 · a1

Here, a2 denotes the acceleration with which the display object 4 rebounds after being hit by the response object 2; it is produced by the response force F acting on the mass m0 of the display object 4:

a2 = F / m0

Next, from the initial velocity v0 and the acceleration a0 of the display object 4 moving along the first motion trajectory, and the time t1 the display object 4 takes to reach the intersection point I, the speed v1 of the display object 4 on arriving at the intersection point I according to the first motion trajectory is calculated:

v1 = v0 + a0 · t1

From this, the rebound speed v2 of the display object 4 as it moves along the reaction motion trajectory 66 can be obtained:

v2 = v1 + a2 · t3

where t3 is the length of time during which the response object 2 is in contact with the display object 4. It is worth mentioning that, in the real world, a swing contacts the struck object for about 0.6 to 0.7 milliseconds, so t3 can be preset to 0.6 or 0.7 milliseconds. Finally, once the length of time t4 for which the display object 4 travels through the air along the reaction motion trajectory is known, the distance S1 that the ball moves according to the reaction motion trajectory can be calculated:

S1 = v2 · t4

As a supplementary note, the above description of the change in speed of the struck object after being hit can also be expressed by the physical impulse relationship, that is, F · t3 = m0 · Δv, where Δv = v2 − v1 is the change in speed of the display object 4. To calculate the length of time t4 for which the display object 4 flies according to the reaction motion trajectory, in other words the time the display object 4 stays in the air before landing, the incident angle of the display object 4 relative to a contact plane formed by the response object 2 at the moment of intersection must first be calculated.

Please refer to Fig. 2 and Fig. 8A; Fig. 8A is a schematic diagram of the incident angle and reflection angle of the display object. The contact plane 70 is a virtual plane formed by the response object 2 as it swings and touches the display object 4.
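The chain of formulas above (a1 = 2S/t2², F = m1·a1, a2 = F/m0, v1 = v0 + a0·t1, v2 = v1 + a2·t3, S1 = v2·t4) can be checked with a small numeric sketch. The numbers are illustrative only, and t3 is preset to 0.7 milliseconds as the text suggests.

```python
# Worked numeric sketch of the rebound calculation described above.
# All parameter values are illustrative assumptions, not patent data.

def rebound(S, t2, m1, m0, v0, a0, t1, t3=0.0007):
    a1 = 2.0 * S / t2**2  # swing acceleration of the response object
    F = m1 * a1           # response force at the intersection point I
    a2 = F / m0           # acceleration imparted to the display object
    v1 = v0 + a0 * t1     # ball speed on arrival at intersection I
    v2 = v1 + a2 * t3     # rebound speed after a contact of length t3
    return F, v2

def flight_distance(v2, t4):
    return v2 * t4        # S1: distance travelled along trajectory 66
```

With S = 2.0 m covered in t2 = 0.25 s by a bat of mass 1.0 kg against a 0.5 kg ball arriving at 30 m/s, the force comes out as 64.0 N and the rebound speed as roughly 30.09 m/s; two seconds of flight then give S1 of about 60.18 m.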
The trajectory identification unit 348 analyzes the contact plane 70 of the response object 2 from the multiple images of the response object 2 continuously captured by the image capture module 30, and the trajectory calculation unit 340 obtains the incident angle θ1 formed between the display object 4, moving along the first motion trajectory, and the tangential direction of the contact plane 70 at the moment of contact. After being hit, the display object 4 rebounds relative to the tangential direction of the contact plane 70 at a reflection angle equal in size to the incident angle θ1. Therefore, from the reflection angle and the gravitational acceleration formula, the trajectory calculation unit 340 can calculate the length of time for which the display object 4 moves along the reaction motion trajectory; the larger the incident angle of the ball, the larger its reflection angle, as shown by the incident angle θ2 in Fig. 8B.

Please continue to refer to Fig. 2. From the calculation results of the variables above, the trajectory calculation unit 340 obtains the distance and direction of the rebound of the display object after it is hit by the response object 2, and outputs the reaction motion trajectory to the drawing unit 36. The drawing unit 36 generates the display object images according to the content of the reaction motion trajectory and outputs them to the display device, which then shows the display object rebounding. The trajectory calculation unit 340 may further provide, in real time, background images corresponding to the different reaction motion trajectories calculated, so that the screen presents the user with a scene as immersive as swinging on a real field.
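The equal-angle rebound described for Figs. 8A and 8B amounts to mirroring the incoming velocity of the display object about the normal of the contact plane 70, so that the reflection angle equals the incident angle. The following sketch shows this reflection; treating the contact as a perfect mirror reflection is a simplifying assumption of this illustration, not a requirement of the patent.

```python
# Minimal sketch (an assumption, not the patent's code) of the equal-angle
# reflection for Figs. 8A/8B: the incoming velocity of the display object
# is mirrored about the unit normal of the contact plane 70.

def reflect(v, n):
    """Reflect velocity v about the unit normal n of the contact plane."""
    dot = sum(a * b for a, b in zip(v, n))
    return [a - 2.0 * dot * b for a, b in zip(v, n)]
```

For a ball arriving at 45 degrees onto a horizontal contact plane, the reflected velocity leaves at 45 degrees on the other side of the normal, with the same speed.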
Because the user may move from the calibrated position during play, the trajectory units and the intersection point calculation unit 344 can, when computing the first motion trajectory 62 and the second motion trajectory 64, look up from the memory unit the coordinate correspondence pre-estimated by the coordinate correspondence unit 324 for the user's current position.

In another embodiment, the response object 2 may further include a feedback unit, for example a vibration generating unit, a sound effect unit, or a light emitting unit. When the intersection point calculation unit 344 calculates the response force F of the response object 2, a feedback signal is transmitted, by wire or wirelessly, to the vibration generating unit, so that the vibration generating unit produces, according to the feedback signal, vibration with an intensity corresponding to the response force F, letting the user feel a more realistic striking experience. If the feedback unit is a sound effect unit or a light emitting unit, it may likewise respond in proportion to the response force F: the stronger the response force F, the louder the sound emitted by the sound effect unit, and the weaker the force, the softer the sound; similarly, the stronger the response force F, the brighter the light emitted by the light emitting unit, and conversely the fainter the light.

In the above embodiments and the corresponding drawings, baseball is used as a specific example of the display object output by the display device 1, with the response object 2 being the corresponding bat. In actual operation, however, the user may input control commands to the interactive device 3 through remote control (such as a remote controller) to switch the displayed item, for example to tennis batting practice. Different practice items correspond to different motion trajectory modes; the motion paths of a baseball and a tennis ball, for example, are quite different. The correction module 32 and the processing module 34 therefore differ in how they generate the first motion trajectory and the display object images according to the item to which the interactive device 3 is set. Please refer to Fig. 9.
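The proportional feedback described above (stronger response force, stronger vibration, louder sound, brighter light) can be sketched as a simple clamped mapping from the response force F to a device intensity level. The 0-255 range and the maximum force are illustrative assumptions invented for this sketch.

```python
# Illustrative mapping (an assumption) from the computed response force F
# to a feedback-unit intensity: stronger hits give stronger vibration, a
# louder sound, or brighter light, clamped to a hypothetical 0-255 range.

def feedback_level(force, max_force=200.0, levels=255):
    ratio = min(max(force / max_force, 0.0), 1.0)
    return round(ratio * levels)
```

A mid-strength hit maps to a mid-range intensity, while forces beyond the maximum saturate at full intensity and invalid negative forces map to zero.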
Fig. 9 shows a block diagram of the interactive device 3a described in this embodiment. The interactive device 3a shown in Fig. 9 differs from that of Fig. 2 in that the drawing unit 36 of this example is coupled to a signal transmission unit 38, used to transmit the image signals generated by the drawing unit 36 according to the trajectory generation unit 342 to the display device for output. The coordinate correspondence may be recorded in a memory unit 37, for use by the intersection point calculation unit 344 when judging whether the display object 4 and the response object meet. Furthermore, besides striking implements such as a bat, when the display object 4 is, for example, a volleyball, the response object may be the user's own limb, such as an arm or a palm; the response coordinate calculation unit 322 of the correction module 32 or the processing module 34 can then apply image recognition algorithms to the captured images to identify the limb (for example, the palm at the end of the arm) and determine its contact with the display object.

22/37 S 201235932 〔互動式裝置的再一 請參閱圖1〇,在另—电:列〕 括耗接於處理模組34 =例中’互動式裝置3b還可包 或無線傳輸的方式另—早7039,用以透過連接線390 ’以供不同使用者透過^二的:接單元%連結 互相連結進行雙向對打的互動亍雙向對打。 產生單元342將對方的點式衣置b僅需互相由執跡 為自身的顯示物件 1物件的反應運動軌跡對應轉換 可達成連線進行雙向^ =跡而輸出到顯示裝置,即 2動式裝置之運作方法實施例〕 圖11及圖12揭露—種 ε 的流㈣,㈣紅動式衣1响作枝實施例 動式裝置。” 3式她為圖2實施例所提供的^ r 圖11况明互動式裝置 ,而圖U則說明使用者利用乍.扠正流程 流程。以下將分觀明互動认置與顯示物件互動的 r竹刀扪.兄明圖u、圖12 作方法的各個频,請—併參照圖2置的運 施例的方塊圖。 々互動式裝置實 在本只%例中,可先由互動式 應物件2的對應關俜谁疒扦 " h、·貝不物件與回 動° i卜校正回應物件2與顯示n開互 應關係時,影像操取模組3G可擷取使用Ί對 置辨識單元35辨識使用者的定位位置為一量;= )。物件座標輸出單元32〇則可 〜占(S1101 2) ^ 物件座標繪製並在顯示裝置 + 根據 直"别出-個對應於物件座標的 23/37 201235932 顯不物件的立體影像⑶1〇3)。所述的物件座標可為對應 」t平面1Q的—個二維座標,顯示平面1G則例如為顯 不哀置1的面板(參閱圖3)。 接著由使用者目測顯示物件的位置 =:裝sr的顯示物件的立體影像二 ..干(S 〇5),影像擷取模組3〇的多個摄影 見角操取回應物件2的多個不同視角 ^像(SH07),並且依照回應物件2在 標的回應物件2的像差’由二 所述的回應座標則為對::=了:(5咖。 的擷取平面可為影像操取模組 ^、·隹座標,所述 頭所在的同一垂直平面(參、夕域或相機的鏡 由於顯示物件所對應的物件座;=:·)。 應的回應座標分別屬於不 =及回應物件2所對 示物件與回應物件 =的座標系統,為了獲得顯 接觸到顯示物件時利肋應物件2 面與掏取平面兩個座標系統的座^^,來計算顯示平 =且32可判斷輸出顯示物件影像::關係。因此,校正 座標的次數是否已經到達預定欠凄、叶异回應物件2之回應 一個量測點應量測四次。若广、(S1U〗),例如預設每 返回步驟S1103執行,若已到丨達預定的量測次數,貝 據所量測到的多組物件座標的量測次數後,則根 定一個量測點揮擊時,顯二件2標計算出使用者在特 關係(S1113)。 ,、回應物件2的座標對應 24/37 £ 201235932 由於使用者在活動的過程中可能改織 用者所在的定位位置改變日士,蟲_ 又疋位位置,而使 標對應關係即可能:物件與⑽物件2的座 應關係後,校正模組32 凡成一個量測點的座標對 到達預定數量(S1115)。辑里測點的數量是否已 則指示使用者㈣到另—t到達預定的量測點數量,22/37 S 201235932 [More details of the interactive device please refer to Figure 1〇, in the other-electric: column] Including the processing module 34 = in the example 'interactive device 3b can also be packaged or wirelessly transmitted in another way - Early 7039, for connecting the two lines by means of the connection line 390' for different users to connect through the two units: the connection unit is connected to each other for two-way interaction. The generating unit 342 converts the pointing clothes b of the other party only by the reaction trajectory corresponding to the object 1 of the display object 1 that is being tracked by itself, and can realize the connection and output to the display device, that is, the 2-moving device. 
[Embodiment of the operating method of the interactive device] Fig. 11 and Fig. 12 disclose the flow of an operating method of an interactive device, the interactive device being, for example, the one provided in the embodiment of Fig. 2. Fig. 11 illustrates the calibration procedure of the interactive device, while Fig. 12 illustrates the procedure by which the user interacts with the display object. The calibration procedure and the interaction procedure are described separately below; for the individual steps of the method shown in Fig. 11 and Fig. 12, please also refer to the block diagram of the interactive device embodiment in Fig. 2.

In this embodiment, the correspondence between the display object and the response object 2 may first be calibrated by the interactive device. When calibrating the correspondence between the response object 2 and the display object, the image capture module 30 captures images of the user, and the position identification unit 35 identifies the user's positioning position as a measurement point (S1101). The object coordinate output unit 320 then draws the display object according to an object coordinate, and the display device outputs a stereoscopic image of the display object corresponding to the object coordinate (S1103). The object coordinate may be a two-dimensional coordinate corresponding to the display plane 10, the display plane 10 being, for example, the panel of the display device 1 (see Fig. 3). Next, the user visually observes the position of the stereoscopic image of the display object shown by the display device (S1105) and swings the response object 2 at it; the multiple cameras of the image capture module 30 capture several images of the response object 2 from different viewing angles (S1107), and a response coordinate of the response object 2 is calculated according to the parallax of the response object 2 in the captured images (S1109). The response coordinate is a three-dimensional coordinate relative to a capture plane, the capture plane being, for example, the common vertical plane in which the lenses of the cameras of the image capture module 30 lie (see Fig. 4).
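The parallax-based response-coordinate step of the calibration (computing a three-dimensional coordinate from images taken by two horizontally separated lenses) can be sketched with the standard rectified-stereo relation; the pinhole-camera parameters and names here are illustrative assumptions rather than values from the patent.

```python
# Hypothetical sketch of the parallax computation: two horizontally
# separated, rectified lenses see the response object at different image
# columns, and the disparity gives its depth relative to the capture plane.

def triangulate(x_left, x_right, baseline, focal_px):
    """Return (depth, lateral_x) of a point seen at image columns x_left
    and x_right by two rectified cameras separated by `baseline`."""
    disparity = x_left - x_right
    depth = focal_px * baseline / disparity
    lateral_x = x_left * depth / focal_px
    return depth, lateral_x
```

With a 0.1 m baseline, a 500-pixel focal length, and a 20-pixel disparity, the point lies 2.5 m in front of the capture plane.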
Since the object coordinate of the display object and the response coordinate of the response object 2 belong to different coordinate systems, in order to determine whether the response object 2 contacts the display object, the coordinate correspondence between the two coordinate systems of the display plane and the capture plane must be calculated, so that the correction module 32 can relate the output display object image to the response object. The correction module 32 therefore checks whether the number of times the response coordinate of the response object 2 has been measured has reached a predetermined count (S1111); for example, four measurements may be preset for each measurement point. If not, the procedure returns to step S1103 for another measurement; if the predetermined number of measurements has been reached, the coordinate correspondence for the user swinging at this particular measurement point is calculated from the multiple sets of measured object coordinates and response coordinates (S1113). Because the user may change position in the course of the activity, and the coordinate correspondence may change as the positioning position changes, the correction module 32, after completing the coordinate correspondence of one measurement point, further checks whether the number of calibrated measurement points has reached a predetermined number (S1115). If the predetermined number of measurement points has not been reached, the user is instructed to move to another measurement point,

and, at the new measurement point,
the measurement steps from step S1103 onward are performed again, so as to obtain another coordinate correspondence.

When the same user has carried out measurements at a number of measurement points that has reached the preset number, the coordinate correspondence unit 324 can, from the coordinate correspondences obtained at each measurement point, estimate the correspondence between the response object 2 swung at an arbitrary position and the display object (S1117). Finally, the coordinate correspondences specific to this user can be recorded in a memory unit (not shown in Fig. 2) (S1119) to complete the calibration; the related calculations have been disclosed in the foregoing embodiment, please refer to the description of Fig. 4. When several users wish to use the interactive device at the same time, the coordinate correspondence of each user can be calibrated in turn according to the calibration procedure and recorded in the memory unit.

After the coordinate correspondences between the response object 2 and the display object have been obtained, interaction can begin: the image capture module 30 captures images of the user, and the position identification unit 35 identifies the user's positioning position (S1201). The drawing unit 36 then draws multiple stereoscopic images of the display object according to the first motion trajectory recorded in advance in the memory unit (not shown in Fig. 2) and outputs them to the display device 1 (S1203); the display object may be a virtual object such as the baseball, tennis ball, or shuttlecock exemplified above. When the display device 1 continuously outputs multiple stereoscopic images of the display object moving according to the first motion trajectory, the user in front of the display device can see the display object moving toward the user
and, based on visual observation, swings the response object 2 at the display object. The at least one lens included in the image capture module 30 continuously captures multiple images of the response object 2 from different viewing angles (S1205). From the coordinates and parallax of the response object 2 in the frames captured at the same instant by different cameras or lenses, the trajectory identification unit 348 calculates the three-dimensional coordinate of the response object 2 relative to the capture plane (300 in Fig. 4), and further obtains, from the coordinate changes calculated across images taken at different times, the moving distance, direction, and speed of the response object 2. From the calculated distance, direction, and speed, the trajectory identification unit 348 identifies the second motion trajectory along which the response object 2 moves toward the display object. The intersection point calculation unit 344 obtains the first motion trajectory generated by the trajectory generation unit 342 and the second motion trajectory calculated by the trajectory identification unit 348 and, using the coordinate correspondence of the user obtained during calibration, determines whether the display object moving according to the first motion trajectory and the response object moving according to the second motion trajectory produce an intersection point (S1213); if so, it further determines whether the response object 2 and the display object 4, each moving along its corresponding trajectory, meet at the same time point (S1215).
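The two determinations above, whether the trajectories cross at all and whether they cross at the same time point (step S1213 and the subsequent same-time check), can be sketched as a sampled closest-approach test over shared time stamps. The sampling scheme, the threshold, and all names are illustrative assumptions.

```python
# Sketch (naming is assumed) of the same-time-point check: sample both
# parametric trajectories at shared time stamps and report the first
# instant at which the two objects come within a contact threshold.

def find_meeting(traj_display, traj_response, times, threshold=0.05):
    """traj_* map a time to an (x, y, z) position; returns (t, point) of
    the first sufficiently close approach, or None if the paths never
    coincide at the same time."""
    for t in times:
        p = traj_display(t)
        q = traj_response(t)
        gap2 = sum((a - b) ** 2 for a, b in zip(p, q))
        if gap2 <= threshold ** 2:
            return t, p
    return None
```

Two objects approaching each other head-on along the X axis meet at the shared sample time where their positions coincide; if the paths would only cross at a time outside the sampled window, no meeting is reported, which corresponds to the miss case described in the text.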
If it is determined that the two trajectories do not intersect (a negative determination in step S1213), or that they intersect but not at the same time point (a negative determination in step S1215), the intersection point calculation unit 344 generates a miss notice through the notification unit 346 and outputs it to the display device 1 (S1217); according to the miss notice, the display device may show explanatory text or an image matching the striking situation. When it is determined that the display object and the response object meet at the same time point (an affirmative determination in step S1215), the trajectory calculation unit 340 calculates the intersection coordinate at which the display object and the response object meet, together with the reaction motion trajectory of the display object, and the drawing unit 36 draws stereoscopic images of the display object according to the reaction motion trajectory and outputs them to the display device 1; the speed, flight time, and angle of the display object involved in these calculations have been described in the foregoing embodiment with reference to Fig. 7 through Figs. 8A and 8B. When it is determined that the display object and the response object 2 will meet, besides the motion calculations, the response force of the response object 2 at the moment it meets the display object can also be calculated according to the mass and acceleration of the response object 2; the response force is converted into a feedback signal and transmitted through the signal transmission unit 38 to the response object 2 provided with the feedback unit 20 (S1221). The feedback unit 20 may be a vibration generating unit, a sound effect unit, a light emitting unit, or a combination thereof (not shown in the figures); when the response object 2 receives the feedback signal through a signal receiving unit (not shown), the feedback unit 20 produces the corresponding vibration, sound, or light according to the feedback signal.

[Possible effects of the embodiments] The contents of the embodiments above have disclosed the interaction and operation of the interactive device of the present invention.
The interactive device outputs virtual stereoscopic images through the display device while the image capture module continuously captures images of the response object performing its response action. Using various image analysis and processing techniques, the interactive device calculates whether the response object meets the presented stereoscopic image, that is, whether the response object will intersect the output display object, and then, according to whether the display object and the response object meet, displays the corresponding activity, allowing the user to swing the response object at the stereoscopic image and simulate the scene of ball-striking activities on a court or in a stadium.

The interactive device described in the embodiments of the present invention uses the display device to display the virtual flight path of the object, so there is no need to build an empty practice field, indirectly achieving an environment-saving effect. In addition, the user need not worry that the display object will damage surrounding objects or articles; the interactive device is particularly applicable to space-limited indoor venues, such as a living room or a study. Moreover, the interactive device may implement each of its analysis or computation elements with application-specific integrated circuits (ASICs), or realize the functions of the elements respectively by software programs, so its construction cost is low and updating is easy.

The above is merely an embodiment of the present invention, and it is not intended to limit the patent scope of the present invention.

S 28/37 201235932 圖式簡單說明】 之示意 圖, 圖1:本發明實施例提供的—種互動式裝置之 圖2:本發明實施例提供的—種互動式裝置之方塊圖;’ 圖3 :本發明實關提供的—+的物件座標 圖; 圖4 :本發明f闕提供_應物件的⑽萄之示意 一土圖5:本發明實_提供的回應物件麵示物件交會之 不意圖, 曰 圖6A及6B:本發明實施例提供的回應物件 件的對應位置之示意圖; 、員不物 示意=本發明實施例提供的回應物件與顯示物件轨跡之 圖8A及8B :本發明實施例提供的受擊 及反射角度之示意圖; 、 ^角度 .圖9:本發明之實施例提供的—種互動式|置之方塊圖 塊圖 圖10 :本發明再一實施例提供的一種互動式裝置 之方 圖^1:本發明實施例提供的—種互㈣裝置的 法之杈正程序流程圖;及 卜万 圖12.本發明實施例提供的一種互動 法之互動程序流程圖。 、、作方 【主要元件符號說明】 1,la_ic顯示裝置 29/37 201235932 ίο顯示平面 2回應物件 20回饋單元 22, 22’接觸點 3,3a-3b互動式裝置 30影像擷取模組 300擷取平面 32校正模組 320物件座標輸出單元 322回應座標計算單元 324座標對應單元 34處理模組 340執跡計算單元 342執跡產生單元 344交會點計算單元 346通知單元 348軌跡辨識單元 35位置辨識單元 36繪圖單元 37記憶單元 38訊號傳輸單元 39連接單元 390連接線 39無線傳輸單元 4,4a-4c顯示物件 5, 5a-5c使用者 30/37 201235932 62第一運動軌跡 64第二運動執跡 66反應運動執跡 70接觸平面 S1101-S1119流程步驟 S1201-S1221流程步驟S 28/37 201235932 BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 is a block diagram of an interactive device provided by an embodiment of the present invention; FIG. 3 is a block diagram of an interactive device provided by an embodiment of the present invention; Figure 4: The present invention provides a schematic representation of the object of the object (10). Figure 5: The present invention provides a response object that indicates the intersection of the object. 6A and FIG. 6B are schematic diagrams showing the corresponding positions of the response object members according to the embodiment of the present invention; FIG. 8A and FIG. 8B showing the trajectory of the response object and the display object provided by the embodiment of the present invention: FIG. 9 is a block diagram of an interactive embodiment of the present invention. FIG. 10 is an interactive device according to still another embodiment of the present invention. The figure is a flow chart of the method of the mutual (4) device provided by the embodiment of the present invention; and the flowchart of the interactive program of the interactive method provided by the embodiment of the present invention. 
, and [Parts and symbols] 1, la_ic display device 29/37 201235932 ίο Display plane 2 response object 20 feedback unit 22, 22' contact point 3, 3a-3b interactive device 30 image capture module 300撷Taking plane 32 correction module 320 object coordinate output unit 322 response coordinate calculation unit 324 coordinate correspondence unit 34 processing module 340 execution calculation unit 342 execution generation unit 344 intersection point calculation unit 346 notification unit 348 track recognition unit 35 position recognition unit 36 drawing unit 37 memory unit 38 signal transmission unit 39 connection unit 390 connection line 39 wireless transmission unit 4, 4a-4c display object 5, 5a-5c user 30/37 201235932 62 first motion track 64 second motion track 66 Reaction movement trace 70 contact plane S1101-S1119 process steps S1201-S1221 process steps
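The calibration procedure of Fig. 11 concerns the coordinate correspondence between the display plane (320, object coordinates) and the camera's capturing plane (322, response coordinates). One plausible realization — an assumed per-axis linear model fitted by least squares from calibration point pairs, not the patent's actual method — looks like this:

```python
def fit_axis_map(cam_pts, disp_pts):
    """Fit display = a * camera + b independently on each of the three
    axes, from paired calibration points (least squares)."""
    def fit_1d(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        var = sum((x - mx) ** 2 for x in xs)
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var
        return a, my - a * mx  # slope, intercept

    maps = []
    for axis in range(3):
        xs = [p[axis] for p in cam_pts]
        ys = [p[axis] for p in disp_pts]
        maps.append(fit_1d(xs, ys))
    return maps

def cam_to_display(point, maps):
    # Apply the fitted correspondence to a camera-frame coordinate.
    return tuple(a * c + b for c, (a, b) in zip(point, maps))
```

A full calibration would also account for perspective and the user's positioning position; the per-axis linear fit is only the simplest correspondence consistent with the figures.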


Claims (1)

VII. Scope of the patent claims:
1. An interactive device for a user to play a game through a display device, comprising:
an image capturing module, capturing a plurality of images of a response object;
a processing module, comprising a trajectory generation unit, a trajectory identification unit, an intersection point calculation unit and a trajectory calculation unit, wherein:
the trajectory generation unit, coupled to the display device, generates a first movement trajectory of a display object;
the trajectory identification unit, coupled to the image capturing module, receives the images of the response object to identify a second movement trajectory of the response object;
the intersection point calculation unit, coupled to the trajectory generation unit and the trajectory identification unit, calculates an intersection coordinate at which the display object, moving according to the first movement trajectory, and the response object, moving according to the second movement trajectory, meet; and
the trajectory calculation unit, coupled to the intersection point calculation unit, calculates a reaction movement trajectory of the display object after the display object and the response object meet, according to the first movement trajectory, the second movement trajectory and the intersection coordinate; and
a graphic unit, coupled to the trajectory generation unit, the trajectory calculation unit and the display device, drawing stereoscopic images of the display object according to the first movement trajectory, drawing stereoscopic images of the display object corresponding to the reaction movement trajectory, and outputting the drawn stereoscopic images to the display device.
2. The interactive device of claim 1, wherein the intersection point calculation unit further calculates, according to the movement distance and movement time of the response object, a response force of the response object when the response object contacts the display object.
3. The interactive device of claim 2, wherein the intersection point calculation unit obtains, from the images captured by the image capturing module, the displacement of the response object moving toward the display object, calculates the speed and acceleration of the response object according to the movement distance and the movement time, and calculates the response force according to the mass and acceleration of the response object.
4. The interactive device of claim 2, wherein the reaction movement trajectory comprises a reaction distance and a reaction speed of the display object, and the trajectory calculation unit calculates the reaction speed according to the response force, the mass of the display object and the movement speed of the display object moving according to the first movement trajectory, and calculates the reaction distance according to the reaction speed and a reaction time.
5. The interactive device of claim 4, wherein the reaction movement trajectory further comprises a reflection angle of the display object, the trajectory calculation unit calculating the reflection angle according to the state of the response object when the response object and the display object meet.
6. The interactive device of claim 2, further comprising:
a signal transmission unit, coupled to the intersection point calculation unit, transmitting a feedback signal converted from the response force to the response object, so that a feedback unit of the response object generates, according to the feedback signal, a vibration corresponding to the response force.
7. The interactive device of claim 6, wherein the feedback unit is a vibration generation unit, a sound effect unit or a light emitting unit.
8. The interactive device of claim 1, further comprising:
a signal transmission unit, coupled to the graphic unit, transmitting image signals of the stereoscopic images of the display object drawn by the graphic unit to the display device.
9. The interactive device of claim 1, further comprising:
a position identification unit, coupled between the image capturing module and the intersection point calculation unit, wherein the position identification unit receives the images captured by the image capturing module to identify a positioning position of the user and transmits the positioning position to the intersection point calculation unit, so that the intersection point calculation unit obtains a coordinate correspondence associated with the positioning position and accordingly calculates the intersection point of the first movement trajectory and the second movement trajectory.
10. The interactive device of claim 9, further comprising a calibration module, comprising:
an object coordinate output unit, coupled to the graphic unit and the display device, controlling the graphic unit to draw the stereoscopic image of the display object on the display device according to an object coordinate, the object coordinate being the three-dimensional coordinate of the display object relative to a display plane of the display device;
a response coordinate calculation unit, coupled to the image capturing module, calculating a response coordinate of the response object from a plurality of images of the response object captured by the image capturing module, the response coordinate being the three-dimensional coordinate of the response object relative to a capturing plane of the image capturing module; and
a coordinate correspondence unit, coupled to the object coordinate output unit and the response coordinate calculation unit, calculating the coordinate correspondence between the display plane and the capturing plane according to the object coordinate of the display object and the response coordinate.
11. The interactive device of claim 10, wherein the intersection point calculation unit calculates the intersection coordinate of the display object and the response object according to the coordinate correspondence, the first movement trajectory and the second movement trajectory.
12. The interactive device of claim 10, wherein the position identification unit further transmits the positioning position to the coordinate correspondence unit, and the coordinate correspondence unit associates the positioning position with the calculated coordinate correspondence.
13. The interactive device of claim 9, further comprising:
a connection unit, coupled to the calibration module through a connection line, for connecting another interactive device.
14. An operating method of an interactive device, the interactive device comprising a processing module, an image capturing module and a graphic unit, the method comprising:
controlling the graphic unit to draw, according to a first movement trajectory, a plurality of stereoscopic images of a display object from different viewing angles and to output them successively to a display device;
capturing a plurality of images of a response object moving toward the display object;
identifying, according to the plurality of images of the response object, a second movement trajectory of the response object moving toward the display object;
determining whether the display object moving according to the first movement trajectory and the response object moving according to the second movement trajectory meet; and
when it is determined that the display object and the response object meet, calculating a reaction movement trajectory of the display object, and controlling the graphic unit to draw and output a plurality of stereoscopic images of the display object according to the reaction movement trajectory.
15. The operating method of claim 14, further comprising: when it is determined that the display object and the response object do not meet, generating a miss notification and outputting the miss notification to the display device.
16. The operating method of claim 14, wherein before the step of determining whether the display object and the response object meet, the method comprises:
capturing and identifying a positioning position corresponding to the user of the response object;
obtaining a coordinate correspondence associated with the positioning position; and
determining whether the display object and the response object meet according to the coordinate correspondence, the first movement trajectory and the second movement trajectory.
17. The operating method of claim 16, wherein the step of calculating the reaction movement trajectory comprises:
calculating an intersection coordinate of the display object and the response object according to the coordinate correspondence; and
calculating the reaction movement trajectory according to the intersection coordinate, the first movement trajectory and the second movement trajectory.
18. The operating method of claim 17, wherein the coordinate correspondence is obtained by:
drawing the display object according to an object coordinate relative to a display plane of the display device;
capturing, from different viewing angles, images of the response object when the response object meets the display object;
calculating, according to the positions of the response object in the images of the different viewing angles, a response coordinate of the response object relative to a capturing plane of the image capturing module; and
calculating the coordinate correspondence between the display plane and the capturing plane according to the object coordinate and the response coordinate.
19. The operating method of claim 18, wherein before the step of obtaining the coordinate correspondence, the method comprises:
associating the coordinate correspondence with the positioning position of the user of the response object, and recording the positioning position and the associated coordinate correspondence in a memory unit.
20. The operating method of claim 14, wherein when it is determined that the display object and the response object meet, the method further comprises:
calculating a response force of the response object; and
transmitting a feedback signal converted from the response force to the response object, so that a feedback unit of the response object generates, according to the feedback signal, a vibration, a sound or a light source corresponding to the response force.
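Claims 2-5 describe the collision physics: a response force from the response object's mass and acceleration, a reaction speed from that force together with the display object's mass and pre-hit speed, a reaction distance from speed and time, and a reflection angle at the contact plane. The claims do not fix the formulas, so the sketch below is one plausible reading; the constant-acceleration estimate, the impulse model and `impulse_time` are all assumptions for illustration.

```python
def response_force(mass, distance, time):
    """Claim 3, simplified: estimate acceleration from the distance the
    response object covers in the captured frames, assuming constant
    acceleration from rest (d = a * t**2 / 2), then F = m * a."""
    accel = 2.0 * distance / time ** 2
    return mass * accel

def reaction_speed(force, display_mass, display_speed, impulse_time=0.05):
    """Claim 4: post-hit speed from the response force, the display
    object's mass and its pre-hit speed. Here the impulse reverses the
    original motion and adds the force contribution (a modeling choice)."""
    return -display_speed + force * impulse_time / display_mass

def reaction_distance(speed, reaction_time):
    # Claim 4: distance traveled after the hit, from speed and time.
    return abs(speed) * reaction_time

def reflection_angle(incidence_deg):
    # Claim 5, reduced to mirror reflection off the contact plane.
    return -incidence_deg
```

A real device would derive these quantities from the tracked second movement trajectory rather than from scalar inputs, but the dependency structure — force from the captured motion, then reaction speed, distance and angle from the force — follows the claims.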
TW100106523A 2011-02-25 2011-02-25 Interactive device and operating method thereof TWI423114B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW100106523A TWI423114B (en) 2011-02-25 2011-02-25 Interactive device and operating method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
TW100106523A TWI423114B (en) 2011-02-25 2011-02-25 Interactive device and operating method thereof

Publications (2)

Publication Number Publication Date
TW201235932A true TW201235932A (en) 2012-09-01
TWI423114B TWI423114B (en) 2014-01-11

Family

ID=47222726

Family Applications (1)

Application Number Title Priority Date Filing Date
TW100106523A TWI423114B (en) 2011-02-25 2011-02-25 Interactive device and operating method thereof

Country Status (1)

Country Link
TW (1) TWI423114B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI564063B (en) * 2015-01-16 2017-01-01 Floating projection game and learning system
TWI600456B (en) * 2015-12-24 2017-10-01 Colopl Inc Battle-type game control methods and programs
TWI633521B (en) * 2016-02-04 2018-08-21 南韓商高爾縱新維度控股有限公司 Apparatus for base-ball practice, sensing device and sensing method used to the same and control method for the same

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101807005B1 (en) * 2015-12-18 2018-01-10 주식회사 골프존뉴딘 Apparatus for base-ball practice, sensing device and sensing method used to the same and pitching control method of the same
CN110968194A (en) * 2019-11-28 2020-04-07 北京市商汤科技开发有限公司 Interactive object driving method, device, equipment and storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007050885A2 (en) * 2005-10-26 2007-05-03 Sony Computer Entertainment America Inc. System and method for interfacing with a computer program
TWM318445U (en) * 2007-04-04 2007-09-11 Qyoung Tec Co Ltd Image-interactive throwing device
US8419545B2 (en) * 2007-11-28 2013-04-16 Ailive, Inc. Method and system for controlling movements of objects in a videogame
TW201104494A (en) * 2009-07-20 2011-02-01 J Touch Corp Stereoscopic image interactive system

Also Published As

Publication number Publication date
TWI423114B (en) 2014-01-11

Similar Documents

Publication Publication Date Title
CN106796453B (en) Driving a projector to generate a shared space augmented reality experience
KR101007947B1 (en) System and method for cyber training of martial art on network
Miles et al. A review of virtual environments for training in ball sports
KR101007944B1 (en) VR training system and its method using network
JP6467698B2 (en) Baseball batting practice support system
US20100259610A1 (en) Two-Dimensional Display Synced with Real World Object Movement
CN103732299B (en) Utilize three-dimensional devices and the 3d gaming device of virtual touch
KR101599303B1 (en) Baseball game system based on the virtual reality and game proceeding method using the same
JP6583055B2 (en) Virtual sports simulation device
TW201244789A (en) Virtual golf simulation apparatus and method
TW201238326A (en) Real-time interactive 3D entertainment device and 3D replication
US20230398427A1 (en) Mixed reality simulation and training system
WO2017094434A1 (en) Avatar display system, user terminal, and program
KR101738419B1 (en) Screen golf system, method for image realization for screen golf and recording medium readable by computing device for recording the method
TW201235932A (en) Interactive device and operating method thereof
WO2014050974A1 (en) Display device, control system, and control programme
CN106823333A (en) Intelligent baseball equipment and the helmet and the method for auxiliary judgment good shot
JP2012181616A (en) Program, information storage medium, game device and server system
TWI835289B (en) Virtual and real interaction method, computing system used for virtual world, and virtual reality system
CN202142008U (en) interactive installation
Li et al. Real-time immersive table tennis game for two players with motion tracking
JP2012141820A (en) Program, information storage medium, image generation system and server system
Li Development of immersive and interactive virtual reality environment for two-player table tennis
JP6739539B2 (en) Information processing equipment
TWM409872U (en) Care interacting instrument

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees