201220108

VI. Description of the Invention:

[Technical Field of the Invention]
The present invention relates to a display device for displaying game images and the like, and more specifically to a portable display device that a user can carry, and to a game system and a game processing method that use the display device.

[Prior Art]
There are known game systems in which the player performs game operations by moving the operating device itself (see, for example, the specification of Japanese Patent No. 4265814). In the game system described in that specification, an imaging means provided in the operating device captures an image of a marker device, and information obtained from the captured image is used as a game input, which makes operations based on moving the operating device possible. Specifically, since the position and/or attitude of the operating device can be determined from the position of the marker device within the captured image, the game system acquires that information as the game input and performs game processing according to it. In such a game system the marker device can in principle be installed anywhere, but because the user feels no discomfort when pointing the operating device toward the screen on which the game image is displayed, the marker device is normally installed around the display device.

[Summary of the Invention]
(Problems to be solved by the invention)
As described above, in such a game system the user points the operating device toward the display device. Moreover, such game systems generally use a fixedly installed display device such as a television. Consequently, when the captured image is used as the input, the user must always point the operating device toward the fixedly installed display device. In other words, the direction in which the operating device can be used is constrained, and there is room to improve the freedom of operation of the operating device.

The object of the present invention is therefore to provide a display device, a game system, and a game processing method capable of improving the freedom of operation of an operating device.

(Means for solving the problems)
To solve the problems above, the present invention adopts configurations (1) to (14) below.

(1)
An example of the present invention is a game system that includes a stationary game device, an operating device, and a portable display device.
The game device includes an operation data receiving unit, a game processing unit, a game image generating unit, a game image compressing unit, and a data transmitting unit. The operation data receiving unit receives operation data from the operating device. The game processing unit executes game processing based on the operation data. The game image generating unit successively generates a first game image based on the game processing. The game image compressing unit successively compresses the first game image to generate compressed image data. The data transmitting unit successively and wirelessly transmits the compressed image data to the portable display device.

The operating device includes an imaging unit, an inertial sensor, at least one operation button, and an operation data transmitting unit. The imaging unit is capable of detecting infrared light. The operation data transmitting unit wirelessly transmits to the game device operation data that includes data representing the detection result of the imaging unit, data representing the detection result of the inertial sensor, and data representing operations on the operation button.

The portable display device includes an infrared light emitting unit, a game image receiving unit, a game image decompressing unit, and a display unit. The infrared light emitting unit is capable of emitting infrared light. The game image receiving unit successively receives the compressed image data from the game device. The game image decompressing unit successively decompresses the compressed image data to obtain the first game image. The display unit successively displays the first game image obtained by the decompression.

The "game device" above may be any information processing device that executes game processing and generates images based on that processing. It may be an information processing device dedicated to games, or a multi-purpose information processing device such as an ordinary personal computer.

The "operating device" may include components in addition to those listed above; for example, it may further include display means and/or sound output means. The "detection result of the imaging unit" may be the image captured by the imaging unit itself, or information obtained from that image (for example, the marker coordinates described later).

"Portable" means a size that allows the user to hold the device in hand and move it, or to reposition it freely. A "portable display device" may be moved during play, as in the first to fourth game examples described later, or placed in a fixed position (not moved) during play, as in the fifth game example described later.

The "game system" only needs to include the game device, the operating device, and the portable display device; it may or may not include the external display device, described later, that displays the second game image. That is, the game system may be provided in a form that does not include the external display device, or in a form that includes it.
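To make the data flow of configuration (1) concrete before discussing its effects, the following is a minimal sketch in C of one frame on the game device side: receive operation data, run the game processing, render the first game image, compress it, and transmit it wirelessly. The patent does not specify any API; every name below is invented for illustration, and the compression is only assumed to be an H.264-style encoder.

```c
/* Illustrative per-frame flow on the game device side of configuration (1).
 * All function and type names are hypothetical, not from the patent. */
#include <stdint.h>
#include <stddef.h>

typedef struct { uint8_t *pixels; int width, height; } Frame;

extern size_t receive_operation_data(uint8_t *buf, size_t cap); /* from the operating device */
extern void   run_game_step(const uint8_t *op_data, size_t op_len);
extern Frame  render_first_game_image(void);
extern size_t compress_frame(const Frame *f, uint8_t *out, size_t cap);
extern void   wireless_send(const uint8_t *data, size_t len);

void game_device_frame(void)
{
    uint8_t op[256];
    size_t  op_len = receive_operation_data(op, sizeof op);

    run_game_step(op, op_len);            /* game processing based on the operation data */

    Frame f = render_first_game_image();  /* first game image, generated each frame */

    static uint8_t packet[1 << 20];
    size_t n = compress_frame(&f, packet, sizeof packet); /* assumed H.264-style encoder */
    wireless_send(packet, n);             /* to the portable display device */
}
```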
According to configuration (1), the portable display device includes the infrared light emitting unit and the display unit, the operating device includes the imaging unit, and the game processing is executed based on the detection result of the imaging unit. The user can therefore perform game operations by pointing the imaging unit of the operating device toward the infrared light emitting unit of the portable display device. Because the display device is portable, the user can place it at any position and can consequently point the operating device in any direction. Compared with conventional game systems, in which the usable direction of the operating device is restricted, configuration (1) thus improves the freedom of operation of the operating device.

Furthermore, according to configuration (1), the portable display device only needs to perform at least the decompression of the image data, while the game processing is performed on the game device side. Even if the game processing becomes complex, only the load on the game device increases; the amount of image decompression work on the portable display device is hardly affected. Even when complex game processing is required, the processing load on the portable display device therefore stays within a predetermined range, and the portable display device is not required to have high information processing capability. This makes the portable display device easier to miniaturize, to lighten, and to manufacture.

In addition, according to configuration (1), since the first game image is compressed before being transmitted from the game device to the portable display device, the game image can be transmitted wirelessly at high speed, and the delay from the game processing to the display of the game image can be reduced.

(2)
The game system may further include a marker device capable of emitting infrared light. In that case the game image generating unit further generates a second game image based on the game processing, and the game device further includes an image output unit and a light emission control unit. The image output unit successively outputs the second game image to an external display device separate from the portable display device. The light emission control unit controls the light emission of the marker device.

The "external display device" only needs to be separate from the portable display device; besides the television 2 of the embodiment described later, it may be any device capable of displaying the second game image generated by the game device. For example, the external display device may be formed integrally with the game device (in a single housing).

According to configuration (2), the game system includes a marker device distinct from the infrared light emitting unit of the portable display device, and the second game image is displayed on an external display device distinct from the portable display device on which the first game image is displayed. Accordingly, if the marker device is placed near the external display device, the user can point the operating device toward either of the two display devices.
In other words, because the user can choose which of the two display devices to point the operating device at, the freedom of operation of the operating device is further improved.

Also according to configuration (2), since the portable display device can be placed freely, the positional relationship between the two display devices can be set freely. By placing the two display devices at positions appropriate to the game content, more realistic operations using the operating device and more entertaining games can be realized (see the fifth game example described later). Moreover, because the positional relationship between the two display devices can be changed at any time, games that use the two display devices in a variety of positional relationships can be supported.

Furthermore, according to configuration (2), the second game image can be displayed on the external display device, so two different game screens can be presented to the players, and the game space can be displayed in various ways using the two game images. Configuration (2) therefore makes it possible to present game images that are easier to view and easier to play with.

(3)
The game system may include two operating devices. In that case the game processing unit executes the game processing based on the operation data received from each operating device.

According to configuration (3), one operating device can be used while pointed toward the portable display device, and the other can be used while pointed toward the marker device. Therefore, when the marker device is installed around the external display device, two players can play simultaneously in a game that uses an operating device and a display device as a pair (for example, the fifth game example described later).

(4)
The light emission control unit may control the light emission of the marker device and of the infrared light emitting unit according to the content of the game processing.

"Controlling the light emission of the marker device and the infrared light emitting unit according to the content of the game processing" includes controlling the light emission according to the kind of game program executed on the game device, and also, even within the same game program, controlling it according to the game situation (the player's operation target and/or operation method, or the state of progress of the game).

According to configuration (4), the game device can make either one (or both) of the marker device and the infrared light emitting unit emit light according to the content of the game processing. Depending on the game content, only one of the two light emitting devices, that is, the marker device or the infrared light emitting unit of the portable display device, may be used. Moreover, when both light emitting devices are made to emit light at the same time, the game device cannot tell from which device the infrared light detected by the imaging unit of the operating device originates, and operations performed with the operating device may therefore not be processed correctly.
By contrast, according to configuration (4), whichever of the two light emitting devices is appropriate to the content of the game processing can be made to emit light, so a wide variety of games can be supported while incorrect operation of the operating device caused by false detection is prevented.

(5)
The light emission control unit may generate control data representing an instruction to control the light emission of the infrared light emitting unit. In that case the data transmitting unit wirelessly transmits the control data to the portable display device, and the portable display device further includes a control data receiving unit for receiving the control data from the game device. The infrared light emitting unit operates according to the received control data.

The data transmitting unit may transmit the control data together with the image data, or at a timing different from that of the image data. That is, even though the image data is transmitted successively, the control data need only be transmitted when necessary, and need not be transmitted with every image.

According to configuration (5), the game device can easily control the light emission of the infrared light emitting unit by transmitting the control data to the portable display device.

(6)
Another example of the present invention is a portable display device capable of wireless communication with a game device. The game device receives, from an operating device that includes an imaging unit capable of detecting infrared light, data representing the detection result of the imaging unit, and successively transmits to the display device compressed image data obtained by compressing the game image generated by the game processing executed based on that data.

The display device includes an infrared light emitting unit, a game image receiving unit, a game image decompressing unit, and a display unit. The infrared light emitting unit is capable of emitting infrared light. The game image receiving unit successively receives the compressed image data from the game device. The game image decompressing unit successively decompresses the compressed image data to obtain the game image. The display unit successively displays the game image obtained by the decompression.

According to configuration (6), as with configuration (1), the display device includes the infrared light emitting unit and the display unit, the operating device includes the imaging unit, and the display device can be placed at any position, so the operating device can be used while pointed in any direction, improving the freedom of its operation. Also as with configuration (1), the display device is not required to have high information processing capability, which makes it easier to manufacture, and since the game image is transmitted wirelessly at high speed, the delay from the game processing to the display of the game image can be reduced.
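The display-device side of configuration (6) can be sketched in the same illustrative style as the earlier game-device loop: receive the compressed image data, decompress it, and display the result. Again, all names, the buffer sizes, and the assumed screen resolution are invented for illustration; the patent specifies no API.

```c
/* Hypothetical sketch of one frame on the portable display device of
 * configuration (6). Names and sizes are assumptions, not the patent's. */
#include <stdint.h>
#include <stddef.h>

extern size_t wireless_receive(uint8_t *buf, size_t cap);
extern int    decompress_frame(const uint8_t *in, size_t len,
                               uint8_t *pixels, int *w, int *h); /* 0 on success */
extern void   lcd_present(const uint8_t *pixels, int w, int h);

void display_device_frame(void)
{
    static uint8_t packet[1 << 20];
    static uint8_t pixels[854 * 480 * 4]; /* assumed LCD resolution, RGBA */
    int w, h;

    size_t n = wireless_receive(packet, sizeof packet); /* compressed image data */
    if (decompress_frame(packet, n, pixels, &w, &h) == 0)
        lcd_present(pixels, w, h);        /* successively display decoded frames */
}
```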
(7)
The display device may include a touch panel, an inertial sensor, and an operation data transmitting unit. The operation data transmitting unit wirelessly transmits to the game device operation data that includes the output data of the touch panel and of the inertial sensor. In that case the game device executes the game processing based on that operation data.

According to configuration (7), the portable display device also functions as an operating device. For example, when the display device is used in a game system, the user can perform operations by moving the display device itself while looking at its screen, or can place the display device at a free position and use it as a display toward which the operating device is pointed. That is, configuration (7) provides a multi-purpose device usable both as an operating device and as a display device.

(8)
The game device may wirelessly transmit to the display device game sound generated based on the game processing. In that case the display device further includes a game sound receiving unit that receives the game sound from the game device, and a speaker that outputs the game sound received by the game sound receiving unit.

In (8), the game sound wirelessly transmitted from the game device to the display device may be transmitted after compression, as in the embodiment described later, or without compression.

According to configuration (8), game sound, like the game image, can be output from the display device.

(9)
The portable display device may further include a microphone. In that case the operation data transmitting unit further wirelessly transmits to the game device the data of the sound detected by the microphone.

In (9), the sound data wirelessly transmitted from the operating device to the game device may be transmitted after compression, as in the embodiment described later, or without compression.

According to configuration (9), the sound detected by the microphone (the microphone sound) is transmitted to the game device. The game device can therefore use the microphone sound as game sound, or use the result of speech recognition performed on the microphone sound as a game input.

(10)
The display device may further include a camera and a camera image compressing unit. The camera image compressing unit compresses the camera image captured by the camera to generate compressed imaging data. In that case the operation data transmitting unit further wirelessly transmits the compressed imaging data to the game device.

According to configuration (10), the camera image captured by the camera of the display device is transmitted to the game device. The game device can therefore use the camera image in a game image, or use the result of image recognition performed on the camera image as a game input. Furthermore, according to configuration (10), since the camera image is transmitted after compression, it can be transmitted wirelessly at high speed.
(11)
The display device may include a plurality of front-surface operation buttons and a direction input unit capable of indicating directions. The front-surface operation buttons are provided on both sides of the screen on the front surface on which the display unit and the touch panel are provided. The direction input unit is likewise provided on the front surface on either side of the screen. In that case the operation data further includes data representing operations on the front-surface operation buttons and the direction input unit.

According to configuration (11), operation buttons and a direction input unit are provided on both sides of the screen of the display device. The player can therefore operate them while holding the display device (typically with the thumbs of both hands), so the operation buttons and the direction input unit can be operated easily even while the display device is being moved.

(12)
The display device may further include a plurality of rear-surface operation buttons and a plurality of side-surface operation buttons. The rear-surface operation buttons are provided on the rear surface, that is, the surface opposite the front surface on which the display unit and the touch panel are provided. The side-surface operation buttons are provided on the side surface between the front and rear surfaces. In that case the operation data further includes data representing operations on the rear-surface and side-surface operation buttons.

According to configuration (12), operation buttons are provided on the rear and side surfaces of the display device. The player can operate these buttons while holding the display device (typically with the index or middle fingers), so they too can be operated easily even while the display device is being moved.

(13)
The display device may further include a magnetic sensor. In that case the operation data further includes data of the detection result of the magnetic sensor.

According to configuration (13), the display device includes a magnetic sensor, and the output of the magnetic sensor can be used for the game processing on the game device. The player can therefore perform game operations by moving the display device. Moreover, since the game device can determine the absolute attitude of the display device in real space from the output of the magnetic sensor, the attitude of the display device can be calculated accurately by, for example, using the output of the inertial sensor together with the output of the magnetic sensor.

(14)
The inertial sensor may be any inertial sensor, for example a three-axis acceleration sensor and a three-axis gyro sensor.

According to configuration (14), by using the two kinds of sensors, an acceleration sensor and a gyro sensor, as the inertial sensors, the motion and attitude of the portable display device can be calculated accurately.
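The patent does not prescribe how the two inertial sensors of configuration (14) are to be combined. One common way to realize the attitude calculation the text describes is a complementary filter: the gyro rates are integrated for responsiveness, and the accelerometer's gravity reading corrects the long-term drift. The sketch below illustrates that idea only; the axis conventions and the blending weight are assumptions.

```c
/* Illustrative complementary filter over a 3-axis gyro and a 3-axis
 * accelerometer. One possible realization, not the patent's algorithm. */
#include <math.h>

typedef struct { float pitch, roll; } Attitude; /* radians */

void update_attitude(Attitude *a,
                     float gx, float gy,            /* gyro rates, rad/s (convention-dependent) */
                     float ax, float ay, float az,  /* accelerometer output, in g units */
                     float dt)                      /* time step, seconds */
{
    /* Tilt implied by gravity alone; valid when the device is near rest. */
    float acc_pitch = atan2f(ax, sqrtf(ay * ay + az * az));
    float acc_roll  = atan2f(ay, az);

    /* Integrate angular velocity, then blend toward the gravity estimate. */
    const float k = 0.98f;                          /* gyro weight, tunable */
    a->pitch = k * (a->pitch + gx * dt) + (1.0f - k) * acc_pitch;
    a->roll  = k * (a->roll  + gy * dt) + (1.0f - k) * acc_roll;
}
```

A magnetic sensor, as in configuration (13), could correct the remaining yaw drift in the same blended fashion, since it supplies an absolute heading reference.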
Another example of the present invention can also be implemented in the form of a game processing method performed in the game systems of (1) to (5) above.

(Effects of the Invention)
According to the present invention, by providing the portable display device with an infrared light emitting unit and displaying the game image on that display device, the operating device can be used while pointed in any direction, improving the freedom of operation of the operating device.

[Embodiment]
[1. Overall configuration of the game system]
A game system 1 according to an embodiment of the present invention will be described below with reference to the drawings. Fig. 1 is an external view of the game system 1. In Fig. 1, the game system 1 includes a stationary display device typified by a television receiver or the like (hereinafter "television") 2, a stationary game device 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. In the game system 1, the game device 3 executes game processing based on game operations performed with the controller 5, and the game image obtained by the game processing is displayed on the television 2 and/or the terminal device 7.

The optical disc 4, an example of an exchangeable information storage medium, is detachably inserted into the game device 3. The optical disc 4 stores an information processing program (typically a game program) to be executed on the game device 3. An insertion slot for the optical disc 4 is provided on the front of the game device 3. The game device 3 executes the game processing by reading and executing the information processing program stored on the optical disc 4 inserted into the slot.

The television 2 is connected to the game device 3 via a cable. The television 2 displays the game image obtained by the game processing executed on the game device 3. The television 2 has a speaker 2a (Fig. 2), and the speaker 2a outputs the game sound obtained as a result of the game processing. In other embodiments the game device 3 and the stationary display device may be formed as one body, and the communication between the game device 3 and the television 2 may be wireless.

The marker device 6 is installed around the screen of the television 2 (above the screen in Fig. 1). As described in detail later, the user (player) can perform game operations by moving the controller 5, and the marker device 6 is used by the game device 3 to calculate the motion, position, attitude, and so on of the controller 5. The marker device 6 has two markers, 6R and 6L, at its two ends. The marker 6R (and likewise the marker 6L) is, specifically, one or more infrared LEDs (Light Emitting Diodes) that output infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, and the game device 3 can control the lighting of each infrared LED of the marker device 6. The marker device 6 is portable, and the user can install it at any position. Fig. 1 shows the marker device 6 installed on top of the television 2, but its position and orientation are arbitrary.

The controller 5 provides the game device 3 with operation data representing the operations performed on the controller itself. The controller 5 and the game device 3 communicate wirelessly; in the present embodiment, the wireless communication between them uses, for example, Bluetooth (registered trademark) technology. In other embodiments the controller 5 and the game device 3 may be connected by wire. In the present embodiment the game system 1 includes a single controller 5, but the game device 3 can communicate with a plurality of controllers, and by using a predetermined number of controllers at the same time, a plurality of users can play. The detailed configuration of the controller 5 is described later.

The terminal device 7 is of a size that the user can hold; the user can move the terminal device 7 while holding it in hand, or place it at any position for use.
As described in detail later, the terminal device 7 includes an LCD (Liquid Crystal Display) 51 as display means, and input means (a touch panel 52, a gyro sensor 64, and the like, described later). The terminal device 7 and the game device 3 can communicate wirelessly (or by wire). The terminal device 7 receives from the game device 3 the data of images generated in the game device 3 (for example, game images) and displays the images on the LCD 51. Although an LCD is used as the display device in the present embodiment, the terminal device 7 may have any other display device, for example a display device using EL (Electro Luminescence). In addition, the terminal device 7 transmits to the game device 3 operation data representing the operations performed on the terminal itself.

[2. Internal configuration of the game device 3]
Next, the internal configuration of the game device 3 will be described with reference to Fig. 2. Fig. 2 is a block diagram showing the internal configuration of the game device 3. The game device 3
includes a CPU (Central Processing Unit) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disc drive 14, an AV-IC 15, and so on.

The CPU 10 executes the game processing by executing the game program stored on the optical disc 4, and functions as a game processor. The CPU 10 is connected to the system LSI 11. Besides the CPU 10, the external main memory 12, the ROM/RTC 13, the disc drive 14, and the AV-IC 15 are also connected to the system LSI 11. The system LSI 11 performs processing such as controlling data transfer between the components connected to it, generating the images to be displayed, and acquiring data from external devices. The internal configuration of the system LSI 11 is described later. The volatile external main memory 12 stores programs such as the game program read from the optical disc 4 or the game program read from the flash memory 17, and various data, and is used as a work area and buffer for the CPU 10. The ROM/RTC 13 has a ROM (a so-called boot ROM) holding the start-up program of the game device 3, and a clock circuit (RTC: Real Time Clock) that keeps time. The disc drive 14 reads program data, texture data, and the like from the optical disc 4, and writes the read data into the internal main memory 11e, described later, or the external main memory 12.

The system LSI 11 is provided with an input/output processor (I/O processor) 11a, a GPU (Graphics Processor Unit) 11b, a DSP (Digital Signal Processor) 11c, a VRAM (Video RAM) 11d, and an internal main memory 11e. Although not shown, these components 11a to 11e are connected to one another by an internal bus.

The GPU 11b forms part of the drawing means and generates images according to graphics commands from the CPU 10. The VRAM 11d stores the data (polygon data, texture data, and the like) that the GPU 11b needs to execute the graphics commands; when generating an image, the GPU 11b creates the image data using the data stored in the VRAM 11d. In the present embodiment the game device 3 generates both the game image displayed on the television 2 and the game image displayed on the terminal device 7. Hereinafter, the game image displayed on the television 2 is called the "television game image", and the game image displayed on the terminal device 7 is called the "terminal game image".

The DSP 11c functions as an audio processor and generates sound data using the sound data and sound waveform (timbre) data stored in the internal main memory 11e or the external main memory 12. In the present embodiment, as with the game images, both the game sound output from the television 2 and the game sound output from the terminal device 7 are generated. Hereinafter, the game sound output from the speaker of the television 2 is sometimes called the "television game sound", and the game sound output from the speaker of the terminal device 7 the "terminal game sound".

Of the images and sounds generated in the game device 3, the data of the images and sounds to be output from the television 2 is read by the AV-IC 15.
The AV-IC 15 outputs the read image data to the television 2 via an AV connector 16, and outputs the read sound data to the speaker 2a built into the television 2. The image is thereby displayed on the television 2, and the sound is output from the speaker 2a.

Of the images and sounds generated in the game device 3, the data of the images and sounds to be output to the terminal device 7 is transmitted to the terminal device 7 by the input/output processor 11a and the like. The transmission of data to the terminal device 7 by the input/output processor 11a and the like is described in detail later.

The input/output processor 11a receives and transmits data between the components connected to it, and downloads data from external devices. The input/output processor 11a is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an expansion connector 20, a memory card connector 21, and a codec LSI 27. An antenna 22 is connected to the network communication module 18, an antenna 23 is connected to the controller communication module 19, and the codec LSI 27 is connected to a terminal communication module 28, to which an antenna 29 is connected.

The game device 3 can connect to a network such as the Internet and communicate with external information processing devices (for example, other game devices or various servers). That is, the input/output processor 11a can connect to a network such as the Internet via the network communication module 18 and the antenna 22, and communicate with external information processing devices connected to the network. The input/output processor 11a periodically accesses the flash memory 17 and checks whether there is data to be transmitted to the network; if there is, it transmits that data to the network via the network communication module 18 and the antenna 22. The input/output processor 11a also receives data transmitted from external information processing devices, or data downloaded from a download server, via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. By executing the game program, the CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. Besides the data exchanged between the game device 3 and external information processing devices, the flash memory 17 may also store save data of games played on the game device 3 (result data or in-progress data of games).

The game device 3 can also receive operation data from the controller 5. That is, the input/output processor 11a receives the operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19, and stores (temporarily) it in a buffer area of the internal main memory 11e or the external main memory 12.
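A minimal sketch of this buffering step is shown below. The packet layout, the buffer depth, and all names are invented for illustration; the text only states that received operation data is held temporarily in a memory buffer until the game processing consumes it.

```c
/* Hypothetical ring buffer for operation data arriving from the controller.
 * Field layout and depth are assumptions, not the patent's format. */
#include <stdint.h>

typedef struct {
    uint16_t keys;               /* operation-button bits */
    int16_t  accel[3];           /* acceleration sensor output */
    int16_t  gyro[3];            /* gyro sensor output */
    uint16_t marker_x, marker_y; /* coordinates from the imaging unit */
} OperationData;

static OperationData op_buffer[8];
static unsigned      op_head;

void on_operation_data_received(const OperationData *d)
{
    op_buffer[op_head++ % 8] = *d;       /* temporary storage, newest wins */
}

/* Valid once at least one packet has arrived. */
const OperationData *latest_operation_data(void)
{
    return &op_buffer[(op_head + 7) % 8];
}
```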
The game device 3 can transmit game images (terminal game images) to the terminal device 7. In that case, the input/output processor 11a sends the data of the game image generated by the GPU 11b to the codec LSI 27, and the codec LSI 27 performs a predetermined compression process on the image data from the input/output processor 11a. The terminal communication module 28 communicates wirelessly with the terminal device 7; the image data compressed by the codec LSI 27 is therefore transmitted to the terminal device 7 by the terminal communication module 28 via the antenna 29. In the present embodiment the image data transmitted from the game device 3 to the terminal device 7 is used for the game, and a delay in the displayed image would harm the operability of the game; the transmission of image data from the game device 3 to the terminal device 7 is therefore preferably performed with as little delay as possible. For that reason, in the present embodiment the codec LSI 27 compresses the image data using, for example, the highly efficient compression technique of the H.264 standard. Other compression techniques may be used, and when the communication speed is sufficient, the image data may be transmitted without compression. The terminal communication module 28 is, for example, a Wi-Fi certified communication module, and may perform high-speed wireless communication with the terminal device 7 using, for example, the MIMO (Multiple Input Multiple Output) technique adopted by the IEEE 802.11n standard, or may use other communication schemes.

Besides the image data, the game device 3 also transmits sound data to the terminal device 7. That is, the input/output processor 11a outputs the sound data generated by the DSP 11c to the terminal communication module 28 via the codec LSI 27, and the codec LSI 27 performs a compression process on the sound data as it does on the image data. Any compression scheme may be used for the sound data, but a scheme with a high compression rate and little degradation of the sound is preferable. In other embodiments the sound data may be transmitted without compression. The terminal communication module 28 transmits the compressed image data and sound data to the terminal device 7 via the antenna 29.

In addition to the image data and sound data above, the game device 3 transmits various control data to the terminal device 7 as necessary. The control data is data representing instructions to control the components of the terminal device 7, for example an instruction to control the lighting of the marker section (the marker section 55 shown in Fig. 10) or an instruction to control the imaging by the camera (the camera 56 shown in Fig. 10). The input/output processor 11a transmits the control data to the terminal device 7 according to instructions from the CPU 10. In the present embodiment the codec LSI 27 does not compress the control data, but in other embodiments it may. The data transmitted from the game device 3 to the terminal device 7 may be encoded as necessary, or left unencoded.
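The text does not give the on-air format of this control data; the following sketch merely illustrates one possible shape for such messages, with invented names, corresponding to the two example instructions mentioned above.

```c
/* Illustrative layout for control data sent from the game device to the
 * terminal device. The actual message format is not specified in the text. */
#include <stdint.h>

enum ControlCommand {
    CTRL_MARKER_SET, /* control lighting of the marker section (55) */
    CTRL_CAMERA_SET  /* control imaging by the camera (56) */
};

typedef struct {
    uint8_t command;  /* one of ControlCommand */
    uint8_t argument; /* e.g. 1 = on, 0 = off */
} ControlData;

/* Sent without compression, per the text, over the terminal
 * communication module; the call below is hypothetical. */
extern void terminal_send_control(const ControlData *c);

void set_marker_lighting(int on)
{
    ControlData c = { CTRL_MARKER_SET, (uint8_t)(on != 0) };
    terminal_send_control(&c);
}
```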
The game device 3 can also receive various data from the terminal device 7. As described in detail later, in the present embodiment the terminal device 7 transmits operation data, image data, and sound data. The various data transmitted from the terminal device 7 are received by the terminal communication module 28 via the antenna 29. The image data and sound data from the terminal device 7 are compressed in the same way as the image data and sound data sent from the game device 3 to the terminal device 7. These image data and sound data are therefore passed from the terminal communication module 28 to the codec LSI 27, decompressed by the codec LSI 27, and output to the input/output processor 11a. The operation data from the terminal device 7, on the other hand, is small in volume compared with images or sounds, so it need not be compressed; it may also be encoded as necessary, or not. The operation data, once received by the terminal communication module 28, is output to the input/output processor 11a via the codec LSI 27. The input/output processor 11a stores (temporarily) the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.

The game device 3 can also be connected to other devices and to external storage media. That is, the expansion connector 20 and the memory card connector 21 are connected to the input/output processor 11a. The expansion connector 20 is a connector for an interface such as USB or SCSI. Media such as external storage media, peripheral devices such as other controllers, or a wired communication connector can be connected to the expansion connector 20, allowing communication with the network in place of the network communication module 18. The memory card connector 21 is a connector for connecting external storage media such as memory cards. For example, the input/output processor 11a can access an external storage medium via the expansion connector 20 or the memory card connector 21 to save data to it or read data from it.

The game device 3 is provided with a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned on, power is supplied from an external power source to the components of the game device 3 through an AC adapter, not shown. When the reset button 25 is pressed, the system LSI 11 restarts the start-up program of the game device 3. The eject button 26 is connected to the disc drive 14; when the eject button 26 is pressed, the optical disc 4 is ejected from the disc drive 14.

In other embodiments, some of the components of the game device 3 may be constituted as an expansion device separate from the game device 3. In that case, the expansion device may be connected to the game device 3 via, for example, the expansion connector 20. Specifically, the expansion device may include components such as the codec LSI 27, the terminal communication module 28, and the antenna 29, and be attachable to and detachable from the expansion connector 20. By connecting such an expansion device to a game device that lacks these components, that game device can be made capable of communicating with the terminal device 7.

[3. Configuration of the controller 5]
Next, the controller 5 will be described with reference to Figs. 3 to 7. Fig. 3 is a perspective view showing the external configuration of the controller 5, seen from the upper rear side. Fig. 4 is a perspective view of the controller 5 seen from the lower front side.
In Figs. 3 and 4, the controller 5 has a housing 31 formed, for example, by plastic molding. The housing 31 has a roughly rectangular parallelepiped shape elongated in its front-rear direction (the Z-axis direction shown in Fig. 3), and its overall size is one that can be held in one hand by an adult or a child. The user can perform game operations by pressing the keys provided on the controller 5 and by moving the controller 5 itself to change its position and attitude (tilt).

The housing 31 is provided with a plurality of operation keys. As shown in Fig. 3, on the upper surface of the housing 31 there are provided a cross key 32a, a 1st key 32b, a 2nd key 32c, an A key 32d, a minus key 32e, a home key 32f, a plus key 32g, and a power key 32h. In this specification, the upper surface of the housing 31 on which these keys 32a to 32h are provided may be called the "key surface". On the other hand, as shown in Fig. 4, a recess is formed on the lower surface of the housing 31, and a B key 32i is provided on the rear-side inclined surface of the recess. Each of the operation keys 32a to 32i is appropriately assigned a function in accordance with the information processing program executed by the game device 3. The power key 32h is used to remotely turn the power of the main body of the game device 3 on and off. The upper surfaces of the home key 32f and the power key 32h are recessed below the upper surface of the housing 31, which prevents the user from pressing the home key 32f or the power key 32h by mistake.

A connector 33 is provided on the rear surface of the housing 31. The connector 33 is used for connecting other devices (for example, other sensor units or controllers) to the controller 5. On both sides of the connector 33 on the rear surface of the housing 31, locking holes 33a are provided for preventing such other devices from coming off easily.

A plurality of (four in Fig. 3) LEDs 34a to 34d are provided in the rear part of the upper surface of the housing 31. Here, a controller type (number) is assigned to the controller 5 in order to distinguish it from other controllers 5. The LEDs 34a to 34d are used, for example, for notifying the user of the controller type currently set for the controller 5, or for notifying the user of the remaining battery level of the controller 5. Specifically, when a game operation is performed using the controller 5, one of the plurality of LEDs 34a to 34d is lit in accordance with the controller type.

The controller 5 also includes an imaging information computation section 35 (Fig. 6). As shown in Fig. 4, a light incident surface 35a of the imaging information computation section 35 is provided on the front surface of the housing 31. The light incident surface 35a is made of a material that passes at least the infrared light coming from the markers 6R and 6L.

Between the 1st key 32b and the home key 32f on the upper surface of the housing 31, sound holes 31a are formed for emitting the sound from a speaker 47 (Fig. 5) built into the controller 5 to the outside.

Next, the internal structure of the controller 5 will be described with reference to Figs. 5 and 6.
Figs. 5 and 6 show the internal structure of the controller 5: Fig. 5 is a perspective view showing a state in which the upper housing (a part of the housing 31) of the controller 5 has been removed, and Fig. 6 is a perspective view showing a state in which the lower housing (a part of the housing 31) has been removed. The perspective view of Fig. 6 shows the substrate 30 of Fig. 5 as seen from its back side.

In Fig. 5, a substrate 30 is fixedly arranged inside the housing 31. On the upper main surface of the substrate 30 are provided the operation keys 32a to 32h, the LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, the speaker 47, and the like. These are connected to a microcomputer 42 (see Fig. 6) by wiring (not shown) formed on the substrate 30 and elsewhere. In the present embodiment, the acceleration sensor 37 is arranged at a position offset from the center of the controller 5 in the X-axis direction. This makes it easier to calculate the motion of the controller 5 when the controller 5 is rotated about the Z axis. The acceleration sensor 37 is also arranged forward of the center of the controller 5 in the longitudinal (Z-axis) direction. Further, a wireless module 44 (Fig. 6) and the antenna 45 give the controller 5 the function of a wireless controller.

In Fig. 6, the imaging information computation section 35 is provided at the front edge of the lower main surface of the substrate 30. The imaging information computation section 35 comprises, in order from the front of the controller 5, an infrared filter 38, a lens 39, an imaging element 40, and an image processing circuit 41. These members 38 to 41 are each attached to the lower main surface of the substrate 30. The microcomputer 42 and a vibrator 46 are also provided on the lower main surface of the substrate 30. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 by wiring formed on the substrate 30 and elsewhere. The vibrator 46 is actuated by instructions from the microcomputer 42, causing the controller 5 to vibrate. The vibration is thereby transmitted to the hand of the user holding the controller 5, realizing a so-called vibration-enabled game. In the present embodiment, the vibrator 46 is arranged somewhat toward the front of the housing 31; that is, by placing the vibrator 46 closer to an end than to the center of the controller 5, the vibration of the vibrator 46 can shake the whole controller 5 more strongly. The connector 33 is mounted at the rear edge of the lower main surface of the substrate 30. In addition to what is shown in Figs. 5 and 6, the controller 5 includes a quartz oscillator that generates the basic clock of the microcomputer 42, an amplifier that outputs an audio signal to the speaker 47, and the like.

The shape of the controller 5 shown in Figs. 3 to 6, and the shapes, numbers, and arrangements of the operation keys, acceleration sensors, and vibrators, are merely examples; other shapes, numbers, and arrangements are possible. In the present embodiment, the imaging direction of the imaging means is the Z-axis positive direction, but the imaging direction may be any direction. That is, the position of the imaging information computation section 35 in the controller 5 (the light incident surface 35a of the imaging information computation section 35) need not be on the front surface of the housing 31, and may be provided on another surface as long as it can take in light from outside the housing 31.
Fig. 7 is a block diagram showing the configuration of the controller 5. The controller 5 includes an operation section 32 (the operation keys 32a to 32i), the imaging information computation section 35, a communication section 36, the acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits data representing the operations performed on it to the game device 3 as operation data. In the following, the operation data transmitted by the controller 5 may be called "controller operation data", and the operation data transmitted by the terminal device 7 may be called "terminal operation data".

The operation section 32 includes the operation keys 32a to 32i described above, and outputs to the microcomputer 42 of the communication section 36 operation key data representing the input state of each of the operation keys 32a to 32i (whether or not each key is pressed).

The imaging information computation section 35 is a system for analyzing the image data captured by the imaging means, discriminating regions of high luminance within it, and calculating the position of the center of gravity, the size, and the like of those regions. The imaging information computation section 35 has a sampling period of, for example, at most about 200 frames per second, so it can follow even comparatively fast motion of the controller 5.

The imaging information computation section 35 includes the infrared filter 38, the lens 39, the imaging element 40, and the image processing circuit 41. The infrared filter 38 passes only the infrared component of the light incident from the front of the controller 5. The lens 39 condenses the infrared light that has passed through the infrared filter 38 and makes it enter the imaging element 40. The imaging element 40 is a solid-state imaging element such as a CMOS sensor or a CCD sensor; it receives the infrared light condensed by the lens 39 and outputs an image signal. Here, the marker section 55 of the terminal device 7 and the marker device 6, which are the imaging targets, are each composed of markers that output infrared light. Because the infrared filter 38 is provided, the imaging element 40 generates image data only from the infrared light that has passed through the infrared filter 38, so the imaging targets (the marker section 55 and/or the marker device 6) can be imaged more accurately. Hereinafter, an image captured by the imaging element 40 is called a captured image. The image data generated by the imaging element 40 is processed by the image processing circuit 41, which calculates the positions of the imaging targets within the captured image. The image processing circuit 41 outputs coordinates representing the calculated positions to the microcomputer 42 of the communication section 36, and the coordinate data is transmitted to the game device 3 by the microcomputer 42 as operation data. These coordinates are called "marker coordinates". Since the marker coordinates change in accordance with the orientation (tilt angle) and position of the controller 5 itself, the game device 3 can use the marker coordinates to calculate the orientation and position of the controller 5.

In other embodiments, the controller 5 may be configured without the image processing circuit 41, and the captured image itself may be transmitted from the controller 5 to the game device 3. In that case, the game device 3 may have a circuit or a program with a function equivalent to that of the image processing circuit 41 and calculate the marker coordinates itself.
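As an illustration of the kind of processing attributed to the image processing circuit 41, the sketch below thresholds an infrared image, groups adjacent bright pixels, and reports each group's centroid as one marker coordinate. The threshold value, the flat image format, and the function name are assumptions made for this example, not details taken from the embodiment.

    def marker_coordinates(image, width, height, threshold=200):
        """image: flat list of 8-bit IR intensities, row-major.
        Returns the centroids (x, y) of connected bright regions."""
        visited = [False] * (width * height)
        coords = []
        for start in range(width * height):
            if visited[start] or image[start] < threshold:
                continue
            # Flood-fill one high-luminance region, accumulating its pixels.
            stack, xs, ys = [start], [], []
            visited[start] = True
            while stack:
                i = stack.pop()
                x, y = i % width, i // width
                xs.append(x); ys.append(y)
                for nx, ny in ((x-1, y), (x+1, y), (x, y-1), (x, y+1)):
                    if 0 <= nx < width and 0 <= ny < height:
                        j = ny * width + nx
                        if not visited[j] and image[j] >= threshold:
                            visited[j] = True
                            stack.append(j)
            # The centroid of the region is one marker coordinate.
            coords.append((sum(xs) / len(xs), sum(ys) / len(ys)))
        return coords

    # Two bright blobs in a tiny 8x4 test image stand in for markers 6R and 6L.
    img = [0] * 32
    img[1] = img[2] = 255     # blob in row 0, near the left edge
    img[13] = img[14] = 255   # blob in row 1, right of center
    print(marker_coordinates(img, 8, 4))   # [(1.5, 0.0), (5.5, 1.0)]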
The acceleration sensor 37 detects the acceleration of the controller 5 (including gravitational acceleration); that is, it detects the force (including gravity) applied to the controller 5. The acceleration sensor 37 detects, among the accelerations applied to its detection section, the value of the acceleration in the straight-line direction along each sensing axis (linear acceleration). For example, in the case of a multi-axis acceleration sensor with two or more axes, the acceleration applied to the detection section is detected as the acceleration component along each axis. The acceleration sensor 37 is, for example, a capacitance-type MEMS (Micro Electro Mechanical System) acceleration sensor, but acceleration sensors of other types may also be used.

In the present embodiment, the acceleration sensor 37 detects linear acceleration along each of three axes of the controller 5: the up-down direction (the Y-axis direction shown in Fig. 3), the left-right direction (the X-axis direction shown in Fig. 3), and the front-rear direction (the Z-axis direction shown in Fig. 3). Since the acceleration sensor 37 detects acceleration in the straight-line direction along each axis, its output represents the value of the linear acceleration of each of the three axes. That is, the detected acceleration is expressed as a three-dimensional vector in an XYZ coordinate system (controller coordinate system) defined with respect to the controller 5.

Data representing the acceleration detected by the acceleration sensor 37 (acceleration data) is output to the communication section 36. Since the acceleration detected by the acceleration sensor 37 changes in accordance with the orientation (tilt angle) and motion of the controller 5 itself, the game device 3 can use the acquired acceleration data to calculate the orientation and motion of the controller 5. In the present embodiment, the game device 3 calculates the attitude, the tilt angle, and the like of the controller 5 based on the acquired acceleration data.

As a person skilled in the art will readily understand from the description of this specification, a processor of the game device 3 (for example the CPU 10) or a processor of the controller 5 (for example the microcomputer 42) may perform processing based on the acceleration signals output from the acceleration sensor 37 (and likewise from the acceleration sensor 63 described later), whereby further information about the controller 5 can be estimated or calculated (determined). For example, suppose that processing on the computer side is performed on the premise that the controller 5 carrying the acceleration sensor 37 is in a static state, that is, on the premise that the acceleration detected by the acceleration sensor consists only of gravitational acceleration. If the controller 5 is actually in a substantially static state, it is then possible to know from the detected acceleration whether, and how far, the attitude of the controller 5 is tilted with respect to the direction of gravity. Specifically, taking as a reference the state in which the detection axis of the acceleration sensor points vertically downward, it can be determined whether the controller 5 is tilted relative to the reference from whether or not 1G (gravitational acceleration) is applied along that axis, and to what extent it is tilted from the magnitude of the detected acceleration.
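The static case just described can be pictured with a short sketch: when the only measured force is gravity, the tilt of the device follows from the direction of the measured acceleration vector. The axis convention, the tolerance, and the function name below are assumptions of this example.

    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def tilt_from_gravity(ax, ay, az, tol=0.05):
        """Return the angle (degrees) between the sensor's Y axis and
        'up', assuming the device is static so (ax, ay, az) ~ gravity."""
        norm = math.sqrt(ax*ax + ay*ay + az*az)
        if abs(norm - G) > tol * G:
            # Magnitude far from 1G: the device is accelerating, so the
            # static-attitude assumption does not hold for this sample.
            return None
        return math.degrees(math.acos(max(-1.0, min(1.0, ay / norm))))

    print(tilt_from_gravity(0.0, 9.81, 0.0))   # 0.0  -> Y axis points up
    print(tilt_from_gravity(9.81, 0.0, 0.0))   # 90.0 -> tilted onto its side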
When a multi-axis acceleration sensor 37 is used, the degree to which the controller 5 is tilted with respect to the direction of gravity can be known in more detail by further processing the acceleration signals of the respective axes. In this case, the processor may calculate the tilt angle of the controller 5 based on the output from the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle. Thus, by using the acceleration sensor 37 in combination with a processor, the tilt angle or attitude of the controller 5 can be determined.

On the other hand, when the controller 5 is assumed to be in a dynamic state (a state in which the controller 5 is being moved), the acceleration sensor 37 detects the acceleration due to the motion of the controller 5 in addition to gravitational acceleration, so the direction of motion of the controller 5 can be known by removing the gravitational component from the detected acceleration through predetermined processing. Even when the controller 5 is assumed to be in a dynamic state, the tilt of the controller 5 with respect to the direction of gravity can be known by removing the component of acceleration due to the sensor's motion from the detected acceleration through predetermined processing.

In other embodiments, the acceleration sensor 37 may include an embedded processing device, or another type of dedicated processing device, for applying predetermined processing to the acceleration signal detected by the built-in acceleration detection means before it is output to the microcomputer 42. The embedded or dedicated processing device may, for example, convert the acceleration signal into a tilt angle (or another preferable parameter) when the acceleration sensor 37 is used to detect static acceleration (for example, gravitational acceleration).

The gyro sensor 48 detects angular velocities about three axes (in the present embodiment, the XYZ axes). In this specification, the rotation direction about the X axis is called the pitch direction, the rotation direction about the Y axis the yaw direction, and the rotation direction about the Z axis the roll direction. The gyro sensor 48 only needs to be able to detect the angular velocities about three axes; the number and combination of gyro sensors used may be chosen freely. For example, the gyro sensor 48 may be a 3-axis gyro sensor, or angular velocities about three axes may be detected by combining a 2-axis gyro sensor with a 1-axis gyro sensor. Data representing the angular velocities detected by the gyro sensor 48 is output to the communication section 36. The gyro sensor 48 may also be one that detects angular velocity about only one or two axes.
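Because tilt computed from acceleration is only trustworthy when the device is nearly static, while integrated angular velocity drifts over time, implementations commonly blend the two signals. The following one-axis complementary filter is a technique assumed for this example; it is not a method prescribed by the embodiment, and the blend factor and sampling rate are illustrative.

    def fuse_tilt(prev_angle, gyro_rate, accel_angle, dt, alpha=0.98):
        """prev_angle: last fused tilt estimate (degrees)
        gyro_rate:   angular velocity about the same axis (deg/s)
        accel_angle: tilt computed from the gravity vector (degrees)
        dt:          sampling interval (s)"""
        # Integrate the gyro for responsiveness, then pull the estimate
        # gently toward the accelerometer angle to cancel drift.
        return alpha * (prev_angle + gyro_rate * dt) + (1 - alpha) * accel_angle

    # A stationary device tilted 10 degrees: starting from a wrong estimate
    # of 0, the fused angle converges toward the accelerometer reading.
    angle = 0.0
    for _ in range(200):   # e.g. one second at a 200 Hz sampling rate
        angle = fuse_tilt(angle, gyro_rate=0.0, accel_angle=10.0, dt=0.005)
    print(round(angle, 2))   # ~9.82, approaching 10.0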
The communication section 36 includes the microcomputer 42, a memory 43, the wireless module 44, and the antenna 45. The microcomputer 42 controls the wireless module 44 so as to wirelessly transmit the data acquired by the microcomputer 42 to the game device 3, using the memory 43 as a storage area during processing. The data output to the microcomputer 42 from the operation section 32, the imaging information computation section 35, the acceleration sensor 37, and the gyro sensor 48 is temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when a transmission timing toward the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44. The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data using, for example, Bluetooth (registered trademark) technology, and radiates the resulting weak radio signal from the antenna 45. In other words, the operation data is modulated by the wireless module 44 into a weak radio signal and transmitted from the controller 5. The weak radio signal is received by the controller communication module 19 on the game device 3 side, and the game device 3 can acquire the operation data by demodulating and decoding the received weak radio signal. The CPU 10 of the game device 3 then performs game processing using the operation data acquired from the controller 5. Wireless transmission from the communication section 36 to the controller communication module 19 is performed repeatedly at predetermined intervals; since game processing is generally performed in units of 1/60 second (one frame time), transmission is preferably performed at intervals no longer than this. The communication section 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 at a rate of, for example, once every 1/200 second.
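The operation data described here is small and is sent at a fixed, fast cadence. Below is a sketch of how one such report might be serialized for the radio link; the field order and byte widths are assumptions of this example, and only the kinds of fields and the 1/200-second rate come from the description above.

    import struct

    # Hypothetical wire layout for one controller report:
    # key bitmask, 3-axis acceleration, 3-axis angular velocity,
    # and one (x, y) marker coordinate.
    REPORT = struct.Struct("!H3f3f2h")

    def pack_report(keys, accel, gyro, marker):
        return REPORT.pack(keys, *accel, *gyro, *marker)

    def unpack_report(buf):
        f = REPORT.unpack(buf)
        return {"keys": f[0], "accel": f[1:4], "gyro": f[4:7], "marker": f[7:9]}

    pkt = pack_report(0b10, (0.0, 9.81, 0.0), (0.0, 0.0, 0.0), (512, 384))
    print(len(pkt), unpack_report(pkt)["marker"])   # one report per 1/200 s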
The game device 3 uses the operation data acquired from the controller 5 as a game input to execute the game processing. Therefore, by using the controller 5 described above, the user can perform, in addition to the conventional general game operation of pressing operation keys, game operations of moving the controller 5 itself, such as tilting the controller 5 to an arbitrary attitude. For example, the controller 5 can be used as a pointer for indicating an arbitrary position on the screen, or as a means for moving an image or the like displayed on the screen.
[4. Configuration of Terminal Device 7]

Next, the configuration of the terminal device 7 will be described with reference to Figs. 8 to 10. Fig. 8 shows the external appearance of the terminal device 7: (a) is a front view of the terminal device 7, (b) is a top view, (c) is a side view, and (d) is a bottom view. Fig. 9 shows a user holding the terminal device 7.

As shown in Fig. 8, the terminal device 7 includes a housing 50 that is roughly plate-shaped and elongated horizontally. The housing 50 is of a size that the user can hold, so the user can hold the terminal device 7 in the hands and move it, or change its placement.

The terminal device 7 has an LCD 51 on the surface of the housing 50. The LCD 51 is provided near the center of the surface of the housing 50. Therefore, as shown in Fig. 9, the user can hold the housing 50 at both sides of the LCD 51 and move the terminal device while looking at the screen of the LCD 51. Fig. 9 shows an example in which the user holds the terminal device 7 horizontally (in a horizontally long orientation) by gripping the housing 50 on the left and right of the LCD 51, but the terminal device 7 can also be held vertically (in a vertically long orientation).

As shown in Fig. 8(a), the terminal device 7 has a touch panel 52 on the screen of the LCD 51 as an operation means. In the present embodiment, the touch panel 52 is a resistive touch panel; however, the touch panel is not limited to the resistive type, and a touch panel of any type, such as a capacitive type, may be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In the present embodiment, a touch panel having the same resolution (detection accuracy) as the resolution of the LCD 51 is used as the touch panel 52, but the resolution of the touch panel 52 need not match the resolution of the LCD 51. Input on the touch panel 52 is usually performed with a stylus, but input is not limited to a stylus; the user can also operate the touch panel 52 with a finger. The housing 50 may be provided with a storage hole for storing the stylus used to operate the touch panel 52. Since the terminal device 7 has the touch panel 52, the user can operate the touch panel 52 while moving the terminal device 7. That is, the user can give input directly to the screen of the LCD 51 (through the touch panel 52) while moving that screen.
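Since the touch panel's detection resolution need not match the LCD's display resolution, a touch position is typically rescaled before use. A minimal sketch follows; the panel and LCD resolutions used here are made-up values for illustration only.

    def touch_to_screen(raw_x, raw_y, panel_res=(4096, 4096), lcd_res=(854, 480)):
        """Map a raw touch-panel sample to LCD pixel coordinates."""
        sx = raw_x * lcd_res[0] / panel_res[0]
        sy = raw_y * lcd_res[1] / panel_res[1]
        return (sx, sy)

    print(touch_to_screen(2048, 2048))   # panel center -> (427.0, 240.0)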
As shown in Fig. 8, the terminal device 7 includes, as operation means, two analog sticks 53A and 53B and a plurality of operation keys 54A to 54L. The analog sticks 53A and 53B are devices for indicating direction. Each of the analog sticks 53A and 53B is configured so that its stick portion, operated by the user's finger, can be slid or tilted in an arbitrary direction (at an arbitrary angle up, down, left, right, or diagonally) with respect to the surface of the housing 50. The left analog stick 53A and the right analog stick 53B are provided to the left and right of the screen of the LCD 51, respectively, so the user can input a direction with an analog stick using either the left or the right hand. As shown in Fig. 9, the analog sticks 53A and 53B are provided at positions where the user can operate them while holding the left and right parts of the terminal device 7, so the analog sticks 53A and 53B remain easy to operate even when the user holds and moves the terminal device 7.

The operation keys 54A to 54L are operation means for performing predetermined inputs. As described below, the operation keys 54A to 54L are provided at positions where the user can operate them while holding the left and right parts of the terminal device 7 (see Fig. 9), so these operation means, too, remain easy to operate even when the user holds and moves the terminal device 7.

As shown in Fig. 8(a), among the operation keys 54A to 54L, a cross key (direction input key) 54A and keys 54B to 54H are provided on the surface of the housing 50. That is, these keys 54A to 54H are arranged at positions operable with the user's thumbs (see Fig. 9).

The cross key 54A is provided to the left of the LCD 51, below the left analog stick 53A; that is, the cross key 54A is arranged where it can be operated with the user's left hand. The cross key 54A has a cross shape and can indicate the up, down, left, and right directions. The keys 54B to 54D are provided below the LCD 51; these three keys are arranged at positions operable with either hand. The four keys 54E to 54H are provided to the right of the LCD 51, below the right analog stick 53B; that is, they are arranged at positions operable with the user's right hand. Furthermore, the four keys 54E to 54H are arranged in an up, down, left, and right positional relationship (relative to the center of the four keys 54E to 54H), so the terminal device 7 can also use the four keys 54E to 54H as keys for indicating the up, down, left, and right directions to the user.
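An analog stick of the kind described above reports a two-dimensional deflection. The sketch below turns a raw deflection into a direction and a magnitude, with a small dead zone so the stick reads as neutral near its center; the normalized value range and the dead-zone radius are assumptions of this example.

    import math

    def read_stick(dx, dy, dead_zone=0.1):
        """dx, dy: stick deflection, each normalized to [-1, 1].
        Returns (angle_degrees, magnitude), or None inside the dead zone."""
        mag = math.hypot(dx, dy)
        if mag < dead_zone:
            return None                       # treat as centered
        angle = math.degrees(math.atan2(dy, dx))
        return angle, min(mag, 1.0)

    print(read_stick(0.03, 0.02))   # None: within the dead zone
    print(read_stick(0.0, 0.8))     # (90.0, 0.8): pushed straight up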
As shown in Figs. 8(a), 8(b), and 8(c), a first L key 54I and a first R key 54J are provided at the diagonally upper portions (the upper left portion and the upper right portion) of the housing 50. Specifically, the first L key 54I is provided at the left end of the upper side surface of the plate-shaped housing 50, exposed on both the upper and left side surfaces, and the first R key 54J is provided at the right end of the upper side surface of the housing 50, exposed on both the upper and right side surfaces. Thus, the first L key 54I is arranged at a position operable with the user's left index finger, and the first R key 54J at a position operable with the user's right index finger (see Fig. 9).

As shown in Figs. 8(b) and 8(c), a second L key 54K and a second R key 54L are arranged on foot portions 59A and 59B that protrude from the back surface of the plate-shaped housing 50 (that is, the surface opposite the surface on which the LCD 51 is provided). Specifically, the second L key 54K is provided slightly above the left side of the back surface of the housing 50 (the left side as seen from the front surface side), and the second R key 54L slightly above the right side of the back surface (the right side as seen from the front surface side). In other words, the second L key 54K is provided at a position roughly opposite the left analog stick 53A provided on the front surface, and the second R key 54L at a position roughly opposite the right analog stick 53B. Thus, the second L key 54K is arranged at a position operable with the user's left middle finger, and the second R key 54L at a position operable with the user's right middle finger (see Fig. 9). Further, as shown in Fig. 8(c), the second L key 54K and the second R key 54L are provided on surfaces of the foot portions 59A and 59B that face diagonally upward, and have key surfaces facing diagonally upward. Since the middle fingers are expected to move in the up-down direction when the user holds the terminal device 7, facing the key surfaces upward makes it easy for the user to press the second L key 54K and the second R key 54L. Providing the foot portions on the back surface of the housing 50 also makes the housing 50 easier to hold, and providing keys on the foot portions makes the keys easy to operate while the housing 50 is held.

Regarding the terminal device 7 shown in Fig. 8, because the second L key 54K and the second R key 54L are provided on the back surface, the screen (the front surface of the housing 50) may not lie fully horizontal when the terminal device 7 is set down with the screen of the LCD 51 facing upward. Therefore, in other embodiments, three or more foot portions may be formed on the back surface of the housing 50. In that case, the terminal device 7 can be placed on a floor surface with the foot portions in contact with it while the screen of the LCD 51 faces upward, so the terminal device 7 can be set down with the screen horizontal. Detachable foot portions may also be added so that the terminal device 7 can be placed horizontally.

Each of the operation keys 54A to 54L is appropriately assigned a function in accordance with the game program. For example, the cross key 54A and the keys 54E to 54H may be used for direction-indicating operations or selection operations, and the keys 54B to 54E may be used for decision operations or cancel operations.

Although not shown, the terminal device 7 has a power key for turning the power of the terminal device 7 on and off. The terminal device 7 may also have a key for turning the screen display of the LCD 51 on and off, a key for performing connection setting (pairing) with the game device 3, and a key for adjusting the volume of the speakers (the speakers 67 shown in Fig. 10).

As shown in Fig. 8(a), the terminal device 7 has, on the surface of the housing 50, a marker section (the marker section 55 shown in Fig. 10) consisting of a marker 55A and a marker 55B. The marker section 55 is provided on the upper side of the LCD 51. The markers 55A and 55B, like the markers 6R and 6L of the marker device 6, are each composed of one or more infrared LEDs. Like the marker device 6 described above, the marker section 55 is used to let the game device 3 calculate the motion and the like of the controller 5. The game device 3 can control the lighting of each infrared LED of the marker section 55.

The terminal device 7 includes a camera 56 as an imaging means. The camera 56 includes an imaging element of a predetermined resolution (for example, a CMOS image sensor or a CCD image sensor) and a lens. As shown in Fig. 8, in the present embodiment, the camera 56 is provided on the surface of the housing 50.
Therefore, the camera 56 can capture the face of the user holding the terminal device 7; for example, it can capture the user playing a game while looking at the LCD 51.

The terminal device 7 includes a microphone (the microphone 69 shown in Fig. 10) as a sound input means. A microphone hole 60 is provided in the surface of the housing 50. The microphone 69 is provided inside the housing 50 behind the microphone hole 60, and detects sounds around the terminal device 7, such as the user's voice.

The terminal device 7 includes speakers (the speakers 67 shown in Fig. 10) as sound output means. As shown in Fig. 8(d), speaker holes 57 are provided in the lower side surface of the housing 50, and the output sound of the speakers 67 is emitted from these speaker holes 57. In the present embodiment, the terminal device 7 has two speakers, and speaker holes 57 are provided at the positions of the left speaker and the right speaker.

The terminal device 7 also includes an expansion connector 58 for connecting other devices to the terminal device 7. In the present embodiment, as shown in Fig. 8(d), the expansion connector 58 is provided on the lower side surface of the housing 50. Any device may be connected to the expansion connector 58, for example a controller used in a specific game (a gun-shaped controller or the like) or an input device such as a keyboard. If there is no need to connect other devices, the expansion connector 58 need not be provided.

Regarding the terminal device 7 shown in Fig. 8, the shapes of the operation keys and the housing 50 and the number and arrangement of the components are merely examples; other shapes, numbers, and arrangements are possible.

Next, the internal configuration of the terminal device 7 will be described with reference to Fig. 10. Fig. 10 is a block diagram showing the internal configuration of the terminal device 7. As shown in Fig. 10, in addition to the components shown in Fig. 8, the terminal device 7 includes a touch panel controller 61, a magnetic sensor 62, an acceleration sensor 63, a gyro sensor 64, a user interface controller (UI controller) 65, a codec LSI 66, the speakers 67, a sound IC 68, the microphone 69, a wireless module 70, an antenna 71, an infrared communication module 72, a flash memory 73, a power supply IC 74, and a battery 75. These electronic components are mounted on an electronic circuit board and housed in the housing 50.

The UI controller 65 is a circuit for controlling the input and output of data to and from the various input/output sections. The UI controller 65 is connected to the touch panel controller 61, the analog sticks 53 (the analog sticks 53A and 53B), the operation keys 54 (the operation keys 54A to 54L), the marker section 55, the magnetic sensor 62, the acceleration sensor 63, and the gyro sensor 64. The UI controller 65 is also connected to the codec LSI 66 and the expansion connector 58. The power supply IC 74 is connected to the UI controller 65, and power is supplied to each section through the UI controller 65. The built-in battery 75 is connected to the power supply IC 74 and supplies power. A charger 76 or a cable that obtains power from an external power source can be connected to the power supply IC 74 via a connector or the like, and the terminal device 7 can be charged while power is supplied from the external power source using the charger 76 or the cable.
The terminal device 7 can also be charged by attaching the terminal device 7 to a cradle (not shown) that has a charging function.

The touch panel controller 61 is connected to the touch panel 52 and is a circuit that controls the touch panel 52. The touch panel controller 61 generates touch position data in a predetermined form based on the signals from the touch panel 52 and outputs it to the UI controller 65. The touch position data represents the coordinates of the position on the input surface of the touch panel 52 at which input was made. The touch panel controller 61 reads the signals from the touch panel 52 and generates touch position data at a rate of once per predetermined time. Various control instructions for the touch panel 52 are output from the UI controller 65 to the touch panel controller 61.

Each analog stick 53 outputs to the UI controller 65 stick data representing the direction and amount in which the stick portion, operated by the user's finger, was slid (or tilted). The operation keys 54 output to the UI controller 65 operation key data representing the input state of each of the operation keys 54A to 54L (whether or not each key is pressed).

The magnetic sensor 62 detects azimuth by sensing the magnitude and direction of the magnetic field. Azimuth data representing the detected azimuth is output to the UI controller 65, and control instructions for the magnetic sensor 62 are output from the UI controller 65 to the magnetic sensor 62. As the magnetic sensor 62, a sensor using an MI (magnetic impedance) element, a fluxgate sensor, a Hall element, a GMR (giant magnetoresistance) element, a TMR (tunnel magnetoresistance) element, an AMR (anisotropic magnetoresistance) element, or the like may be used; any sensor capable of detecting azimuth may be used. Strictly speaking, in a place where a magnetic field other than geomagnetism is generated, the obtained azimuth data does not indicate the true azimuth; even in such a case, however, the azimuth data changes when the terminal device 7 moves, so changes in the attitude of the terminal device 7 can still be calculated.

The acceleration sensor 63 is provided inside the housing 50 and detects the magnitude of linear acceleration along three axes (the xyz axes shown in Fig. 8(a)). Specifically, with the long-side direction of the housing 50 as the x axis, the short-side direction of the housing 50 as the y axis, and the direction perpendicular to the surface of the housing 50 as the z axis, the acceleration sensor 63 detects the value of the linear acceleration along each axis. Acceleration data representing the detected accelerations is output to the UI controller 65, and control instructions for the acceleration sensor 63 are output from the UI controller 65 to the acceleration sensor 63. In the present embodiment, the acceleration sensor 63 is, for example, a capacitance-type MEMS acceleration sensor, but in other embodiments an acceleration sensor of another type may be used. The acceleration sensor 63 may also be one that detects acceleration along one or two axes.
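As noted above, even a distorted magnetic field still yields usable information, because what matters is how the reading changes as the terminal moves. The sketch below estimates the change in heading between two magnetometer samples projected onto the device's x-z plane; treating that plane as horizontal, and the function names, are simplifying assumptions of this example.

    import math

    def heading(mx, mz):
        """Magnetometer reading projected onto the device's x-z plane,
        returned as an angle in degrees. In a distorted field this is
        not a true compass azimuth, only a repeatable reference."""
        return math.degrees(math.atan2(mz, mx))

    def yaw_change(sample_a, sample_b):
        """Rotation of the terminal between two samples, in degrees."""
        d = heading(*sample_b) - heading(*sample_a)
        return (d + 180.0) % 360.0 - 180.0   # wrap into (-180, 180]

    print(yaw_change((1.0, 0.0), (0.0, 1.0)))   # 90.0: a quarter turn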
The gyro sensor 64 is provided inside the housing 50 and detects the angular velocities about the three axes, the x, y, and z axes described above. Angular velocity data representing the detected angular velocities is output to the UI controller 65, and control instructions for the gyro sensor 64 are output from the UI controller 65 to the gyro sensor 64. The number and combination of gyro sensors used to detect the angular velocities about three axes may be chosen freely; like the gyro sensor 48, the gyro sensor 64 may consist of a 2-axis gyro sensor and a 1-axis gyro sensor. The gyro sensor 64 may also be one that detects angular velocity about one or two axes.

The UI controller 65 outputs to the codec LSI 66 operation data containing the touch position data, stick data, operation key data, azimuth data, acceleration data, and angular velocity data received from the components described above. When another device is connected to the terminal device 7 via the expansion connector 58, the operation data may further contain data representing operations performed on that other device.

The codec LSI 66 is a circuit that performs compression processing on data transmitted to the game device 3 and decompression processing on data transmitted from the game device 3. The LCD 51, the camera 56, the sound IC 68, the wireless module 70, the flash memory 73, and the infrared communication module 72 are connected to the codec LSI 66. The codec LSI 66 also includes a CPU 77 and an internal memory 78. Although the terminal device 7 is configured not to perform game processing itself, it needs to execute a minimal program for its own management and for communication. When the power is turned on, the program stored in the flash memory 73 is read into the internal memory 78 and executed by the CPU 77, whereby the terminal device 7 starts up. Part of the internal memory 78 is used as VRAM for the LCD 51.

The camera 56 captures images in accordance with instructions from the game device 3 and outputs the captured image data to the codec LSI 66. Control instructions for the camera 56, such as image capture instructions, are output from the codec LSI 66 to the camera 56. The camera 56 can also capture moving images; that is, the camera 56 can capture images repeatedly and repeatedly output the image data to the codec LSI 66.

The sound IC 68 is connected to the speakers 67 and the microphone 69, and is a circuit that controls the input and output of sound data to the speakers 67 and from the microphone 69. That is, when sound data is received from the codec LSI 66, the sound IC 68 outputs to the speakers 67 an audio signal obtained by D/A conversion of the sound data, and sound is output from the speakers 67. The microphone 69 detects sounds conveyed to the terminal device 7 (the user's voice and the like) and outputs audio signals representing those sounds to the sound IC 68. The sound IC 68 performs A/D conversion on the audio signals from the microphone 69 and outputs sound data in a predetermined form to the codec LSI 66.

The codec LSI 66 transmits the image data from the camera 56, the sound data from the microphone 69, and the operation data (terminal operation data) from the UI controller 65 to the game device 3 through the wireless module 70. In the present embodiment, the codec LSI 66 applies to the image data and the sound data compression processing similar to that of the codec LSI 27. The terminal operation data and the compressed image data and sound data are output to the wireless module 70 as transmission data.
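To picture how the UI controller's inputs travel together, the sketch below bundles one sample of each data item into a single transmission record, compressing only the bulky image and sound payloads, as the text above describes. The field names, dictionary layout, and use of JSON and zlib are assumptions of this example.

    import json, zlib

    def build_transmission(op, camera_frame: bytes, mic_samples: bytes) -> bytes:
        """op: dict with touch, stick, key, azimuth, accel, gyro samples.
        Image and sound are compressed; the small operation record is not."""
        payload = {
            "operation": op,
            "camera": zlib.compress(camera_frame).hex(),
            "sound": zlib.compress(mic_samples).hex(),
        }
        return json.dumps(payload).encode()

    op = {
        "touch": (210, 140), "stick_l": (0.0, 0.8), "stick_r": (0.0, 0.0),
        "keys": 0b0000_0001, "azimuth": 87.5,
        "accel": (0.0, 0.0, 9.81), "gyro": (0.0, 1.5, 0.0),
    }
    print(len(build_transmission(op, b"\0" * 256, b"\0" * 64)))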
An antenna 71 is connected to the wireless module 70, and the wireless module 70 transmits the above transmission data to the game device 3 via the antenna 71. The wireless module 70 has functions similar to those of the terminal communication module 28 of the game device 3; that is, it has the function of connecting to a wireless LAN by a method conforming to, for example, the IEEE 802.11n standard. The transmitted data may be encrypted as necessary, or may be left unencrypted.

As described above, the transmission data sent from the terminal device 7 to the game device 3 contains operation data (terminal operation data), image data, and sound data. When another device is connected to the terminal device 7 via the expansion connector 58, the transmission data may further contain data received from that other device. The infrared communication module 72 performs infrared communication with other devices conforming to, for example, the IrDA standard. The codec LSI 66 may include data received by infrared communication in the transmission data and transmit it to the game device 3 as necessary.

As described above, compressed image data and sound data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI 66 via the antenna 71 and the wireless module 70, and the codec LSI 66 decompresses the received image data and sound data. The decompressed image data is output to the LCD 51, and images are displayed on the LCD 51. The decompressed sound data is output to the sound IC 68, and the sound IC 68 outputs sound from the speakers 67.

When control data is contained in the data received from the game device 3, the codec LSI 66 and the UI controller 65 issue control instructions to the relevant sections in accordance with the control data. As described above, the control data is data representing control instructions for the components of the terminal device 7 (in the present embodiment, the camera 56, the touch panel controller 61, the marker section 55, the sensors 62 to 64, and the infrared communication module 72). The control instructions represented by the control data are considered to be, for example, instructions to make each of the above components operate or to suspend (stop) its operation. That is, components not used in a game may be suspended in order to reduce power consumption, and in that case the transmission data sent from the terminal device 7 to the game device 3 is made not to contain data from the suspended components. Since the marker section 55 consists of infrared LEDs, its control may simply be the turning on and off of the power supply.

As described above, the terminal device 7 has the touch panel 52, the analog sticks 53, and the operation keys 54 as operation means, but in other embodiments it may have other operation means instead of, or in addition to, these operation means.

The terminal device 7 has the magnetic sensor 62, the acceleration sensor 63, and the gyro sensor 64 as sensors for calculating the motion of the terminal device 7 (including its position and attitude, and changes in its position and attitude), but in other embodiments it may have only one or two of these sensors, or it may have other sensors instead of, or together with, these sensors.

The terminal device 7 is configured with the camera 56 and the microphone 69, but in other embodiments it may have neither the camera 56 nor the microphone 69, or may have only one of them.
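The control-data mechanism described above amounts to a small table of switchable components. A sketch of such a dispatcher follows; the message format and the component names are assumptions of this example.

    # Hypothetical on/off table for the switchable components named above.
    components = {
        "camera": True, "touch_panel": True, "marker_leds": True,
        "magnetic": True, "accel": True, "gyro": True, "infrared": True,
    }

    def apply_control_data(msg: dict) -> None:
        """msg example: {'marker_leds': False, 'camera': False}.
        Suspending a component also drops its data from later
        transmission records, mirroring the power-saving behavior."""
        for name, enabled in msg.items():
            if name in components:
                components[name] = bool(enabled)

    def active_fields() -> list:
        return [name for name, on in components.items() if on]

    apply_control_data({"camera": False, "marker_leds": False})
    print(active_fields())   # camera and marker LEDs are now excluded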
As described above, the terminal device 7 includes the touch panel 52, the analog sticks 53, and the operation keys 54 as operation means, but in other embodiments it may include other operation means instead of, or in addition to, these. Further, the terminal device 7 includes the magnetic sensor 62, the acceleration sensor 63, and the gyro sensor 64 as sensors for calculating the movement of the terminal device 7 (including its position and posture, or changes in its position and posture), but in other embodiments it may include only one or two of these sensors. In still other embodiments, other sensors may be provided instead of, or in addition to, these sensors. Further, the terminal device 7 includes the camera 56 and the microphone 69, but in other embodiments it may include neither of them, or only one of them. Further, the terminal device 7 includes the marker section 55 as a configuration for calculating the positional relationship between the terminal device 7 and the controller 5 (such as the position and/or posture of the terminal device 7 as seen from the controller 5), but in other embodiments it may not include the marker section 55. In still other embodiments, the terminal device 7 may include other means as a configuration for calculating the positional relationship; for example, the controller 5 may include a marker section and the terminal device 7 may include an imaging element. In that case, the marking device 6 may also include an imaging element instead of infrared LEDs.

[5. Game Processing]

Next, the details of the game processing executed in the present game system will be described. First, the various data used in the game processing will be described. Fig. 11 is a diagram showing the main data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game device 3. As shown in Fig. 11, the game program 90, received data 91, and processing data 106 are stored in the main memory of the game device 3. In addition to the data shown in Fig. 11, the main memory also stores data necessary for the game, such as image data of the various objects appearing in the game and sound data used in the game. All or a part of the game program 90 is read from the optical disc 4 and stored in the main memory at an appropriate time after the game device 3 is powered on. The game program 90 may also be obtained from the flash memory 17 or from a device external to the game device 3 (for example, via the Internet) instead of from the optical disc 4. A part of the game program 90 (for example, a program for calculating the postures of the controller 5 and/or the terminal device 7) may be stored in the game device 3 in advance. The received data 91 is the various data received from the controller 5 and the terminal device 7. The received data 91 includes controller operation data 92, terminal operation data 97, camera image data 104, and microphone sound data 105. When a plurality of controllers 5 are connected, there are a corresponding plurality of sets of controller operation data 92; when a plurality of terminal devices 7 are connected, there are a corresponding plurality of sets of terminal operation data 97, camera image data 104, and microphone sound data 105. The controller operation data 92 is data representing the operation performed by the user (player) on the controller 5. The controller operation data 92 is transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 includes first operation key data 93, first acceleration data 94, first angular velocity data 95, and marker coordinate data 96. A predetermined number of sets of controller operation data, in order from the latest (most recently acquired), may be stored in the main memory. The first operation key data 93 is data representing the input states of the operation keys 32a to 32i provided on the controller 5. Specifically, the first operation key data 93 represents whether each of the operation keys 32a to 32i is pressed. The first acceleration data 94 is data representing the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 represents a three-dimensional acceleration whose components are the accelerations in the directions of the three XYZ axes.
However, in other embodiments, the acceleration may be one in any one or more directions. The first angular velocity data 95 is data representing the angular velocity detected by the gyro sensor 48 of the controller 5. Here, the first angular velocity data 95 represents the angular velocities about the three XYZ axes, but in other embodiments it may represent an angular velocity about any one or more axes. The marker coordinate data 96 is data representing the coordinates calculated by the image processing circuit 41 of the imaging information calculation section 35, that is, the marker coordinates. The marker coordinates are expressed in a two-dimensional coordinate system for representing a position on a plane corresponding to the captured image, and the marker coordinate data 96 represents coordinate values in this two-dimensional coordinate system. The controller operation data 92 only needs to represent the operation of the user operating the controller 5, and may include only some of the data 93 to 96 described above. When the controller 5 has other input means (for example, a touch panel, an analog stick, or the like), the controller operation data 92 may also include data representing operations performed on those input means. When the action of the controller 5 itself is used as a game operation, as in the present embodiment, the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 have values that change in accordance with the action of the controller 5 itself. The terminal operation data 97 is data representing the operation performed by the user on the terminal device 7. The terminal operation data 97 is transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 includes second operation key data 98, stick data 99, touch position data 100, second acceleration data 101, second angular velocity data 102, and orientation data 103. A predetermined number of sets of terminal operation data, in order from the latest (most recently acquired), may be stored in the main memory. The second operation key data 98 is data representing the input states of the operation keys 54A to 54L provided on the terminal device 7. Specifically, the second operation key data 98 represents whether each of the operation keys 54A to 54L is pressed. The stick data 99 is data representing the direction and amount by which the stick portion of each analog stick 53 (the analog sticks 53A and 53B) is slid (or tilted). The direction and amount may each be expressed, for example, as two-dimensional coordinates or a two-dimensional vector. The touch position data 100 is data representing the position (touch position) at which an input is made on the input surface of the touch panel 52. In the present embodiment, the touch position data 100 represents coordinate values in a two-dimensional coordinate system for indicating a position on the input surface. When the touch panel 52 is of a multi-touch type, the touch position data 100 may also represent a plurality of touch positions. The second acceleration data 101 is data representing the acceleration (acceleration vector) detected by the acceleration sensor 63. In the present embodiment, the second acceleration data 101 represents a three-dimensional acceleration whose components are the accelerations in the directions of the three xyz axes shown in Fig. 8, but in other embodiments it may represent an acceleration in any one or more directions. The second angular velocity data 102 is data representing the angular velocity detected by the gyro sensor 64. In the present embodiment, the second angular velocity data 102 represents the angular velocities about the three xyz axes shown in Fig. 8, but in other embodiments it may represent an angular velocity about any one or more axes.
The orientation data 103 is data representing the orientation detected by the magnetic sensor 62. In the present embodiment, the orientation data 103 represents the direction of a predetermined orientation (for example, north) with respect to the terminal device 7. However, in a place where a magnetic field other than geomagnetism is generated, the orientation data 103 does not strictly indicate the absolute orientation (north or the like); even so, because it indicates the direction of the terminal device 7 relative to the direction of the magnetic field at that place, a change in the posture of the terminal device 7 can still be calculated in such a case. The terminal operation data 97 only needs to represent the operation of the user operating the terminal device 7, and may include only one of the data 98 to 103 described above. When the terminal device 7 has other input means (for example, a touch pad, or imaging means like that of the controller 5), the terminal operation data 97 may also include data representing operations performed on those other input means. When the action of the terminal device 7 itself is used as a game operation, as in the present embodiment, the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 have values that change in accordance with the action of the terminal device 7 itself. The camera image data 104 is data representing an image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 is image data obtained by the codec LSI 27 decompressing the compressed image data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a. A predetermined number of sets of camera image data, in order from the latest (most recently acquired), may be stored in the main memory. The microphone sound data 105 is data representing the sound (microphone sound) detected by the microphone 69 of the terminal device 7. The microphone sound data 105 is sound data obtained by the codec LSI 27 decompressing the compressed sound data transmitted from the terminal device 7, and is stored in the main memory by the input/output processor 11a.
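The "predetermined number of sets, newest first" buffering described above for the received data can be realized with, for example, a fixed-size ring buffer. The following C sketch is illustrative; the number of entries and the element type are assumptions.

```c
/* A sketch of keeping only the latest N items of, e.g., controller
 * operation data 92 or camera image data 104 in the main memory. */
#include <string.h>

#define HISTORY 16                  /* predetermined number of entries */

typedef struct { unsigned char bytes[64]; } OperationData; /* placeholder */

typedef struct {
    OperationData slot[HISTORY];
    int newest;                     /* index of most recently stored item */
    int count;
} OperationHistory;

void history_store(OperationHistory *h, const OperationData *d)
{
    h->newest = (h->newest + 1) % HISTORY;   /* overwrite the oldest */
    memcpy(&h->slot[h->newest], d, sizeof *d);
    if (h->count < HISTORY) h->count++;
}

/* latest(h, 0) returns the newest entry, latest(h, 1) the one before it. */
const OperationData *history_latest(const OperationHistory *h, int back)
{
    if (back >= h->count) return 0;
    return &h->slot[(h->newest - back + HISTORY) % HISTORY];
}
```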
The processing data 106 is data used in the game processing (Fig. 12) described later. The processing data 106 includes control data 107, controller posture data 108, terminal posture data 109, image recognition data 110, and sound recognition data 111. In addition to the data shown in Fig. 11, the processing data 106 also includes various data used in the game, such as data representing various parameters set for the various objects appearing in the game. The control data 107 is data representing control instructions for the components included in the terminal device 7. The control data 107 represents, for example, an instruction to control the lighting of the marker section 55, an instruction to control the image capturing of the camera 56, and the like. The control data 107 is transmitted to the terminal device 7 at an appropriate time. The controller posture data 108 is data representing the posture of the controller 5. In the present embodiment, the controller posture data 108 is calculated based on the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92. The method of calculating the controller posture data 108 will be described later in connection with step S23. The terminal posture data 109 is data representing the posture of the terminal device 7. In the present embodiment, the terminal posture data 109 is calculated based on the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 included in the terminal operation data 97. The method of calculating the terminal posture data 109 will be described later in connection with step S24. The image recognition data 110 is data representing the result of predetermined image recognition processing performed on the camera image. The image recognition processing may be any processing that detects some feature from the camera image and outputs the result; for example, it may be processing that extracts the player's face from the camera image, or processing that calculates information about a predetermined object appearing in the image. The sound recognition data 111 is data representing the result of predetermined sound recognition processing performed on the microphone sound. The sound recognition processing may be any processing that detects some feature from the microphone sound and outputs the result; for example, it may be processing that detects the player's words, or processing that merely detects the volume of the microphone sound. Next, the details of the game processing performed in the game device 3 will be described with reference to Fig. 12. Fig. 12 is a main flowchart showing the flow of the game processing executed in the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 executes a boot program stored in a boot ROM (not shown), thereby initializing the units such as the main memory. The game program stored in the optical disc 4 is then read into the main memory, and the CPU 10 starts executing the game program. In the game device 3, the game program stored in the optical disc 4 may be executed immediately after the power is turned on, or a built-in program that displays a predetermined menu screen may be executed first after the power is turned on, with the game program stored on the optical disc 4 executed afterwards when the user instructs the start of the game. The flowchart shown in Fig. 12 shows the processing performed after the above processing is completed. The processing of each step in the flowchart shown in Fig. 12 is merely an example, and the order of the steps may be changed as long as the same result is obtained. The values of the variables and the thresholds used in determination steps are also merely examples, and other values may be used as necessary. Further, although the present embodiment is described on the basis that the CPU 10 executes the processing of each step of the flowchart, a processor other than the CPU 10 or a dedicated circuit may execute the processing of some of the steps. First, in step S1, the CPU 10 performs initial processing. The initial processing includes, for example, constructing the virtual game space, placing the objects appearing in the game space at their initial positions, and setting the initial values of the various parameters used in the game processing. Further, in the present embodiment, in the initial processing, the CPU 10 controls the lighting of the marking device 6 and the marker section 55 in accordance with the type of the game program.
Here, the game system 1 has, as imaging targets for the imaging means (the imaging information calculation section 35) of the controller 5, both the marking device 6 and the marker section 55 of the terminal device 7. Either or both of the marking device 6 and the marker section 55 are used depending on the content of the game (the type of the game program). The game program 90 includes data indicating whether each of the marking device 6 and the marker section 55 should be lit. The CPU 10 reads this data and determines whether or not to perform the lighting. When the marking device 6 and/or the marker section 55 are to be lit, the following processing is executed. That is, when the marking device 6 is to be lit, the CPU 10 transmits to the marking device 6 a control signal for lighting each infrared LED included in the marking device 6. The transmission of the control signal may simply be the supply of power. In response, each infrared LED of the marking device 6 is lit. On the other hand, when the marker section 55 is to be lit, the CPU 10 generates control data representing an instruction to light the marker section 55 and stores it in the main memory. The generated control data is transmitted to the terminal device 7 in step S10 described later. The control data received by the wireless module 70 of the terminal device 7 is sent to the UI controller 65 via the codec LSI 66, and the UI controller 65 instructs the marker section 55 to light. Thereby, the infrared LEDs of the marker section 55 are lit. Although the above describes the lighting of the marking device 6 and the marker section 55, their extinguishing can be performed by processing similar to that for lighting. After the above step S1, the processing of step S2 is executed. Thereafter, a processing loop consisting of the series of processes of steps S2 to S11 is repeatedly executed at a rate of once per predetermined time (one frame time). In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5. Since the controller 5 repeatedly transmits the controller operation data to the game device 3, in the game device 3 the controller communication module 19 sequentially receives the controller operation data, and the input/output processor 11a sequentially stores the received controller operation data in the main memory. The interval between transmission and reception is preferably shorter than the processing time of the game, for example 1/200 second. In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory. The process of step S3 is executed after step S2.
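The overall flow just introduced, step S1 once and then steps S2 to S11 once per frame time, can be sketched as follows. This is a minimal C sketch; the step functions are empty placeholders standing in for the processing described in the text and are not actual APIs.

```c
#include <stdbool.h>

/* Placeholder bodies; in the real system each step is the processing
 * described in the corresponding paragraph of the text. */
static void s1_initialize(void)              {}
static void s2_read_controller_data(void)    {}
static void s3_read_terminal_data(void)      {}
static void s4_game_control(void)            {}
static void s5_generate_tv_image(void)       {}
static void s6_generate_terminal_image(void) {}
static void s7_generate_tv_sound(void)       {}
static void s8_generate_terminal_sound(void) {}
static void s9_output_to_tv(void)            {}
static void s10_send_to_terminal(void)       {}
static bool s11_game_should_end(void)        { return true; }

void game_main(void)
{
    s1_initialize();                  /* step S1: initial processing      */
    do {                              /* steps S2-S11, once per frame     */
        s2_read_controller_data();    /* S2                               */
        s3_read_terminal_data();      /* S3                               */
        s4_game_control();            /* S4 (Fig. 13)                     */
        s5_generate_tv_image();       /* S5: first virtual camera         */
        s6_generate_terminal_image(); /* S6: second virtual camera        */
        s7_generate_tv_sound();       /* S7                               */
        s8_generate_terminal_sound(); /* S8                               */
        s9_output_to_tv();            /* S9                               */
        s10_send_to_terminal();       /* S10: compress and transmit       */
    } while (!s11_game_should_end()); /* S11                              */
}
```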
In step S3, the CPU 10 acquires the various data transmitted from the terminal device 7. The terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, so the game device 3 sequentially receives these data. In the game device 3, the terminal communication module 28 sequentially receives these data, and the codec LSI 27 sequentially performs decompression processing on the camera image data and the microphone sound data. The input/output processor 11a then sequentially stores the terminal operation data, the camera image data, and the microphone sound data in the main memory. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory. The process of step S4 is executed after step S3. In step S4, the CPU 10 executes the game control processing. The game control processing advances the game by performing processing such as moving objects in the game space in accordance with the user's game operations. In the present embodiment, the user can play various games using the controller 5 and/or the terminal device 7. The game control processing will be described below with reference to Fig. 13. Fig. 13 is a flowchart showing the flow of the game control processing. The series of processes shown in Fig. 13 are various processes that can be executed when the controller 5 and the terminal device 7 are used as operation devices; it is not necessary to execute all of them, and depending on the type and content of the game, only some of the processes may be executed. In the game control processing, first, in step S21, the CPU 10 determines whether or not to change the marker to be used. As described above, in the present embodiment, processing to control the lighting of the marking device 6 and the marker section 55 is executed at the start of the game processing (step S1). Here, depending on the game, the target to be used (lit) among the marking device 6 and the marker section 55 may change in the middle of the game. Depending on the game, using both the marking device 6 and the marker section 55 is also conceivable; however, if both are lit, there is a concern that one of them may be erroneously detected as the other. Therefore, during the game it may be preferable to use only one of them and switch which one is lit. The process of step S21 determines, in consideration of such cases, whether or not the target to be lit needs to be changed in the middle of the game. The determination of step S21 can be made, for example, by the following methods. The CPU 10 can make the determination according to whether or not the game situation (the stage of the game, the operation target, and the like) has changed. This is because, when the game situation changes, it is conceivable that the operation method is switched between an operation method in which the controller 5 is pointed toward the marking device 6 and an operation method in which the controller 5 is pointed toward the marker section 55. Further, the CPU 10 can make the determination based on the posture of the controller 5; that is, the determination can be made according to whether the controller 5 is facing the marking device 6 or facing the marker section 55. The posture of the controller 5 can be calculated, for example, based on the detection results of the acceleration sensor 37 and the gyro sensor 48 (see step S23 described later). The CPU 10 can also make the determination according to whether or not there is a change instruction from the user. When the result of the determination in step S21 is affirmative, the process of step S22 is executed. On the other hand, when the result of the determination in step S21 is negative, the process of step S22 is skipped and the process of step S23 is executed.
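Of the determination methods listed for step S21, the posture-based one can be illustrated as follows. The idea of comparing the controller's pointing axis against known directions toward the two markers is an assumption added for illustration; the text states only that the determination can be made from the posture of the controller 5.

```c
/* A sketch of deciding which marker to light from the posture of the
 * controller 5. All names and the decision rule are illustrative. */
typedef struct { float x, y, z; } Vec3;

static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* forward: unit vector along the controller 5's pointing axis;
 * to_tv / to_terminal: unit vectors toward the marking device 6 and the
 * marker section 55. Returns 1 to light the marker section 55, 0 to
 * light the marking device 6. */
int choose_marker(Vec3 forward, Vec3 to_tv, Vec3 to_terminal)
{
    /* Light whichever marker the controller is pointed closer toward;
     * only one is lit at a time so that one cannot be erroneously
     * detected as the other. */
    return dot3(forward, to_terminal) > dot3(forward, to_tv);
}
```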
In step S22, the CPU 10 controls the lighting of the marking device 6 and the marker section 55. That is, the lighting state of the marking device 6 and/or the marker section 55 is changed. The specific processing for lighting or extinguishing the marking device 6 and/or the marker section 55 can be performed in the same manner as in step S1 described above. The process of step S23 is executed after step S22. As described above, according to the present embodiment, the light emission (lighting) of the marking device 6 and the marker section 55 can be controlled in accordance with the type of the game program by the processing of step S1, and can be controlled in accordance with the game situation by the processing of steps S21 and S22. In step S23, the CPU 10 calculates the posture of the controller 5. In the present embodiment, the posture of the controller 5 is calculated based on the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96. The method of calculating the posture of the controller 5 is described below. First, the CPU 10 calculates the posture of the controller 5 based on the first angular velocity data 95 stored in the main memory. The method of calculating the posture of the controller 5 from the angular velocity may be any method; the posture is calculated using the previous posture (the posture calculated last time) and the current angular velocity (the angular velocity acquired in step S2 of the current processing loop). Specifically, the CPU 10 calculates the posture by rotating the previous posture by the current angular velocity for a unit time. The previous posture is represented by the controller posture data 108 stored in the main memory, and the current angular velocity is represented by the first angular velocity data 95 stored in the main memory. Therefore, the CPU 10 reads the controller posture data 108 and the first angular velocity data 95 from the main memory and calculates the posture of the controller 5. The data representing the "posture based on the angular velocity" calculated in this manner is stored in the main memory. When the posture is calculated from the angular velocity, an initial posture may be determined in advance. That is, when calculating the posture of the controller 5 from the angular velocity, the CPU 10 first calculates the initial posture of the controller 5. The initial posture of the controller 5 may be calculated from the acceleration data, or the player may be made to perform a predetermined operation with the controller 5 held in a predetermined posture, and the specific posture at the time the predetermined operation is performed may be used as the initial posture. The initial posture should be calculated when the posture of the controller 5 is calculated as an absolute posture with respect to a predetermined direction in space; on the other hand, the initial posture need not be calculated when the posture of the controller 5 is calculated as a relative posture with respect to, for example, the posture of the controller 5 at the time the game starts.
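One concrete form of "rotating the previous posture by the current angular velocity for a unit time" is a first-order quaternion integration step, sketched below in C. The quaternion representation and the time step dt are implementation assumptions; the text does not prescribe a particular posture representation.

```c
#include <math.h>

typedef struct { float w, x, y, z; } Quat;

/* wx, wy, wz: angular velocity [rad/s] from the first angular velocity
 * data 95; q: previous posture (controller posture data 108); dt: the
 * interval at which angular velocity data arrives. Implements
 * dq/dt = 0.5 * q * (0, w) as one explicit Euler step. */
Quat integrate_gyro(Quat q, float wx, float wy, float wz, float dt)
{
    Quat r = {
        q.w + 0.5f * dt * (-q.x * wx - q.y * wy - q.z * wz),
        q.x + 0.5f * dt * ( q.w * wx + q.y * wz - q.z * wy),
        q.y + 0.5f * dt * ( q.w * wy - q.x * wz + q.z * wx),
        q.z + 0.5f * dt * ( q.w * wz + q.x * wy - q.y * wx),
    };
    float n = sqrtf(r.w*r.w + r.x*r.x + r.y*r.y + r.z*r.z);
    r.w /= n; r.x /= n; r.y /= n; r.z /= n;   /* renormalize */
    return r;
}
```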
Next, the CPU 10 corrects the posture of the controller 5 calculated from the angular velocity, using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the posture of the controller 5 based on the first acceleration data 94. Here, in a state where the controller 5 is almost stationary, the acceleration applied to the controller 5 means the gravitational acceleration. Therefore, in this state, the direction of the gravitational acceleration (the gravity direction) can be calculated using the first acceleration data 94 detected by the acceleration sensor 37, and thus the facing direction (posture) of the controller 5 with respect to the gravity direction can be calculated based on the first acceleration data 94. The data representing the "posture based on the acceleration" calculated in this manner is stored in the main memory. After calculating the posture based on the acceleration, the CPU 10 then corrects the posture based on the angular velocity using the posture based on the acceleration. Specifically, the CPU 10 reads the data representing the posture based on the angular velocity and the data representing the posture based on the acceleration from the main memory, and performs a correction such that the posture based on the angular velocity approaches the posture based on the acceleration at a predetermined ratio. The predetermined ratio may be a predetermined fixed value, or may be set in accordance with, for example, the acceleration represented by the first acceleration data 94. Further, since the posture based on the acceleration cannot be calculated for the rotation direction about the gravity direction as the axis, the CPU 10 may refrain from correcting that rotation direction. In the present embodiment, the data representing the corrected posture obtained in this manner is stored in the main memory. After correcting the posture based on the angular velocity in the manner described above, the CPU 10 further performs a correction using the marker coordinate data 96. First, the CPU 10 calculates the posture of the controller 5 based on the marker coordinate data 96 (the posture based on the marker coordinates). Since the marker coordinate data 96 represents the positions of the markers 6R and 6L in the captured image, the posture of the controller 5 with respect to the roll direction (the rotation direction about the Z axis) can be calculated from these positions. That is, the posture of the controller 5 with respect to the roll direction can be calculated from the slope of the line connecting the position of the marker 6R and the position of the marker 6L in the captured image. Further, when the position of the controller 5 with respect to the marking device 6 can be identified (for example, when it can be assumed that the controller 5 is located in front of the marking device 6), the posture of the controller 5 with respect to the pitch direction and the yaw direction can be calculated from the position of the marking device 6 in the captured image. For example, when the positions of the markers 6R and 6L in the captured image move to the left, it can be determined that the controller 5 has changed its facing direction (posture) to the right. In this manner, the posture of the controller 5 can be calculated based on the marker coordinate data 96.
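The roll-direction reading from the marker coordinates can be illustrated as follows, assuming the marker coordinate data 96 provides the two marker positions in the captured-image plane.

```c
#include <math.h>

typedef struct { float x, y; } Marker2D;   /* marker coordinate data 96 */

/* Returns the roll angle in radians; 0 when the two markers lie on a
 * horizontal line in the captured image. */
float roll_from_markers(Marker2D left, Marker2D right)
{
    return atan2f(right.y - left.y, right.x - left.x);
}

/* When the controller 5 is assumed to sit in front of the marking
 * device 6, the midpoint of the two marker images also indicates pitch
 * and yaw: if it moves left in the image, the controller has turned
 * right. The caller would subtract the image-center coordinates. */
void aim_offset(Marker2D left, Marker2D right, float *dx, float *dy)
{
    *dx = 0.5f * (left.x + right.x);
    *dy = 0.5f * (left.y + right.y);
}
```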
After calculating the posture based on the marker coordinates, the CPU 10 next corrects the corrected posture (the posture corrected using the posture based on the acceleration) by means of the posture based on the marker coordinates. That is, the CPU 10 performs a correction such that the corrected posture approaches the posture based on the marker coordinates at a predetermined ratio. The predetermined ratio may be a predetermined fixed value. The correction using the posture based on the marker coordinates may also be performed only for any one or two of the roll direction, the pitch direction, and the yaw direction. For example, when the marker coordinate data 96 is used, the posture can be calculated accurately with respect to the roll direction, so the CPU 10 may perform the correction using the posture based on the marker coordinate data 96 only for the roll direction. Further, when the marking device 6 or the marker section 55 is not imaged by the imaging element 40 of the controller 5, the posture based on the marker coordinate data 96 cannot be calculated; in that case, the correction processing using the marker coordinate data 96 may be skipped. From the above, the CPU 10 corrects the posture of the controller 5 calculated based on the first angular velocity data 95, using the first acceleration data 94 and the marker coordinate data 96. Here, among the methods for calculating the posture of the controller 5, the method using the angular velocity can calculate the posture no matter how the controller 5 is being moved. On the other hand, with the method using the angular velocity, the posture is calculated by cumulatively adding the successively detected angular velocities, so there is a concern that accuracy deteriorates due to the accumulation of errors and the like, or that the accuracy of the gyro sensor deteriorates due to the so-called temperature drift problem. The method using the acceleration does not accumulate errors, but cannot calculate the posture accurately while the controller 5 is being moved violently (because the gravity direction cannot be detected accurately). The method using the marker coordinates can calculate the posture accurately (particularly with respect to the roll direction), but cannot calculate the posture when the marker section 55 (or the marking device 6) is not imaged. In contrast, according to the present embodiment, the three methods with different characteristics are used as described above, so the posture of the controller 5 can be calculated more accurately. In other embodiments, the posture may be calculated using any one or two of the three methods. When the lighting control of the markers is performed in the processing of the above step S1 or step S22, the CPU 10 preferably calculates the posture of the controller 5 using at least the marker coordinates. After the above step S23, the processing of step S24 is executed. In step S24, the CPU 10 calculates the posture of the terminal device 7. That is, since the terminal operation data 97 acquired from the terminal device 7 includes the second acceleration data 101, the second angular velocity data 102, and the orientation data 103, the CPU 10 can calculate the posture of the terminal device 7 based on these data.
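Both in step S23 and in the step S24 calculation that follows, an integrated posture is pulled toward a reference posture "at a predetermined ratio". The following C sketch shows one way to realize that blending with quaternions; the ratio ALPHA and the normalized-lerp shortcut are assumptions made for illustration.

```c
#include <math.h>

typedef struct { float w, x, y, z; } Quat;

#define ALPHA 0.02f   /* predetermined ratio; could also depend on |a| */

/* Normalized linear interpolation between two unit quaternions: a cheap
 * stand-in for slerp that is adequate for small per-frame corrections. */
Quat blend_posture(Quat gyro_posture, Quat reference_posture)
{
    /* pick the closer of q and -q so interpolation takes the short way */
    float d = gyro_posture.w*reference_posture.w
            + gyro_posture.x*reference_posture.x
            + gyro_posture.y*reference_posture.y
            + gyro_posture.z*reference_posture.z;
    float s = (d >= 0.0f) ? ALPHA : -ALPHA;
    Quat r = {
        (1.0f - ALPHA) * gyro_posture.w + s * reference_posture.w,
        (1.0f - ALPHA) * gyro_posture.x + s * reference_posture.x,
        (1.0f - ALPHA) * gyro_posture.y + s * reference_posture.y,
        (1.0f - ALPHA) * gyro_posture.z + s * reference_posture.z,
    };
    float n = sqrtf(r.w*r.w + r.x*r.x + r.y*r.y + r.z*r.z);
    r.w /= n; r.x /= n; r.y /= n; r.z /= n;
    return r;
}
```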
音資料被輸出至聲音1C 68。藉此,終端用遊戲圖像被顯 示於LCD 51,並且終端用遊戲聲音從揚聲器67輸出。在 步驟S10後執行步驟S11的處理。 步驟S11中,CPU 10判定是否結束遊戲。步驟S11的 判定,例如藉由是否為遊戲結束或使用者進行終止遊戲之 指示等來進行。步驟S11的判定結果為否定時,再次執行 67 323304 201220108 步;驟S2 ,里。另一方面’步驟su的判定 CPU 10結束第12圖所示之遊戲處理。 等 中判定結束遊戲為止,重複執行步驟犯’至步驟S11 理。 芏b11的一連串處 如上所述,本實施形態中,終端梦 板52與加速度感測器63或迴轉❹1 7係具備觸控面 觸控面板52與慣性感測器的輸出作 :广 遊戲裝置3而㈣為遊戲的輸人(步驟^作#枓被傳送至 端裝置7具備顯示裝置似!)51) 。此外’終 戲圖像被顯示於LCD 51(步驟S6、S10)。 理所得之遊 使用觸控面板52對遊觀像直接進行觸用者可 =Γ的動作可藉由慣性感測器偵測出,所^ 仃將顯不有遊戲圖像之LCD51本身移動之 吓乂)了進 操作’以對遊戲圖像直接進行操二操= 進订遊戲,因此可提供例如後述第 戊覺來 操作感覺的遊戲。 及第2遊戲例之新嶺 此外,本實施形態中,終端裝置7 端裝置7之狀態下操作之類比搖桿53及操^可,握持終 為遊戲輪入(步驟S3、S4)。因此,钌…不邙用竹 ffl « ^ , ^ ^ ^ 來進行詳細的遊戲操作。 ’、3搖桿操作 再者,本實施形態中,終端裝置7 麥克風69 ,攝影機56所摄旦/'、,、備攝衫機56及 攝〜機56所攝景》之攝影機圖像的資料及麥^ 將對類比搖桿53及操作鍵54所進 為遊戲輪入(步驟S3、S4)。因朴,,… 幻麵作用作 323304 68 201220108 :音遊戲裝置❹ 克風聲音用作為遊錄人,^销影機圖像及/或麥 機56將圖傻,則 因此,使用者亦可藉由以攝影 操作,來進行遊戲操^之操作/切聲音輸入至麥克風69之 狀態下進行,所'、。此等操作可在握持終端裝置7之 可藉由進行此等二=上所述對遊戲圖像直接進行操作時, 作。 Μ ’而讓使用者進行更錢化的遊戲操 (:置由於遊戲圖像顯示於可搬運型 广裝置7的LCD 51(步驟邡、sl〇),所以使用者可自 配=端裝置7。因此’當使控制器5朝向標示器來 订插作時’藉由將終端裂置7配置在自由位置,使 使控制H 5朝向自由方向來進行遊戲,並可提升對控制 5所進行之操作的自由度。此外,由於可將終端褒置7 配置在任意位置’如後述第5遊戲例般,藉由將終端褒置 配置在適合遊戲内容之位置,可提供更具臨場感之遊戲。 此外,根據本實施形態,遊戲裝置3可從控制器5及 終端裝置7取得操作資料等(步驟S2、S3),所以使用者可 將控制器5及終端裝置7的2個裴置用作為操作手段。因 匕遊戲系統1中,亦可讓複數位使用者使用各農置而讓 複數個人進行遊戲,此外,1位使用者亦可使用2個裝置 來進行遊戲。 此外,根據本實施形態,遊戲裝置3係產生2種遊戲 圖像(步驟S5、S6),並可將遊戲圖像顯示於電視2及終端 323304 69 201220108 裝置7(步驟S9、S10)。如此’藉由將2種遊戲圖像顯示於 不同裝置’可提供對使用者而言更容易觀看之遊戲圖像, 而可提升遊戲的操作性。例如,當2個人進行遊戲時,如 後述第3或第4遊戲例,將對某一方的使用者而言為容易 觀看之視點的遊戲圖像顯示於電視2,將對另一方的使用 者而s為容易觀看之視點的遊戲圖像顯示於終端裝置7, 藉此可在各遊戲者容易觀看之視點下進行遊戲。此外,即 使為例如1人進行遊戲時,如後述第丨、第2或第5遊戲 例,可在不同處的視點中顯示出2種遊戲圖像,藉此,遊 戲者更容易掌握遊戲空間的模樣’而提升遊戲的操作性。 [6.遊戲例] 接著說明遊戲系統1中所進行之遊戲的具體例。以下 所說明之遊戲例中,亦有未運用遊戲系統丨之各裝置的構 成中的一部分之情形,此外,亦有未執行第12圖及第13 圖所不之一連串處理中的一部分處理之情形。亦即,遊戲 系統1可不具備上述全部的構成,此外,遊戲裝置3亦可 不執行第12圖及第13圖所示之一連串處理中的一部分。 (第1遊戲例) 第1遊戲例,為藉由操作終端裝置7而在遊戲空間内 射出物件(飛鏢)之遊戲^遊戲者,藉由改變終端裝置7的 姿勢孓操作、及在觸控面板52上將線描繪出之操作,可浐 示射出飛鏢之方向。 、 _ a 第14圖係顯示第1遊戲例中之電視2的晝面與終端裝 置7之圖。第Η圖中,於電視2及終端裝置7的⑽51 323304 70 201220108 上,顯示出表示遊戲空間之遊戲圖像。於電視2顯 錦m、控制面122、及標㈣3。於咖51㈣出控= 面122(及麟121)。第1遊戲财,遊戲者係藉由使用故 端裝置7之操作’㈣鱗121轉中縣123而進行遊 戲 * 射出賴m時,遊戲者首先藉由操作終端褒置 姿勢,來改變虛擬遊戲空間内所配置之控制面122的姿勢 而成為期望姿勢。亦即,CPU 1G根據慣性感測器(加速度 感測器63及迴轉感測器64)以及磁性感測器62的輸出, 算出終端裝置7的姿勢(步驟S24),並根據所算出的姿勢 來改變控制面122的姿勢(步驟S27)。第i遊戲例中,控 制面122的姿勢係被控制為因應實際空間中之終端裝置7 的姿勢。亦即,遊戲者藉由改變終端裝置7(顯示於終端裝 置7之控制面122)的姿勢,而可在遊戲空間内改變控制面 122的姿勢。第1遊戲例中,控制面122的位置被固定在 遊戲空間中的預定位置。 接著,遊戲者使用觸控筆124等在觸控面板52上進行 將線描繪出之操作(參照第14圖所示之箭頭)。在此,第i 遊戲例中,於終端裝置7的LCD 51,係以使觸控面板52 的輸入面與控制面122相對應之方式來顯示控制面122。 因此,藉由描繪在觸控面板52上之線,可算出在控制面 122上的方向(該線所顯示之方向)^飛鏢121被射往如此 決定之方向。從上述中,CPU 10從觸控面板52的觸控位 置資料100來算出在控制面122上的方向,並進行將飛鏢 323304 201220108 121移動至該算出後的方向之處理(步驟S27)。CPU 10,例 如可因應線的長度或將線描繪出的速度來控制飛鏢121的 速度。 如上所述,根據第1遊戲例,遊戲裝置3藉由將慣性 感測器的輸出用作為遊戲輸入,可因應終端裝置7的移動 (姿勢)使控制面122移動,並藉由將觸控面板52的輸出用 作為遊戲輸入’可特定出在控制面122上的方向。根據此, 遊戲者可移動顯示於終端裝置7之遊戲圖像(控制面122的 圖像)或是對該遊戲圖像進行觸控操作,因此能夠以對遊戲 圖像直接進行操作之新穎操作感覺來進行遊戲。 此外’第1遊戲例中,藉由將慣性感測器及觸控面板 52的感測器輸出用作為遊戲輸人,可容易地指示3維空間 2方向。亦即’遊戲者以—邊的手實際調整終端裝置7 以另-邊的手用線將方向輸人於觸控面板Η 方向之直覺性操作而容易心 操作以及對=7::入,對終端裝置7的姿勢之 示3維空間中的方向之操作^呆作,所以可迅速地進行指 此外,根據第1遊戲例,* 觸控輸入的操作,所以在^了容易對控制面122進行 控制面爲另-方面,.係^置^;中’於全體畫面顯示 以及容易晦準躲123 易旱握控制面122的姿.勢、The amount of change in the posture per unit time of the terminal device 7 can be known by the second angular velocity data 1 〇 L隹1U pe. 〇2. Further, in the case where the terminal device 7 is almost downward, the acceleration applied to the terminal 7 indicates that the acceleration is 'relevant', so that it can be known by the second 铋, * λ: the second knives and the knives. The direction of gravity applied to the terminal device 7 (i.e., the terminal device posture based on the direction of gravity). 
In addition, the predetermined orientation with respect to the terminal device 7 (that is, the posture of the terminal device 7 with respect to the predetermined orientation) can be obtained from the orientation data 103. Even when a magnetic field other than geomagnetism is generated, the amount of rotation of the terminal device 7 can still be obtained. Accordingly, the CPU 10 can calculate the posture of the terminal device 7 based on the second acceleration data 101, the second angular velocity data 102, and the orientation data 103. In the present embodiment, the posture of the terminal device 7 is calculated based on these three types of data, but in other embodiments the posture may be calculated using any one or two of them. The specific method of calculating the posture of the terminal device 7 may be any method; for example, a method is conceivable in which the posture calculated from the angular velocity represented by the second angular velocity data 102 is corrected using the second acceleration data 101 and the orientation data 103. Specifically, the CPU 10 first calculates the posture of the terminal device 7 based on the second angular velocity data 102. The method of calculating the posture from the angular velocity may be the same as in step S23 described above. Next, at an appropriate time (for example, when the terminal device 7 is close to a stationary state), the CPU 10 corrects the posture calculated from the angular velocity by means of the posture calculated from the second acceleration data 101 and/or the posture calculated from the orientation data 103. The method of correcting the posture based on the angular velocity with the posture based on the acceleration can be the same as that used when calculating the posture of the controller 5 described above. When correcting the posture based on the angular velocity with the posture based on the orientation data, the CPU 10 can perform the correction by bringing the posture based on the angular velocity closer to the posture based on the orientation data at a predetermined ratio. From the above, the CPU 10 can accurately calculate the posture of the terminal device 7. Since the controller 5 includes the imaging information calculation section 35 as infrared detection means, the game device 3 can acquire the marker coordinate data 96. Therefore, for the controller 5, the game device 3 can know from the marker coordinate data 96 the absolute posture in real space (that is, what posture the controller 5 takes in a coordinate system set in real space). On the other hand, the terminal device 7 does not include infrared detection means such as the imaging information calculation section 35. Therefore, from the second acceleration data 101 and the second angular velocity data 102 alone, the game device 3 cannot know the absolute posture in real space with respect to the rotation direction about the gravity direction as the axis. For this reason, in the present embodiment, the terminal device 7 is configured to include the magnetic sensor 62, and the game device 3 acquires the orientation data 103. According to this, for the rotation direction about the gravity direction as the axis, the game device 3 can calculate the absolute posture in real space from the orientation data 103, and can therefore calculate the posture of the terminal device 7 more accurately.
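The role of the orientation data 103 described above, observing the one rotation that the acceleration and angular velocity data leave undetermined, can be illustrated with a simplified scalar-heading sketch. The heading convention, the gain, and the assumption of tilt-compensated field components are all illustrative.

```c
#include <math.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define YAW_GAIN 0.02f              /* predetermined correction ratio */

/* yaw:   current heading estimate [rad] integrated from the gyro data
 * mag_x, mag_y: magnetic field components in the horizontal plane
 *        (tilt-compensated elsewhere using the gravity direction
 *        recovered from the acceleration data) */
float correct_yaw(float yaw, float mag_x, float mag_y)
{
    float measured = atan2f(-mag_y, mag_x);  /* heading implied by the
                                                orientation data 103 */
    float err = measured - yaw;
    /* wrap the error into (-pi, pi] so the correction takes the short way */
    while (err >  (float)M_PI) err -= 2.0f * (float)M_PI;
    while (err <= -(float)M_PI) err += 2.0f * (float)M_PI;
    return yaw + YAW_GAIN * err;
}
```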
As the specific processing of the above step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the orientation data 103 from the main memory, and calculates the posture of the terminal device 7 based on these data. The data representing the calculated posture of the terminal device 7 is stored in the main memory as the terminal posture data 109. The process of step S25 is executed after step S24. In step S25, the CPU 10 executes the recognition processing of the camera image. That is, the CPU 10 performs predetermined recognition processing on the camera image data 104. This recognition processing may be any processing that detects some feature from the camera image and outputs the result. For example, when the player's face is included in the camera image, it may be processing that recognizes the face; specifically, it may be processing that detects parts of the face (the eyes, nose, mouth, and the like) or processing that detects the expression of the face. The data representing the result of the recognition processing is stored in the main memory as the image recognition data 110. The process of step S26 is executed after step S25. In step S26, the CPU 10 executes the recognition processing of the microphone sound. That is, the CPU 10 performs predetermined recognition processing on the microphone sound data 105. This recognition processing may be any processing that detects some feature from the microphone sound and outputs the result. For example, it may be processing that detects an instruction from the player in the microphone sound, or processing that merely detects the volume of the microphone sound. The data representing the result of the recognition processing is stored in the main memory as the sound recognition data 111. The process of step S27 is executed after step S26. In step S27, the CPU 10 executes the game processing in accordance with the game input. Here, the game input may be any data transmitted from the controller 5 or the terminal device 7, or any data obtained from such data. Specifically, the game input may be, in addition to the various data included in the controller operation data 92 and the terminal operation data 97, the data obtained from those data (the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the sound recognition data 111). The content of the game processing in step S27 may be anything; for example, it may be processing that moves an object (character) appearing in the game, processing that controls a virtual camera, or processing that moves a cursor displayed on the screen. It may also be processing that uses the camera image (or a part of it) as a game image, processing that uses the microphone sound as a game sound, and the like. Examples of such game processing will be described later. In step S27, data representing the results of the game control processing, such as the data of the various parameters set for the characters (objects) appearing in the game, the data of the parameters of the virtual cameras arranged in the game space, and score data, are stored in the main memory. After step S27, the CPU 10 ends the game control processing of step S4. Returning to the description of Fig. 12: in step S5, the television game image to be displayed on the television 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read the data representing the result of the game control processing of step S4 from the main memory, read the data necessary for generating the game image from the VRAM 11d, and generate the game image.
The game image may be generated by any method as long as it represents the result of the game control processing of step S4. For example, the method of generating the game image may be a method of generating a three-dimensional CG image by arranging a virtual camera in the virtual game space and calculating the game space as viewed from the virtual camera, or a method of generating a two-dimensional image (without using a virtual camera). The generated television game image is stored in the VRAM 11d. The process of step S6 is executed after the above step S5. In step S6, the terminal game image to be displayed on the terminal device 7 is generated by the CPU 10 and the GPU 11b. Like the television game image described above, the terminal game image may be generated by any method as long as it represents the result of the game control processing of step S4. The terminal game image may be generated by the same method as the television game image, or by a different method. The generated terminal game image is stored in the VRAM 11d. Depending on the content of the game, the television game image and the terminal game image may be identical; in that case, the game image generation processing need not be executed in step S6. The process of step S7 is executed after the above step S6.
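The generation of the two game images in steps S5 and S6 can be summarized by the following C sketch: one shared game space, rendered once per virtual camera. The types and the render() call are placeholders for whatever drawing path the GPU 11b actually exposes.

```c
typedef struct { float pos[3]; float rot[4]; float fov; } VirtualCamera;
typedef struct { int id; } GameSpace;
typedef struct { int w, h; void *pixels; } Framebuffer;

/* placeholder for the actual draw path */
static void render(const GameSpace *gs, const VirtualCamera *cam,
                   Framebuffer *out) { (void)gs; (void)cam; (void)out; }

void generate_game_images(const GameSpace *gs,
                          const VirtualCamera *cam_tv,       /* step S5 */
                          const VirtualCamera *cam_terminal, /* step S6 */
                          Framebuffer *tv_image,
                          Framebuffer *terminal_image)
{
    /* The game control processing of step S4 ran once; only the drawing
     * is done twice, which is why generating both images is cheaper than
     * running the game processing separately for each display. */
    render(gs, cam_tv, tv_image);             /* stored in the VRAM 11d  */
    render(gs, cam_terminal, terminal_image); /* compressed and sent in
                                                 step S10               */
}
```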
In step S7, the television game sound to be output to the speaker 2a of the television 2 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound in accordance with the result of the game control processing of step S4. The generated game sound is, for example, a sound effect of the game, the voice of a character appearing in the game, BGM (background music), or the like. The process of step S8 is executed after the above step S7. In step S8, the terminal game sound to be output to the speaker 67 of the terminal device 7 is generated. That is, the CPU 10 causes the DSP 11c to generate a game sound in accordance with the result of the game control processing of step S4. The terminal game sound may be the same as or different from the television game sound. The two may also differ only partially, for example with different sound effects but the same BGM. When the television game sound and the terminal game sound are identical, the game sound generation processing need not be executed in step S8. The process of step S9 is executed after the above step S8. In step S9, the CPU 10 outputs the game image and the game sound to the television 2. Specifically, the CPU 10 sends the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. In response, the AV-IC 15 outputs the image and sound data to the television 2 via the AV connector 16. Thereby, the television game image is displayed on the television 2, and the television game sound is output from the speaker 2a. The process of step S10 is executed after step S9. In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the image data of the terminal game image stored in the VRAM 11d and the sound data generated by the DSP 11c in step S8 are sent by the CPU 10 to the codec LSI 27, which performs predetermined compression processing on them. The compressed image and sound data are then transmitted to the terminal device 7 by the terminal communication module 28 via the antenna 29. The terminal device 7 receives the image and sound data transmitted from the game device 3 with the wireless module 70, and the codec LSI 66 performs predetermined decompression processing on them. The decompressed image data is output to the LCD 51, and the decompressed sound data is output to the sound IC 68. Thereby, the terminal game image is displayed on the LCD 51, and the terminal game sound is output from the speaker 67. The process of step S11 is executed after step S10. In step S11, the CPU 10 determines whether or not to end the game. The determination of step S11 is made, for example, according to whether the game is over, whether the user has given an instruction to terminate the game, or the like. When the result of the determination of step S11 is negative, the process of step S2 is executed again. On the other hand, when the result of the determination of step S11 is affirmative, the CPU 10 ends the game processing shown in Fig. 12. Thereafter, the series of processes of steps S2 to S11 is repeatedly executed until it is determined in step S11 that the game is to be ended. As described above, in the present embodiment, the terminal device 7 includes the touch panel 52 and inertial sensors, namely the acceleration sensor 63 and the gyro sensor 64, and the outputs of the touch panel 52 and the inertial sensors are transmitted to the game device 3 as operation data and used as game input (steps S3, S4). Further, the terminal device 7 includes a display device (the LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6, S10). Therefore, the user can directly perform touch operations on the game image using the touch panel 52, and, since the movement of the terminal device 7 can be detected by the inertial sensors, can also perform the operation of moving the LCD 51 itself on which the game image is displayed. With these operations, the user can play with the operational feeling of directly operating the game image, so that games with a novel operational feeling, such as the first and second game examples described later, can be provided. Further, in the present embodiment, the terminal device 7 includes the analog sticks 53 and the operation keys 54, which can be operated while the terminal device 7 is held, and the operations performed on the analog sticks 53 and the operation keys 54 can be used as game input (steps S3, S4). Therefore, detailed game operations can be performed by stick operations and key operations. Furthermore, in the present embodiment, the terminal device 7 includes the camera 56 and the microphone 69, and the data of the camera image captured by the camera 56 and the data of the microphone sound detected by the microphone 69 are transmitted to the game device 3 (steps S3, S4). Therefore, the game device 3 can use the camera image and/or the microphone sound as game input, so the user can also perform game operations by the operation of capturing an image with the camera 56 or the operation of inputting sound into the microphone 69.
These operations can be performed while holding the terminal device 7, so when the user directly operates the game image as described above, the user can additionally perform these operations, enabling a wider variety of game operations. Further, since the game image is displayed on the LCD 51 of the portable terminal device 7 (steps S6, S10), the user can freely place the terminal device 7. Therefore, when operating the controller 5 by pointing it toward a marker, the user can, by placing the terminal device 7 at a position of the user's choosing, point the controller 5 in a free direction to play the game, which improves the degree of freedom of operation of the controller 5. Further, since the terminal device 7 can be placed at any position, a game with a greater sense of realism can be provided by placing the terminal device 7 at a position suited to the content of the game, as in the fifth game example described later. Further, according to the present embodiment, the game device 3 acquires operation data and the like from both the controller 5 and the terminal device 7 (steps S2, S3), so the user can use the two devices, the controller 5 and the terminal device 7, as operation means. Therefore, in the game system 1, a plurality of users can play a game with each user using one of the devices, or one user can play a game using the two devices. Further, according to the present embodiment, the game device 3 generates two kinds of game images (steps S5, S6) and can display the game images on the television 2 and the terminal device 7 (steps S9, S10). By displaying the two kinds of game images on different devices in this manner, game images that are easier for the user to view can be provided, and the operability of the game can be improved. For example, when two players play, as in the third and fourth game examples described later, a game image from a viewpoint that is easy for one user to view can be displayed on the television 2 and a game image from a viewpoint that is easy for the other user to view can be displayed on the terminal device 7, so that each player can play from a viewpoint that is easy to view. Further, even when one player plays, as in the first, second, and fifth game examples described later, two kinds of game images from different viewpoints can be displayed, so that the player can more easily grasp the state of the game space, and the operability of the game is improved.

[6. Game Examples]

Next, specific examples of games played in the game system 1 will be described. In the game examples described below, some of the components of the devices of the game system 1 may not be used, and some of the series of processes shown in Figs. 12 and 13 may not be executed. That is, the game system 1 need not include all of the components described above, and the game device 3 may omit part of the series of processes shown in Figs. 12 and 13.

(First game example)

The first game example is a game in which an object (a dart) is shot within the game space by operating the terminal device 7. The player can instruct the direction in which the dart is shot by an operation of changing the posture of the terminal device 7 and an operation of drawing a line on the touch panel 52. Fig. 14 is a diagram showing the screen of the television 2 and the terminal device 7 in the first game example. In Fig. 14, a game image representing the game space is displayed on the television 2 and on the LCD 51 of the terminal device 7. A dart 121, a control surface 122, and a target 123 are displayed on the television 2. The control surface 122 (and the dart 121) are displayed on the LCD 51.
In the first game example, the player plays the game by shooting the dart 121 at the target 123 through operations using the terminal device 7. When shooting the dart 121, the player first changes the posture of the control surface 122 arranged in the virtual game space to a desired posture by operating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 63 and the gyro sensor 64) and the magnetic sensor 62 (step S24), and changes the posture of the control surface 122 based on the calculated posture (step S27). In the first game example, the posture of the control surface 122 is controlled so as to correspond to the posture of the terminal device 7 in real space. That is, the player can change the posture of the control surface 122 in the game space by changing the posture of the terminal device 7 (and of the control surface 122 displayed on the terminal device 7). In the first game example, the position of the control surface 122 is fixed at a predetermined position in the game space. Next, the player performs an operation of drawing a line on the touch panel 52 using a stylus 124 or the like (see the arrow shown in Fig. 14). Here, in the first game example, the control surface 122 is displayed on the LCD 51 of the terminal device 7 such that the input surface of the touch panel 52 corresponds to the control surface 122. Therefore, from the line drawn on the touch panel 52, the direction on the control surface 122 (the direction that the line represents) can be calculated. The dart 121 is shot in the direction determined in this manner. From the above, the CPU 10 performs the processing of calculating the direction on the control surface 122 from the touch position data 100 of the touch panel 52 and moving the dart 121 in the calculated direction (step S27). The CPU 10 may control the speed of the dart 121 in accordance with, for example, the length of the line or the speed at which the line is drawn. As described above, according to the first game example, by using the output of the inertial sensors as game input, the game device 3 can move the control surface 122 in accordance with the movement (posture) of the terminal device 7, and by using the output of the touch panel 52 as game input, it can identify a direction on the control surface 122. According to this, the player can move the game image displayed on the terminal device 7 (the image of the control surface 122) and perform touch operations on that game image, and can therefore play with the novel operational feeling of directly operating the game image. Further, in the first game example, a direction in three-dimensional space can be easily instructed by using the sensor outputs of the inertial sensors and the touch panel 52 as game input. That is, the player can easily instruct a direction by an intuitive operation, as if actually inputting a direction in space, by adjusting the posture of the terminal device 7 with one hand while inputting the direction as a line on the touch panel 52 with the other hand. Furthermore, since the player can simultaneously perform the operation on the posture of the terminal device 7 and the input operation on the touch panel 52, the operation of instructing a direction in three-dimensional space can be performed quickly.
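The input mapping of the first game example, with the terminal posture orienting the control surface 122 and a drawn line picking a direction on it, can be illustrated as follows. The quaternion posture and the mapping of the touch plane onto the surface's local axes are assumptions made for illustration.

```c
#include <math.h>

typedef struct { float w, x, y, z; } Quat;
typedef struct { float x, y, z; } Vec3;

/* Rotate v by unit quaternion q: v' = 2(u.v)u + (s^2 - u.u)v + 2s(u x v) */
static Vec3 rotate(Quat q, Vec3 v)
{
    Vec3 u = { q.x, q.y, q.z };
    float s  = q.w;
    float uv = u.x*v.x + u.y*v.y + u.z*v.z;
    float uu = u.x*u.x + u.y*u.y + u.z*u.z;
    Vec3 c = { u.y*v.z - u.z*v.y, u.z*v.x - u.x*v.z, u.x*v.y - u.y*v.x };
    Vec3 r = {
        2.0f*uv*u.x + (s*s - uu)*v.x + 2.0f*s*c.x,
        2.0f*uv*u.y + (s*s - uu)*v.y + 2.0f*s*c.y,
        2.0f*uv*u.z + (s*s - uu)*v.z + 2.0f*s*c.z,
    };
    return r;
}

/* (x0,y0)-(x1,y1): start and end of the line drawn on the touch panel
 * 52, in a frame whose axes are assumed to match the control surface
 * 122; posture: terminal posture data 109. Returns the dart's direction
 * in game-space coordinates. */
Vec3 dart_direction(Quat posture, float x0, float y0, float x1, float y1)
{
    Vec3 local = { x1 - x0, 0.0f, y1 - y0 };  /* stroke in the surface's
                                                 local plane */
    return rotate(posture, local);            /* carried into the world */
}
```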
On the other hand, an image of the game space including the entire control surface 122 and the target 123 is displayed on the television 2 (see Fig. 14), so the player can aim at the target 123 while easily grasping the posture of the control surface 122. That is, in the above step S27, the first virtual camera for generating the television game image is set such that the entire control surface 122 and the target 123 are included in its field of view, and the second virtual camera for generating the terminal game image is set such that the screen of the LCD 51 (the input surface of the touch panel 52) coincides with the control surface 122 on the screen. Therefore, in the first game example, the game operation can be made easier by displaying images of the game space viewed from different viewpoints on the television 2 and the terminal device 7.

(Second game example)
Games that use the sensor outputs of the inertial sensors and the touch panel 52 as game inputs are not limited to the first game example above; various game examples are conceivable.
戲圖像的一例圖。政.外,.第19圖係顯示第4遊戲例中顯示 於終端裝置7之終端用遊戲圖像的一例圖。如第18圖所 示’第4遊戲例中,飛機(飛機物件)151與標靶(氣球物件) 153於虛擬遊戲空間中登場。此外,飛機151具有大砲(大 78 323304 201220108 砲物件)152。 如第18圖所示’包含飛機151之遊戲空間的圖像係顯 示作為電視用遊戲圖像。用以產生電視用遊戲圖像之第1 虛擬攝影機,係以產生從後方觀看飛機151之遊戲空間的 圖像之方式來設定。亦即,第1虛擬攝影機,係以飛機151 被包含於攝影範圍(視野範圍)之姿勢來配置在飛機151的 後方位置。此外,第1虛擬攝影機被控制為伴隨著飛機ΐ5ι 的移動而移動。亦即’ CPU 10在上述步驟S27的處理中, 根據控制器操作資料來控制飛機151的移動,並且控制第 1虛擬攝影機的位置及姿勢。如此,第丨虛擬攝影機的位 置及姿勢係因應第1位遊戲者的操作而被控制。 另一方面,如第19圖所示,從飛機151 (更具體而言 為大砲152)所觀看之遊戲空間的圖像係顯示作為終端用遊 戲圖像。因此,用以產生終端用遊戲圖像之第2虛擬攝影 機’係配置在飛機151的位置(更具體而言為大砲152的位 ° CPU 1〇在上述步驟S27的處理中,根據控制器操作 貝料來控制飛機151的移動,並且控制第2虛擬攝影機的 位置。第2虛擬攝影機亦可配置在飛機151或大砲152周 邊的位置(例如大砲152稍微後方的位置)。如上所述,第 2虛_影機的位置是由(操作飛機151的移動之)第i位 $戲者的操作來控制。因此,第4遊戲例中,第丨虛擬攝 影機及第2虛擬攝影機係連動地移動。 ,此外’朝大饱152的發射方向觀看之遊戲空間的圖像 係顯示作為終端用遊戲圖像。在此,大砲152的發射方向, 79 323304 201220108 係控制為對應於終端裝置7的姿勢。亦即,本實施形態中, 第2虛擬攝影機的姿勢’係控制為使第2虛擬攝影機的視 線方向與大砲152的發射方向一致。cpu 1〇在上述步驟S27 的處理中,因應上述步驟S24中所算出之終端裝置γ的姿 勢,來控制大砲152的朝向及第2虛擬攝影機的姿勢。如 此,第2虛擬攝影機的姿勢是藉由第2位遊戲者的操作來 控制。此外,第2位遊戲者藉由改變終端裝置7的姿勢, 可改變大砲152的發射方向。 當從大砲152發射砲彈時,第2位遊戲者壓下終端裝 置7的預定鍵。壓下預定鍵時,砲彈從大砲152的朝向被 發射。終端用遊戲圖像中,於LCD51的晝面中央顯示出準 星154,砲彈往準星154所指示的方向發射。 如上所述,第4遊戲例中,第丨位遊戲者,主要一邊 觀看顯不ib往飛機151的行衫向觀看之賴空間的電視 用遊戲圖像(第18圖),(例如以往期望標把153的方向移 動之方式)-邊操作飛機15卜另—方面,第2位遊戲者, 主要一邊觀看顯示出往大砲152的發射方向觀看之遊戲空 間的終端用遊戲圖像(第19圖),一邊操作大砲152。如此, 第4遊戲例中,在2位遊戲者彼此合作之形式的遊戲中, :刀=將對各遊戲者而言容易觀看且容易操作之遊戲圖 像’分別顯兔於電-視2及終端.裝置7。 此外,第4遊戲射,藉由第丨位遊戲者的操作來控 第1虛擬攝影機及第2虛擬攝影機的位置,藉由第2位 遊戲者的操作來控制第2虛擬攝影機的姿勢。亦即,本實 323304 ⑧ 80 201220108 =形態中,因應各遊戲者的各個遊戲操作而使虛擬攝影機 戧t置或姿勢產生變化,結果可使顯示於各顯示裝置之遊 处工間的顯示範圍產生變化。由於顯示於顯示裝置之遊戲 二間的顯不範圍因應各遊戲者的操作而產生變化,因此, \戲者可實地感文到自己的遊戲操作充分地反映於遊戲 的進行中,而能夠充分地享受遊戲。 第4遊戲例中’係於電視2顯示出從飛機151的後方 =觀看之遊戲圖像’於終端裂置7顯示出從飛機⑸之大 ^的位置所觀看之遊戲圖像。在此,其他遊齡】中,遊戲 置3亦可於終端裝置7顯示出從飛機151的後方所觀看 之遊戲圖像’於電視2顯示出從飛機151之大砲152的位 置所觀看之遊戲圖像。此時,各遊戲者的工作,與上述第 4_遊戲例替換,可狀為第1位遊戲者使用控制器5來進 仃大砲152的操作,第2位遊戲者使用終端裝置7來進行 飛機151的操作。 (第5遊戲例) 、以下參照第2〇圖來說明第5遊戲例。第5遊戲例,為 ,戲者使用控制器5來進行操作之遊戲,終端裝置7並非 操作政置,而是用作為顯示裝置。具體而言,第5遊戲例 為高爾夫球遊戲,因應遊戲者將控制器5如高球桿般地揮 舞之操作(揮桿操作),遊戲裝置3在虛擬遊戲空間中的遊 戲者角色中,進行高爾夫球的揮桿動作。 第20圖係顯示第5遊戲例中使用遊戲系統丨之模樣 圖第20圖中’於電視2的晝面顯示出包含遊戲者角色(的 323304 81 201220108 物件)161及高球桿(的物件)162之遊戲空間 圖中,雖隱藏於高球桿162而未被顯示 第20 間之球(的物❹63亦顯示於電視2。另3面置在遊戲空 圖所示,終端裝置7係以使LCD51的晝面垂 第2〇 ;置在電視2之前方正面的地板上。於終端裝置7顯示出式 知球163之圖像、顯示高球桿162的 高球桿服的桿頭162a)之圖像、^^體而吕為 之圖終端料細像輕上雜面 進灯遊戲時,遊戲者16〇站在終端 ,仃將控制器5如高球捍般地揮舞操作。、此時,、c叩來 在上述步驟S27的處理中,因應上述步驟犯 三 5 =勢,來控制遊戲空間中之高球;162 : 置及姿勢。具體而言,高球桿162,當控 山 方向(第3圖所示之Z軸正方向)朝向於咖51;干:: 的圖像時,係控制為遊戲”内的高球桿⑽擊出= 此外,當控制器5的前端方向朝向咖51時 =51顯示出高球桿162的一部分之圖像(桿頭圖像⑽ ^照第20圖關於終端用遊戲圖像,為了增加臨場感, 此夠以實物大來顯示球163的圖像,或是顯示油應控制 器5繞著z軸的旋轉使桿頭圖像164的朝向旋轉。此外, 終端用遊戲圖像,亦可使用設置在賴”之虛擬攝影機 來產生,或是使用預先準備的圖像資料來產生。使用預先 準備的圖像資料來產生時’不需詳細建構高爾夫球場的地 323304 82 201220108 形模型,而能夠以較小處理^ ^ 藉由遊戲者160、隹」 實的圖像。 162,結果當高 二上述揮桿操作來揮舞高球桿 出)。亦即,CPU Γ〇〇 63時,球163會移動(飛 球⑽是否接觸述步驟奶中判定高球桿162與 达私圓接觸,接觸時則將球163移動。在此,電視 遊戲圖像係以包含有移動後的球163之方:視: 即’咖10係以使移動的球包含於該攝影·之方式= 控制用以產生電視用遊戲圖像之第丨虛擬攝影機的位置及 姿勢。另-方面’終端裝置7中當高球桿162擊 時’球163的圖像移動並立即消失於晝面外。因此,第5 遊戲例中,球移動的模樣主要顯示於電視2,遊戲者_ 可藉由電視用遊戲圖像來確認因揮桿操作所飛出之球的去 向。 。如上所述,第5遊戲例中,遊戲者16〇可藉由揮舞控 制器5來揮舞高球桿162(使遊戲者角色161揮舞高球^ 162)。在此,第5遊戲例中,當控制器.5的前端方^朝: 在LCD 51所顯示之球163的圖像時,係控制為遊戲空間二 的高球桿162擊出球163。因此,遊戲者可藉由“ 而得到實際揮出高球桿之感覺,而能夠使揮桿操作更具臨 場感。 一 第5遊戲例卜當控制器5的前端方向朝向終端 7時’更在LCD 51顯示桿頭圖像164。因此,遊戲者 由將控制器5的前端方向朝向終端裝置7,而得到可^ 間中之高球桿162的姿勢與實際空間中之控制器5的== 323304 83 201220108 相對應之感覺,而能夠使揮桿操作更具臨場感。 如上所述,第5遊戲例,當將終端裝置7用作為顯示 裝置時,藉由適當地配置終端裝置7的位置,可讓使用控 制器5之操作更具臨場感。 此外,上述第5遊戲例中,終端裝置7配置在地面, 於終端裝置7顯示出僅顯示球163周邊的遊戲空間之圖 像。因此,於終端裝置7中,無法顯示遊戲空間中之全體 高球桿162的位置及姿勢,此外,於終端裝置7中,無法 顯示在揮桿操作後球163移動的模樣。因此,第5遊戲例 中,在球163的移動前,係於電視2顯示全體高球桿162, 在球163的移動後,於電視2顯示球163移動的模樣。如 此,根據第5遊戲例,可將更具臨場感之操作提供給遊戲 者,並且可使用電視2及終端裝置7的兩個畫面將容易觀 看的遊戲圖像提示於遊戲者。 此外,上述第5遊戲例中,為了算出控制器5的姿勢, 係使用終端裝置7的標示部55。亦即,CPU 
10在上述步驟 S1的初始處理中將標示部55點燈(標示裝置6未點燈), 並且CPU 10在上述步驟S23中根據標示器座標資料96來 算出控制器5的姿勢。根據此,可正確地判定控制器5的 前端方向是否為朝向標示部55之姿勢。上述第5遊戲例 中,_雖可不執行上述步驟S21及S22,但在其他遊戲例中., 亦可藉由執行上述步驟S21及S22的處理而在遊戲中途變 更應予點燈之標示器。例如,CPU 10在步驟S21中,根據 第1加速度資料94來判定控制器5的前端方向是否為朝向 84 323304 ⑧ 201220108 重力方向,在步驟S22中,係控制為當朝向重力方向時將 標不部55點燈’未朝向重力方向時將標示裝置6點燈。根 據此,當控制器5的前端方向朝向重力方向時,可藉由取 得標示部55的標示器座標資料而精度佳地算出控制器5的 姿勢,並且控制器5的前端方向朝向電視2時,藉由取得 標示裝置6的標示器座標資料而精度佳地算出控制器5的 姿勢。 如上述第5遊戲例所說明,遊戲系統丨,可將終端裝 置7设置在自由位置並應用作為顯示裝置。根據此,當將 標示器座標資料用作為遊戲輸人時,除了將控制器5朝向 電視2來使用之外,亦可藉由將終端裝置7蚊在期望位 1而使控制H 5朝向自由方向來使用。亦即’根據本實施 形態,可使用控制器5之朝向並未受限制,所以可提升控 制器5的操作自由度。 工 [7.遊戲系統的其他動作例] 、上述遊戲系統1,可進行上述所說明之用以進行各種 遊戲之動作。終端裝置7亦可用作為可搬運型的顯示器或 弟2顯示器,同時亦可用作為進行觸控輸入或依據動作之 輪入之控制器,根據上述遊戲系、统i,可實施各式各樣的 遊戲。此外,亦包含遊戲以外之用途,亦可進行下列動作。 (遊戲者僅使用終端裝置7來玩遊戲之動作例) 本實施形態中’終端裝置7具有顯示裝置的功能,亦 具有操作裝置的功能。因此,不使用電視及控制器5而僅 將終端裝置7用作為顯示手段及操作手段,藉此°,亦可如 323304 85 201220108 可攜式遊戲裝置般地使用終端裝置7。 依循第12圖所示之遊戲處理來具體地說明,CPU 10 在步驟S3中從終端裝置7取得終端操作資料97,在步驟 S4中僅將終端操作資料97用作為遊戲輸入(不使用控制器 操作資料)來執行遊戲處理。然後在步驟S6中產生遊戲圖 像,在步驟S10中將遊戲圖像傳送至終端裝置7。此時, 可不執行步驟S2、S5、及S9。根據上述内容,係因應對終 端裝置7所進行的操作來進行遊戲處理,並將顯示出遊戲 處理結果之遊戲圖像顯示於終端裝置7。根據此,(實際上 雖然在遊戲裝置中執行遊戲處理,但)亦可將終端裝置7應 用作為可攜式遊戲裝置。因此,根據本實施形態,即使在 有人使用電視2(例如其他人正在收看電視播放)等理由而 無法將遊戲圖像顯示於電視2時,使用者亦可使用終端裝 置7來進行遊戲。 CPU 10並不限於遊戲圖像,對於開啟電源後所顯示之 上述選單晝面,亦可將圖像傳送至終端裝置7來顯示。根 據此,遊戲者可從最初即不需使用電視2進行遊戲,故極 為便利。 再者,上述中,亦可在遊戲中途,將顯示出遊戲圖像 之顯示裝置從終端裝置7變更為電視2。具體而言,CPU 10 可更執行上述步驟S9並將遊戲圖像.輸出至電視2。步驟S9 中被輸出至電視2之圖像,係與步驟S10中被傳送至終端 裝置7之遊戲圖像相同。根據此,以顯示出來自遊戲裝置 3的輸入之方式切換電視2的輸入,可將與終端裝置7相 86 323304 ⑧ 201220108 同之遊戲圖像顯示於電視2,因 顯示裝置變更為電視2。在 可將顯示出遊戲圖像之 可關閉終端裳置7的晝面顯示。,像破顯示於電視2後’ 遊戲系統1中,亦可構成為從 置6、標示部55、或紅外線通訊模^出手段(標示裝 視2所射出的紅外線遙控訊號請72)中,可輪出對電 由因應對終端裝I 7彳 據此,遊戲裝置3,藉 …紅外手段輸 ::::操作電視2的遙控器而能夠使=來: 乍電=因此如上述在切換電視2的輸入 广 (經由網路與其他裝置進行通訊之動作例) C由於遊戲裝置3具有連接於網路之功能, 系純rm㈣網路與外部裝置連接時之遊戲 示,游λ之各裝置的連接關係之圖。如第21圖所 遊戲裝置3可經由網路2〇〇與外部裝置2〇1連接。 如上所述’當外部裝置201與遊戲裝置3可進行通訊 裝’遊戲系統1中’可將終端裝置7作為介面並在與外部 置201之間進行通訊。例如,藉由在外部裝置別1與終 端裝置7之間進行圖像及聲音的接收傳送,可將遊戲系統 1用作為電視電話。具體而言,遊戲裝置3經由網路200 接收來自外部裝置201的圖像及聲音(電話對方的圖像及 聲音)’並將接收到的圖像及聲音傳送至終端裝置7。藉此, 終端裝置7可將來自外部裝置201的圖像顯示於LCD 51, 87 323304 201220108 並且從揚聲器67輸出來自外部裝置謝的聲音。此外,遊 戲裝置3從終端裝置7接收攝影機56所攝影之攝影機圖 像、及麥克風⑽所偵測之參克風聲音,並經由網路_將 攝影機圖像及麥认聲音傳送1外料置2Q1。遊戲製置 3’藉由在與外部裝置2Q1之間重複進行上述圖像及聲音的 接收傳送,可將遊戲系統1用作為電視電話。 本實施形態中,由於終端裝置7為可搬運型,所以使 用者可在自由位置上使用終端梦番7 L aL ^ %褒置7,並使攝影機56朝向 自由方向。此外2實施形態令,由於終端裝置7具備觸 控面板52,二遊戲裝置3亦可將對觸控面板犯所輸入 的輸入資訊(觸控位置資料刚)傳送至外部裝置2〇1。例 如’當藉H置7將來自外部裝置2G1的圖像及聲音 輸出,並且將使用者書寫於觸控面板%上之文字等傳送至 外部裝置2G1時’亦可將遊戲系 統(電子化學習系統)。 W所明系 (與電視播放連動之動作例) 此外,遊^系統1,當以電視2來觀賞電視播放時, 亦能夠與電=放,而動作。亦即,遊戲系統^,當以 電視2來觀員電視節目時,可將| 等輸出至終端裝置7。以下說明遊戲\電:見:目相關之資訊 動而動作時冬動作例。遊戲系統1與電視播放連 上述動作例中,遊戲裝置3可經由㈣與服行 通訊(換言之,第21圖所示之外部裳置2〇1為:器器)。祠 服器係對電視播放的每個頻道記憶與電視播敌相關的各種 323304 ⑧ 88 201220108 資訊(電視資訊)。該電㈣m 之與節目相關之資訊、或是EPg ,、、、子幕或演出者資訊等 或作為資料播放而被播放之資電子化知目表)的資訊、 圖像、聲音、文字、或此等的^此外」電視資訊,可為 不-定需為i·,可對電視播資況。此外,词服器 置舰器,遊齡置3亦 的母_道絲個節目設 當在電視2中輸出電_=服器進行通訊。 置3係令使用者使用終端褒將聲音時,遊戲裝 道輸入。然後,經由絪故i七將觀貝中之電視播放的頻 頻道之電視資訊。因應於此U服益傳送對應於所輸入的 之電視資訊的資料。者接收至卜服器傳送對應於上述頻道 遊戲F晉3,田接收到從伺服器傳送來之資料時, 將上眘B ^接收之資料輸出至終端裝置7。終端襄置7 器輸=:!的圖像及文字資料顯示於1^51。並從揚聲 7°‘享心目貝:斗。藉由上述方式,使用者可使用終端裝置 旱又與目錢賞中的電視節目相關之資訊等。 、/如上所述,遊戲系統1係經由網路與外部裝置(伺服器) 進行通訊’藉此’亦可藉由終端裝置7將與電視播放連動 之資訊提供至使用者。尤其在本實施形態中,由於終端裝 置7為可搬運型,所以使用者可在自由位置上使用終端裝 置7 ’其便利性高。 如以上所述’本實施形態中,除了使用於遊戲之外, 使用者亦能夠以各種用途及形態來使用終端裝置” [8.變形例] 上述實施形態為用以實施本發明之 例,其他實施形 89 323304 201220108 態中,例如亦可在以下所說明之構成中實施本發明。 (具有複數個終端裝置之變形例) 上述實施形態中,遊戲系統1構成為僅具有1個終端 裝置,但遊戲系統1亦可構成為具有複數個終端裝置。亦 即,遊戲裝置3可分別與複數個終端裝置進行無線方式通 訊,可將遊戲圖像的資料、遊戲聲音的資料與控制資料傳 送至各終端裝置,並且從各終端裝置接收操作資料、攝影 機圖像資料與麥克風聲音資料。遊戲裝置3,·係與複數個 終端裝置的各個進行無線方式通訊,此時,遊戲裝置3可 以時間分割方式來進行與各終端裝置之無線方式通訊,或 是分割頻率波段來進行。 如上所述具有複數個終端裝置時,可使用遊戲系統來 進行更多種的遊戲。例如,當遊戲系統1具有2個終端裝 置時,由於遊戲系統1具有3個顯示裝置,所以可產生分 
別用於3位遊戲者的遊戲圖像並顯示於各顯示裝置。此 外,當遊戲系統1具有2個終端裝置時,在將控制器與終 端裝置作為1組來使用之遊戲(例如上述第5遊戲例)中,2 位遊戲者可同時進行遊戲。再者,當根據從2個控制器所 輸出之標示器座標資料來進行上述步驟S27的遊戲處理 時,2位遊戲者可分別使控制器朝向標示器(標示裝置6或 標示部5_5)來進行遊戲.操作。亦即,.某一方的遊戲者使控 制器朝向標示裝置6來進行遊戲操作,另一方的遊戲者使 控制器朝向標示部55來進行遊戲操作。 (關於終端裝置的功能之變形例) 90 323304 ⑧ 201220108 上述實施形態中,終端裝置7具有不執行遊戲處理之 所謂精簡型終端的功能。在此,其他實施形態中,亦可藉 由終端裝置7等的其他裝置,來執行上述實施形態中由遊 戲裝置3所執行之一連串遊戲處理中的一部分處理。例 如,由終端裝置7來執行一部分處理(例如終端用遊戲圖像 的產生處理)。此外,例如在具有可相互進行通訊之複數個 資訊處理裝置(遊戲裝置)之遊戲系統中,該複數個資訊處 理裝置可分擔執行遊戲處理。 (產業利用可能性) 如上所述,本發明係以可讓遊戲者進行新穎的遊戲操 作等者為目的,例如可應用在遊戲系統或遊戲系統中所使 用之終端裝置等。 【圖式簡單說明】 第1圖為遊戲系統1的外觀圖。 第2圖係顯示遊戲裝置3的内部構成之方塊圖。 第3圖係顯示控制器5的外觀構成之立體圖。 第4圖係顯示控制器5的外觀構成之立體圖。 第5圖係顯示控制器5的内部構造之圖。 第6圖係顯示控制器5的内部構造之圖。 第7圖係顯示控制器5的構成之方塊圖。 第8圖係顯示終端裝置7的外觀構成之圖。 第9圖係顯示使用者握持終端裝置7之模樣之圖。 第10圖係顯示終端裝置7的内部構成之方塊圖。 第11圖係顯示遊戲處理中所使用之各種資料之圖。 91 323304 201220108 第12圖係顯示遊戲裝置3中所執行之遊戲處理的流程 之主流程圖。 第13圖係顯示遊戲控制處理的詳細流程之流程圖。 第14圖係顯示第1遊戲例中之電視2的晝面與終端裝 置7之圖。 第15圖係顯示第2遊戲例中之電視2的畫面與終端裝 置7之圖。 第16圖係顯示第3遊戲例中顯示於電視2之電視用遊 戲圖像的一例之圖。 第Π圖係顯示第3遊戲例中顯示於終端裝置7之終端 用遊戲圖像的一例之圖。 第18圖係顯示第4遊戲例中顯示於電視2之電視用遊 戲圖像的一例之圖。 第19圖係顯示第4遊戲例中顯示於終端裝置7之終端 用遊戲圖像的一例之圖。 第20圖係顯示第5遊戲例中使用遊戲系統1之模樣之 圖。 第21圖係顯示經由網路與外部裝置連接時之遊戲系 統1中所包含之各裝置的連接關係之圖。 【主要元件符號說明】 1 遊戲系統 ··. · . · 2 . 電視..... 2a 揚聲器 3 遊戲裝置 4 光碟 5 控制器 6 標示裝置 6L、 6R標示器 323304 ⑧ 92 201220108 7 終端裝置 10 CPU 11 系統LSI 11a 輸出輸入處理器(I/O處理器) lib GPU 11c DSP lid VRAM lie 内部主記憶體 12 外部主記憶體 13 ROM/RTC 14 光碟機 15 AV-IC15 17 快閃記憶體 18 網路通訊模組 19 控制器通訊模組 20 擴充連接器 21 記憶卡用連接器 11、 23、29天線 24 電源鍵 25 重設鍵 26 退片鍵 27 編解碼器LSI 28 終端通訊模組 30 基板 31 外罩 31a 放音孔 32 操作部 32a 十字鍵 32b 1號鍵 32c 2號鍵 32d A鍵 32e 減號鍵 32f 首頁鍵 32g 正號鍵 32h 電源鍵 32i B鍵 33 連接器 34a 至 34d LED 35 攝影資訊運算部 35a 光入射面 36 通訊部 37 加速度感測器 38 紅外線濾波器 39 透鏡 40 攝影元件 41 圖像處理電路 93 323304 % 201220108 42 微電腦 43 記憶體 44 無線方式模組 45 天線 46 振動器 47 揚聲器 48 迴轉感測器 50 外罩 51 LCD 52 觸控面板 53 類比搖桿 53A 左類比搖桿 53B 右類比搖桿 54 操作鍵 54A 十字鍵(方向輸入鍵) 54B至54H操作鍵 541 第1L鍵 54J 第1R鍵 54K 第2L鍵 54L 第2R鍵 55 標示部 55A、 55B標示器 56 攝影機 57 揚聲器孔 58 擴充連接器 59A、 59B足部 60 麥克風用孔 61 觸控面板控制器 62 磁性感測器 63 加速度感測器 64 迴轉感測器 65 使用者介面控制器(UI控制器) 66 編解碼器LSI 67 揚聲器 68 聲音1C 69 麥克風 70 無線方式模組 71 天線 72 紅外綠通訊模組 73 快閃記憶體 74 電源1C 75 電池 76 充電器 77 CPU 78 内部記憶體 90 控制器操作資料 94 323304 ⑧ 201220108 91 接收資料 92 控制器操作資料 93 第1操作鍵資料 94 第1加速度資料 95 第1角速度資料 96 標示器座標資料 97 終端操作資料 98 攝影機圖像資料 99 麥克風聲音資料 100 觸控位置資料 101 第2加速度資料 102 第2角速度資料 103 方位資料 104 攝影機圖像資料 105 麥克風聲音資料 106 處理用資料 107 控制資料 108 控制器姿勢資料 109 終端姿勢資料 110 圖像辨識資料 111 聲音辨識資料 121 飛鏢 122 控制面 123 標靶 131 大砲 132 砲彈 133 標靶 141 打者(打者物件) 142 投手(投手物件) 143 游標 151 飛機(飛機物件) 152 大砲(大砲物件) 153 標靶(氣球物件) 154 準星 160 遊戲者 161 遊戲者角色 162 高球桿 162a 桿頭 163 球 164 桿頭圖像 200 網路 201 外部裝置 95 323304The body control surface 122 and the stem 123: 'display on the television 2 command include the full picture). That is, the image of the game space in the above step S27 (refer to the UY, 72 323304 201220108 first virtual camera for generating the game image for the television, so that the entire control surface 122 and the target 123 are included in the field of view. The second virtual camera for generating the game image for the terminal is set such that the face of the LCD 51 (the input face of the touch panel 52) and the control surface 122 are aligned on the face. Therefore, in the first game example, the game operation can be performed more easily by displaying the image of the game space viewed from different viewpoints on the television 2 and the terminal device 7. (Second game example) The inertial sensor is used. The game output of the touch panel 52 is used as a game input game, and is not limited to the first game example described above, and various game examples can be considered. 
The second game example, like the first, is a game in which an object (a cannonball) is fired in the game space by operating the terminal device 7. The player can instruct the direction in which the cannonball is fired by an operation of changing the posture of the terminal device 7 and an operation of designating a position on the touch panel 52.

Fig. 15 shows the screen of the television 2 and the terminal device 7 in the second game example. In Fig. 15, a cannon 131, a cannonball 132, and a target 133 are displayed on the television 2, and the cannonball 132 and the target 133 are displayed on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image of the game space viewed from the position of the cannon 131.

In the second game example, the player can change the display range shown on the terminal device 7 as the terminal game image by operating the posture of the terminal device 7. That is, the CPU 10 calculates the posture of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 63 and the gyro sensor 64) and the magnetic sensor 62 (step S24), and controls the position and posture of the second virtual camera for generating the terminal game image based on the calculated posture (step S27). Specifically, the second virtual camera is placed at the position of the cannon 131, and its orientation (posture) is controlled in accordance with the posture of the terminal device 7. The player can thus change the range of the game space displayed on the terminal device 7 by changing the posture of the terminal device 7.

Further, in the second game example, the player designates the firing direction of the cannonball 132 by an operation of inputting a point on the touch panel 52 (a touch operation). Specifically, in the processing of the above step S27, the CPU 10 calculates the position in the game space corresponding to the touch position (the control position), and calculates, as the firing direction, the direction from a predetermined position in the game space (for example, the position of the cannon 131) toward the control position. A process of moving the cannonball 132 in the firing direction is then performed. Thus, whereas the player draws a line on the touch panel 52 in the first game example, in the second game example the player performs an operation of designating a point on the touch panel 52. The control position can be calculated by setting a control surface similar to that of the first game example (although the control surface is not displayed in the second game example). That is, the control surface is placed in accordance with the posture of the second virtual camera so as to correspond to the display range on the terminal device 7 (specifically, the control surface rotates about the position of the cannon 131 in response to changes in the posture of the terminal device 7), and the position on the control surface corresponding to the touch position can be calculated as the control position. (A sketch of this computation is given at the end of this passage.)

As described above, according to the second game example, the game device 3 can change the display range of the terminal game image in accordance with the movement (posture) of the terminal device 7 by using the output of the inertial sensors as a game input, and can specify a direction in the game space (the firing direction of the cannonball 132) by using, as a game input, a touch input designating a position within that display range.
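A sketch of the control-position computation just described follows, under the same assumed conventions as the earlier sketch. Here the touched point is mapped onto a control surface that sits a fixed distance in front of the cannon 131 and rotates about it with the posture of the terminal device 7; the surface size and distance parameters are illustrative assumptions.

    // Minimal sketch of the second game example's step S27: touch position ->
    // control position on the (undisplayed) control surface -> firing direction.
    #include <cmath>

    struct Vec3 { float x, y, z; };
    struct Mat3 { float m[3][3]; }; // posture of the terminal device 7 (step S24)

    static Vec3 rotate(const Mat3& r, const Vec3& v) {
        return { r.m[0][0]*v.x + r.m[0][1]*v.y + r.m[0][2]*v.z,
                 r.m[1][0]*v.x + r.m[1][1]*v.y + r.m[1][2]*v.z,
                 r.m[2][0]*v.x + r.m[2][1]*v.y + r.m[2][2]*v.z };
    }

    static Vec3 normalize(Vec3 v) {
        float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
        return len > 0.0f ? Vec3{ v.x/len, v.y/len, v.z/len } : v;
    }

    // Touch coordinates assumed normalized to [-0.5, 0.5] about the screen
    // center, so the screen center maps to the center of the control surface.
    Vec3 firingDirection(float touchU, float touchV,
                         Vec3 cannonPos,              // predetermined position (cannon 131)
                         const Mat3& terminalPosture,
                         float surfaceWidth, float surfaceHeight,
                         float surfaceDistance) {     // surface's distance ahead of the cannon
        // Point on the control surface in local coordinates, rotated about the
        // cannon so the surface tracks the second virtual camera's display range.
        Vec3 local  = { touchU * surfaceWidth, touchV * surfaceHeight, surfaceDistance };
        Vec3 offset = rotate(terminalPosture, local);
        Vec3 controlPos = { cannonPos.x + offset.x,
                            cannonPos.y + offset.y,
                            cannonPos.z + offset.z };
        // Firing direction of the cannonball 132: from the cannon toward the
        // control position.
        return normalize({ controlPos.x - cannonPos.x,
                           controlPos.y - cannonPos.y,
                           controlPos.z - cannonPos.z });
    }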
Thus, in the second game example, as in the first, the player can move the game image displayed on the terminal device 7 and perform touch operations on that game image, and can therefore play with the novel operation feeling of operating directly on the game image.

Also as in the first game example, the player can instruct a direction intuitively and easily, as if actually inputting a direction in space, by adjusting the posture of the terminal device 7 with one hand while performing touch input on the touch panel 52 with the other. Moreover, since the player can operate the posture of the terminal device 7 and perform input on the touch panel 52 simultaneously, the operation of instructing a direction in three-dimensional space can be performed quickly.

The image displayed on the television 2 in the second game example may be an image from the same viewpoint as that of the terminal device 7, but in Fig. 15 the game device 3 displays an image viewed from a different viewpoint. That is, whereas the second virtual camera for generating the terminal game image is set at the position of the cannon 131, the first virtual camera for generating the television game image is set at a position behind the cannon 131. Here, for example, by displaying on the television 2 a range that cannot be seen on the screen of the terminal device 7, it is possible to realize a way of playing in which the player watches the screen of the television 2 to aim at a target 133 that cannot be seen on the screen of the terminal device 7. By making the display ranges of the television 2 and the terminal device 7 different in this way, it becomes not only easier to grasp the state of the game space but also possible to make the game more entertaining.

As described above, according to the present embodiment, the terminal device 7, which includes the touch panel 52 and the inertial sensors, can be used as an operation device, so games with the operation feeling of operating directly on the game image, as in the first and second game examples above, can be realized.

(Third game example)
The third game example will be described below with reference to Figs. 16 and 17. The third game example is a baseball game in the form of a match between two players: the first player operates a batter using the controller 5, and the second player operates a pitcher using the terminal device 7. On the television 2 and the terminal device 7, game images that make the game operation easy for the respective players are displayed.

Fig. 16 shows an example of the television game image displayed on the television 2 in the third game example. The television game image shown in Fig. 16 is an image mainly for the first player. That is, it shows the game space with the pitcher (pitcher object) 142, the operation target of the second player, viewed from the side of the batter (batter object) 141, the operation target of the first player. The first virtual camera for generating the television game image is placed at a position behind the batter 141 and oriented from the batter 141 toward the pitcher 142.

On the other hand, Fig. 17 shows an example of the terminal game image displayed on the terminal device 7 in the third game example. The terminal game image shown in Fig. 17 is an image mainly for the second player.
That is, the terminal game image shows the game space with the batter 141, the operation target of the first player, viewed from the side of the pitcher 142, the operation target of the second player. Specifically, in the above step S27, the CPU 10 controls the second virtual camera used for generating the terminal game image based on the posture of the terminal device 7. The posture of the second virtual camera is calculated so as to correspond to the posture of the terminal device 7, as in the second game example above. The terminal game image also includes a cursor 143 for indicating the direction in which the pitcher 142 will throw the ball.

The method by which the first player operates the batter 141 and the method by which the second player operates the pitcher 142 may be any methods. For example, the CPU 10 may detect a swing operation performed on the controller 5 based on the output data of the inertial sensors of the controller 5, and cause the batter 141 to swing the bat in response to the swing operation. Also, for example, the cursor 143 may be moved in response to operations on the analog stick 53, and when a predetermined button among the operation buttons 54 is pressed, the pitcher 142 may perform a pitching motion toward the position indicated by the cursor 143. The cursor 143 may also be moved in accordance with the posture of the terminal device 7 instead of by operations on the analog stick 53.

As described above, in the third game example, game images with mutually different viewpoints are generated on the television 2 and the terminal device 7, so a game image that is easy to view and easy to operate with can be provided to each player.

In the third game example, two virtual cameras are set in a single game space, and two game images viewing the game space from the respective virtual cameras are displayed (Figs. 16 and 17). Therefore, as to the two game images generated in the third game example, the game processing performed on the game space (control of the objects in the game space, and so on) is almost entirely shared, and each image can be produced merely by performing the drawing processing twice on the shared game space, which is more efficient than performing the game processing separately for each image.

Also, in the third game example, the cursor 143 indicating the pitching direction is displayed only on the terminal device 7 side, so the first player cannot see the position indicated by the cursor 143. This avoids the drawback, in game terms, of the pitching direction becoming known to the first player to the disadvantage of the second player. Thus, in the present embodiment, when a game image that one player sees would create such a drawback for the other player, that image need only be displayed on the terminal device 7; this prevents problems such as a loss of strategic depth in the game. In other embodiments, depending on the game content, the game device 3 may display the terminal game image on the television 2 together with the television game image even where the terminal game image is seen by the first player (for example, when no such drawback arises).

(Fourth game example)
The fourth game example will be described below with reference to Figs. 18 and 19. The fourth game example is a game in which two players cooperate with each other: the first player performs an operation of moving an airplane using the controller 5, and the second player performs an operation of controlling the firing direction of the airplane's cannon using the terminal device 7.
In the fourth game example, as in the third, game images that make the game operation easy for the respective players are displayed on the television 2 and the terminal device 7.

Fig. 18 shows an example of the television game image displayed on the television 2 in the fourth game example, and Fig. 19 shows an example of the terminal game image displayed on the terminal device 7 in the fourth game example. As shown in Fig. 18, in the fourth game example an airplane (airplane object) 151 and targets (balloon objects) 153 appear in the virtual game space, and the airplane 151 has a cannon (cannon object) 152.

As shown in Fig. 18, an image of the game space including the airplane 151 is displayed as the television game image. The first virtual camera for generating the television game image is set so as to produce an image of the game space in which the airplane 151 is viewed from behind. That is, the first virtual camera is placed at a position behind the airplane 151, in a posture in which the airplane 151 is included in its imaging range (field of view). The first virtual camera is also controlled to move along with the movement of the airplane 151; that is, in the processing of the above step S27, the CPU 10 controls the movement of the airplane 151 based on the controller operation data, and also controls the position and posture of the first virtual camera. The position and posture of the first virtual camera are thus controlled by the operation of the first player.

On the other hand, as shown in Fig. 19, an image of the game space viewed from the airplane 151 (more specifically, from the cannon 152) is displayed as the terminal game image. Accordingly, the second virtual camera for generating the terminal game image is placed at the position of the airplane 151 (more specifically, at the position of the cannon 152). In the processing of the above step S27, the CPU 10 controls the movement of the airplane 151 based on the controller operation data and also controls the position of the second virtual camera. The second virtual camera may instead be placed at a position near the airplane 151 or the cannon 152 (for example, slightly behind the cannon 152). As described above, the position of the second virtual camera is controlled by the operation of the first player, who moves the airplane 151; in the fourth game example, therefore, the first virtual camera and the second virtual camera move in conjunction with each other.

Also, an image of the game space viewed in the firing direction of the cannon 152 is displayed as the terminal game image. Here, the firing direction of the cannon 152 is controlled so as to correspond to the posture of the terminal device 7. That is, in the present embodiment, the posture of the second virtual camera is controlled such that the line-of-sight direction of the second virtual camera coincides with the firing direction of the cannon 152. In the processing of the above step S27, the CPU 10 controls the orientation of the cannon 152 and the posture of the second virtual camera in accordance with the posture of the terminal device 7 calculated in the above step S24. The posture of the second virtual camera is thus controlled by the operation of the second player. (A sketch of this per-player camera control is given at the end of this passage.)
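The per-player camera control just described, together with the shared-world drawing noted for the third game example, can be sketched as follows. The vector conventions, the fixed camera offset, and names such as drawScene are assumptions made for this sketch, not the actual implementation.

    // Minimal sketch: one game space, updated once per frame, drawn once per
    // display with its own virtual camera (fourth game example's arrangement).
    struct Vec3 { float x, y, z; };
    struct Camera { Vec3 position; Vec3 forward; }; // position and line of sight

    struct World {
        Vec3 airplanePos;   // moved by the first player's controller input (S27)
        Vec3 cannonForward; // firing direction of the cannon 152, from the
                            // terminal device 7's posture (second player, S24/S27)
        // ... other objects such as the targets 153 and cannonballs
    };

    void drawScene(const World&, const Camera&) { /* one drawing pass */ }

    void updateCameras(const World& w, Camera& tvCam, Camera& termCam) {
        // First virtual camera: behind the airplane 151 with the airplane in its
        // field of view, so its position and posture follow the first player.
        tvCam.position = { w.airplanePos.x, w.airplanePos.y + 2.0f, w.airplanePos.z - 8.0f };
        tvCam.forward  = { 0.0f, -0.2f, 1.0f }; // roughly toward the airplane

        // Second virtual camera: at the cannon 152's position, its line of sight
        // coinciding with the cannon's firing direction (the second player's input).
        termCam.position = w.airplanePos;
        termCam.forward  = w.cannonForward;
    }

    void renderFrame(const World& w) {
        Camera tvCam{}, termCam{};
        updateCameras(w, tvCam, termCam);
        drawScene(w, tvCam);   // television game image (steps S5 and S9)
        drawScene(w, termCam); // terminal game image (steps S6 and S10)
        // The game processing itself ran once; only the drawing runs twice,
        // which is the efficiency point made for the third game example.
    }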
In addition, the second player can change the firing direction of the cannon 152 by changing the posture of the terminal device 7.

To fire a cannonball from the cannon 152, the second player presses a predetermined button of the terminal device 7. When the predetermined button is pressed, a cannonball is fired in the direction in which the cannon 152 is facing. In the terminal game image, a sight 154 is displayed at the center of the screen of the LCD 51, and the cannonball is fired in the direction indicated by the sight 154.

As described above, in the fourth game example, the first player operates the airplane 151 (for example, so as to move it toward a desired target 153) while mainly watching the television game image (Fig. 18), which shows the game space viewed in the direction of travel of the airplane 151. On the other hand, the second player operates the cannon 152 while mainly watching the terminal game image (Fig. 19), which shows the game space viewed in the firing direction of the cannon 152. Thus, in the fourth game example, in a game in which two players cooperate with each other, game images that are easy to view and easy to operate with can be displayed on the television 2 and the terminal device 7, one for each player.

Also, in the fourth game example, the positions of the first virtual camera and the second virtual camera are controlled by the operation of the first player, and the posture of the second virtual camera is controlled by the operation of the second player. That is, in the present embodiment, the position or posture of a virtual camera changes in response to each player's own game operations, with the result that the display range of the game space shown on each display device changes. Since the display range shown on a display device changes in response to each player's operations, each player can actually feel that his or her game operations are sufficiently reflected in the progress of the game, and can enjoy the game fully.

In the fourth game example, the game image viewed from behind the airplane 151 is displayed on the television 2, and the game image viewed from the position of the airplane 151's cannon is displayed on the terminal device 7. In another game example, the game device 3 may conversely display the game image viewed from behind the airplane 151 on the terminal device 7 and the game image viewed from the position of the cannon 152 of the airplane 151 on the television 2. In that case the players' roles are swapped relative to the fourth game example above: the first player uses the controller 5 to operate the cannon 152, and the second player uses the terminal device 7 to operate the airplane 151.

(Fifth game example)
The fifth game example will be described below with reference to Fig. 20. The fifth game example is a game in which the player performs operations using the controller 5, and the terminal device 7 is used not as an operation device but as a display device. Specifically, the fifth game example is a golf game: in response to the player's operation of swinging the controller 5 like a golf club (a swing operation), the game device 3 causes a player character in the virtual game space to perform a golf swing.

Fig. 20 shows how the game system 1 is used in the fifth game example. In Fig. 20, an image of the game space including a player character (object) 161 and a golf club (object) 162 is displayed on the screen of the television 2.
A ball (object) 163 placed in the game space is also displayed on the television 2, although in Fig. 20 it is hidden behind the golf club 162 and not visible. On the other hand, as shown in Fig. 20, the terminal device 7 is placed on the floor in front of the television 2 with the screen of the LCD 51 facing vertically upward. On the terminal device 7 are displayed an image of the ball 163, an image showing part of the golf club 162 (specifically, the head 162a of the golf club), and an image showing the ground of the game space. The terminal game image is an image of the surroundings of the ball as viewed from above.

When playing, the player 160 stands near the terminal device 7 and performs the operation of swinging the controller 5 like a golf club. At this time, in the processing of the above step S27, the CPU 10 controls the position and posture of the golf club 162 in the game space in accordance with the posture of the controller 5 calculated in the above step S23. Specifically, when the front end direction of the controller 5 (the Z-axis positive direction shown in Fig. 3) points at the image of the ball 163 displayed on the LCD 51, the golf club 162 in the game space is controlled so as to strike the ball 163.

In addition, when the front end direction of the controller 5 points toward the LCD 51, an image showing part of the golf club 162 (a head image 164) is displayed on the LCD 51 (see Fig. 20). As to the terminal game image, in order to increase the sense of presence, the image of the ball 163 may be displayed at actual size, and the orientation of the head image 164 may be rotated in response to rotation of the controller 5 about its Z axis. The terminal game image may be generated using a virtual camera set in the game space, or using image data prepared in advance. When it is generated using image data prepared in advance, a detailed and realistic image can be obtained with a small processing load, without constructing a detailed terrain model of the golf course.

When the player 160 swings the golf club 162 by performing the swing operation described above and the golf club 162 thereby strikes the ball 163, the ball 163 moves (flies off). That is, the CPU 10 determines in the above step S27 whether the golf club 162 and the ball 163 have come into contact, and moves the ball 163 when they have. Here, the television game image is generated so as to include the ball 163 after it has moved; that is, the CPU 10 controls the position and posture of the first virtual camera used for generating the television game image such that the moving ball is included in its imaging range. On the terminal device 7, on the other hand, when the golf club 162 strikes the ball 163, the image of the ball 163 moves and immediately disappears off the screen. In the fifth game example, therefore, the movement of the ball is shown mainly on the television 2, and the player 160 can confirm from the television game image where the ball sent flying by the swing operation has gone.

As described above, in the fifth game example, the player 160 can swing the golf club 162 (make the player character 161 swing the golf club 162) by swinging the controller 5. Here, since the golf club 162 in the game space is controlled so as to strike the ball 163 when the front end direction of the controller 5 points at the image of the ball 163 displayed on the LCD 51, the player can get the feeling of actually swinging a golf club, which makes the swing operation feel more realistic.

In the fifth game example, furthermore, the head image 164 is displayed on the LCD 51 when the front end direction of the controller 5 points toward the terminal device 7.
Therefore, by pointing the front end of the controller 5 toward the terminal device 7, the player gets the feeling that the posture of the golf club 162 in the game space corresponds to the posture of the controller 5 in the actual space, which makes the swing operation feel still more realistic.

As described above, in the fifth game example, when the terminal device 7 is used as a display device, the operation using the controller 5 can be given a greater sense of presence by placing the terminal device 7 at a suitable position.

In the fifth game example above, the terminal device 7 is placed on the floor and displays an image of only the game space around the ball 163. The terminal device 7 therefore cannot display the position and posture of the whole golf club 162 in the game space, nor can it display the ball 163 moving after the swing operation. In the fifth game example, accordingly, the whole golf club 162 is displayed on the television 2 before the ball 163 moves, and how the ball 163 moves is displayed on the television 2 after it moves. In this way, according to the fifth game example, an operation with a greater sense of presence can be provided to the player, and easy-to-view game images can be presented to the player using the two screens of the television 2 and the terminal device 7.

In the fifth game example above, the marker section 55 of the terminal device 7 is used to calculate the posture of the controller 5. That is, the CPU 10 turns on the marker section 55 (and leaves the marker device 6 unlit) in the initial processing of the above step S1, and in the above step S23 calculates the posture of the controller 5 based on the marker coordinate data 96. It can thereby be determined correctly whether or not the front end direction of the controller 5 is in a posture facing the marker section 55. Although the above steps S21 and S22 need not be executed in the fifth game example, in other game examples the marker to be lit may be changed in mid-game by executing the processing of steps S21 and S22. For example, in step S21 the CPU 10 may determine, based on the first acceleration data 94, whether or not the front end direction of the controller 5 faces the direction of gravity, and in step S22 it may perform control such that the marker section 55 is lit when it does and the marker device 6 is lit when it does not. Then, when the front end direction of the controller 5 faces the direction of gravity, the posture of the controller 5 can be calculated with good precision by acquiring the marker coordinate data of the marker section 55, and when the front end direction of the controller 5 faces the television 2, the posture of the controller 5 can be calculated with good precision by acquiring the marker coordinate data of the marker device 6. (A sketch of this switching logic is given at the end of this passage.)

As explained with the fifth game example, the game system 1 allows the terminal device 7 to be set at a free position and used as a display device. Accordingly, when the marker coordinate data is used as a game input, the controller 5 can be used not only pointed toward the television 2 but also pointed in a free direction, by setting the terminal device 7 at a desired position.
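The switching of steps S21 and S22 can be sketched as follows. The sketch assumes the first acceleration data 94 is a three-axis reading in the controller 5's frame, that the controller's front corresponds to its +Z axis, and that a still controller's reading is dominated by gravity; the sign convention and threshold are likewise assumptions made for illustration.

    // Minimal sketch of the marker switching of steps S21-S22.
    #include <cmath>

    struct Accel { float x, y, z; }; // first acceleration data 94 (controller frame)

    enum class LitMarker { MarkerDevice6, MarkerSection55 };

    // Step S21: does the front end of the controller 5 point toward the direction
    // of gravity (i.e. down at the terminal device 7 lying on the floor)?
    LitMarker chooseMarker(const Accel& a, float downThreshold = 0.7f) {
        float g = std::sqrt(a.x*a.x + a.y*a.y + a.z*a.z);
        if (g < 1e-3f) return LitMarker::MarkerDevice6; // no usable reading; default
        float downward = a.z / g; // assumed +1 when the front end points straight down
        return downward > downThreshold ? LitMarker::MarkerSection55
                                        : LitMarker::MarkerDevice6;
    }

    // Stubs standing in for the control data sent to the two light emitters.
    void setMarkerDevice6Lit(bool) {}   // marker device 6, placed by the television 2
    void setMarkerSection55Lit(bool) {} // marker section 55 of the terminal device 7

    // Step S22: light exactly one of the two emitters, so that the imaging section
    // of the controller 5 cannot confuse the two infrared light sources.
    void applyMarkerChoice(LitMarker m) {
        setMarkerDevice6Lit(m == LitMarker::MarkerDevice6);
        setMarkerSection55Lit(m == LitMarker::MarkerSection55);
    }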
In this way, according to the present embodiment, the direction in which the controller 5 can be used is not restricted, so the degree of freedom in operating the controller 5 can be improved.

[7. Other operation examples of the game system]
The game system 1 described above can perform the operations for playing the various games described above. The terminal device 7 can be used as a portable display or as a second display, and at the same time as a controller for touch input or motion input, so a wide variety of games can be played with the game system 1. The following operations, including uses other than games, are also possible.

(Operation example in which the player plays a game using only the terminal device 7)
In the present embodiment, the terminal device 7 functions as a display device and also as an operation device. Therefore, by using only the terminal device 7 as display means and operation means, without using the television 2 and the controller 5, the terminal device 7 can be used in the manner of a portable game device.

Described concretely along the game processing shown in Fig. 12, the CPU 10 acquires the terminal operation data 97 from the terminal device 7 in step S3, and in step S4 executes the game processing using only the terminal operation data 97 as the game input (without using the controller operation data). A game image is then generated in step S6 and transmitted to the terminal device 7 in step S10. In this case, steps S2, S5, and S9 need not be executed. (A sketch of this reduced loop is given at the end of this passage.) According to the above, game processing is performed in response to operations on the terminal device 7, and a game image showing the result of the game processing is displayed on the terminal device 7. In this way, even though the game processing is actually executed by the game device, the terminal device 7 can be used as if it were a portable game device. Therefore, according to the present embodiment, the user can play a game using the terminal device 7 even when a game image cannot be displayed on the television 2, for example because the television 2 is in use (for instance, someone else is watching a television broadcast).

The CPU 10 is not limited to game images; it may also transmit to the terminal device 7, for display there, an image of the menu screen described above that is displayed after the power is turned on. This is very convenient because the player can play without using the television 2 from the very beginning.

Furthermore, the display device on which the game image is displayed can be changed from the terminal device 7 to the television 2 in mid-game. Specifically, the CPU 10 additionally executes the above step S9 and outputs the game image to the television 2. The image output to the television 2 in step S9 is the same as the game image transmitted to the terminal device 7 in step S10. By switching the input of the television 2 so that the input from the game device 3 is displayed, the same game image as on the terminal device 7 is displayed on the television 2, and the display device showing the game image can thus be changed to the television 2. After the image is displayed on the television 2, the screen display of the terminal device 7 may be turned off.
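The reduced loop just described can be sketched by analogy with the earlier frame-loop sketch; the names here are the same illustrative assumptions, and the point is simply which steps drop out.

    // Minimal sketch of terminal-only operation: steps S2, S5, and S9 are skipped
    // and only the terminal operation data 97 drives the game.
    struct TerminalData { /* touch, acceleration, angular velocity, azimuth data */ };
    struct Image        { /* pixel buffer */ };

    TerminalData receiveTerminalData() { return {}; } // step S3
    void runGameControl(const TerminalData&) {}       // step S4, terminal input only
    Image renderTerminalImage() { return {}; }        // step S6
    void sendCompressedToTerminal(const Image&) {}    // step S10

    void terminalOnlyFrame() {
        TerminalData t = receiveTerminalData(); // S3: the only operation data used
        runGameControl(t);                      // S4: controller operation data unused
        Image img = renderTerminalImage();      // S6: only the terminal game image
        sendCompressedToTerminal(img);          // S10: wireless, compressed
        // S2 (controller input), S5 (television image), and S9 (television output)
        // are not executed, so the terminal device 7 behaves like a portable
        // game device even though the game processing runs on the game device 3.
    }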
The game system 1 may also be configured such that an infrared remote control signal for the television 2 can be output from an infrared emitting means (the marker device 6, the marker section 55, or the infrared communication module 72). With this configuration, the game device 3 can operate the television 2 by means of the terminal device 7, without using the television's remote control, by outputting the remote control signal from the infrared emitting means in response to operations on the terminal device 7. This is very convenient, for example, when switching the input of the television 2 as described above.

(Operation example of communicating with another device via a network)
Since the game device 3 has a function of connecting to a network, the game system 1 can also be used when communicating with an external device via the network. Fig. 21 shows the connection relationship of the devices included in the game system 1 when it is connected to an external device via a network. As shown in Fig. 21, the game device 3 can communicate with an external device 201 via a network 200.

When the external device 201 and the game device 3 can communicate in this way, the game system 1 can communicate with the external device using the terminal device 7 as an interface. For example, the game system 1 can be used as a videophone by exchanging images and sound between the external device 201 and the terminal device 7. Specifically, the game device 3 receives images and sound from the external device 201 via the network 200 (the images and voice of the other party on the call) and transmits the received images and sound to the terminal device 7. The terminal device 7 then displays the images from the external device 201 on the LCD 51 and outputs the sound from the external device 201 from the speaker 67. The game device 3 also receives from the terminal device 7 the camera images captured by the camera 56 and the microphone sound detected by the microphone 69, and transmits the camera images and the microphone sound to the external device 201 via the network 200. By repeating this exchange of images and sound with the external device 201, the game system 1 can be used as a videophone.

In the present embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at a free position and point the camera 56 in a free direction. Also, since the terminal device 7 includes the touch panel 52 in the present embodiment, the game device 3 can transmit the input information entered on the touch panel 52 (the touch position data 100) to the external device 201. For example, by outputting images and sound from the external device 201 on the terminal device 7 and transmitting characters and the like that the user writes on the touch panel 52 to the external device 201, the game system 1 can also be used as a so-called e-learning system.

(Operation example linked with television broadcasting)
The game system 1 can also operate in conjunction with a television broadcast while the broadcast is being watched on the television 2. That is, when a television program is being watched on the television 2, the game system 1 can output information about that program and the like to the terminal device 7. An operation example in which the game system 1 operates in conjunction with television broadcasting is described below.

In this operation example, the game device 3 can communicate with a server via the network (in other words, the external device 201 shown in Fig. 21 is a server). The server stores, for each channel of the television broadcast, various kinds of information related to the broadcast (television information).
The television information may be program-related information such as subtitles or performer information, information from an EPG (electronic program guide), or information broadcast as a data broadcast, and it may consist of images, sound, text, or a combination of these. The server need not be a single server: a server may be provided for each channel or each program of the television broadcast, and the game device 3 may communicate with each such server.

While the video and sound of a television broadcast are being output on the television 2, the game device 3 has the user input, using the terminal device 7, the channel of the broadcast being watched. It then requests the server, via the network, to transmit the television information corresponding to the input channel. In response, the server transmits data of the television information corresponding to that channel. On receiving the data transmitted from the server, the game device 3 outputs the received data to the terminal device 7. The terminal device 7 displays the image and text data among the received data on the LCD 51, and outputs the sound data from the speaker. In this way, the user can use the terminal device 7 to enjoy information related to the television program currently being watched, and the like.

As described above, the game system 1 communicates with an external device (a server) via the network, and can thereby provide the user, through the terminal device 7, with information linked to the television broadcast. In particular, since the terminal device 7 is portable in the present embodiment, the user can use the terminal device 7 at a free position, which is highly convenient.

As described above, in the present embodiment the user can use the terminal device 7 in a variety of uses and forms, in addition to using it for games.

[8. Modifications]
The embodiment described above is one example for carrying out the present invention; in other embodiments, the present invention can also be carried out, for example, with the configurations described below.

(Modification having a plurality of terminal devices)
In the above embodiment, the game system 1 has only one terminal device, but the game system 1 may be configured to have a plurality of terminal devices. That is, the game device 3 may communicate wirelessly with each of a plurality of terminal devices, transmitting game image data, game sound data, and control data to each terminal device and receiving operation data, camera image data, and microphone sound data from each. When the game device 3 communicates wirelessly with each of the plurality of terminal devices, it may do so by time division or by dividing the frequency band.

With a plurality of terminal devices as described above, still more kinds of games can be played using the game system. For example, when the game system 1 has two terminal devices, the game system 1 has three display devices, so game images can be generated for each of three players and displayed on the respective display devices. Also, when the game system 1 has two terminal devices, two players can play simultaneously in a game that uses a controller and a terminal device as one set (for example, the fifth game example above).
Furthermore, when the game processing of the above step S27 is performed based on the marker coordinate data output from two controllers, the two players can each perform game operations with a controller pointed toward a marker (the marker device 6 or the marker section 55). That is, one player performs game operations with a controller pointed toward the marker device 6, and the other player performs game operations with a controller pointed toward the marker section 55.

(Modification concerning the functions of the terminal device)
In the above embodiment, the terminal device 7 functions as a so-called thin-client terminal that does not execute game processing. In other embodiments, however, part of the series of game processing executed by the game device 3 in the above embodiment may be executed by another device such as the terminal device 7. For example, part of the processing (for example, the processing for generating the terminal game image) may be executed by the terminal device 7. Also, for example, in a game system having a plurality of information processing devices (game devices) that can communicate with one another, those information processing devices may share the execution of the game processing.

(Industrial applicability)
As described above, the present invention has as its object, among others, to allow a player to perform novel game operations, and can be applied to, for example, a game system or a terminal device used in a game system.

BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is an external view of the game system 1.
Fig. 2 is a block diagram showing the internal configuration of the game device 3.
Fig. 3 is a perspective view showing the external configuration of the controller 5.
Fig. 4 is a perspective view showing the external configuration of the controller 5.
Fig. 5 is a diagram showing the internal structure of the controller 5.
Fig. 6 is a diagram showing the internal structure of the controller 5.
Fig. 7 is a block diagram showing the configuration of the controller 5.
Fig. 8 is a diagram showing the external configuration of the terminal device 7.
Fig. 9 is a diagram showing how a user holds the terminal device 7.
Fig. 10 is a block diagram showing the internal configuration of the terminal device 7.
Fig. 11 is a diagram showing various data used in the game processing.
Fig. 12 is a main flowchart showing the flow of the game processing executed by the game device 3.
Fig. 13 is a flowchart showing the detailed flow of the game control processing.
Fig. 14 is a diagram showing the screen of the television 2 and the terminal device 7 in the first game example.
Fig. 15 is a diagram showing the screen of the television 2 and the terminal device 7 in the second game example.
Fig. 16 is a diagram showing an example of the television game image displayed on the television 2 in the third game example.
Fig. 17 is a diagram showing an example of the terminal game image displayed on the terminal device 7 in the third game example.
Fig. 18 is a diagram showing an example of the television game image displayed on the television 2 in the fourth game example.
Fig. 19 is a diagram showing an example of the terminal game image displayed on the terminal device 7 in the fourth game example.
Fig. 20 is a diagram showing how the game system 1 is used in the fifth game example.
Fig. 21 is a diagram showing the connection relationship of the devices included in the game system 1 when connected to an external device via a network.
[Main component symbol description]
1 game system; 2 television; 2a speaker; 3 game device; 4 optical disc; 5 controller; 6 marker device; 6L, 6R markers;
7 terminal device; 10 CPU; 11 system LSI; 11a input/output processor (I/O processor); 11b GPU; 11c DSP; 11d VRAM; 11e internal main memory;
12 external main memory; 13 ROM/RTC; 14 disc drive; 15 AV-IC; 17 flash memory; 18 network communication module; 19 controller communication module;
20 extension connector; 21 memory card connector; 22, 23, 29 antennas; 24 power button; 25 reset button; 26 eject button; 27 codec LSI; 28 terminal communication module;
30 substrate; 31 housing; 31a sound holes; 32 operation section; 32a cross button; 32b 1 button; 32c 2 button; 32d A button; 32e minus button; 32f home button; 32g plus button; 32h power button; 32i B button;
33 connector; 34a to 34d LEDs; 35 imaging information calculation section; 35a light incident surface; 36 communication section; 37 acceleration sensor; 38 infrared filter; 39 lens; 40 image pickup element; 41 image processing circuit;
42 microcomputer; 43 memory; 44 wireless module; 45 antenna; 46 vibrator; 47 speaker; 48 gyro sensor;
50 housing; 51 LCD; 52 touch panel; 53 analog stick; 53A left analog stick; 53B right analog stick; 54 operation buttons; 54A cross button (direction input button); 54B to 54H operation buttons; 54I first L button; 54J first R button; 54K second L button; 54L second R button;
55 marker section; 55A, 55B markers; 56 camera; 57 speaker holes; 58 extension connector; 59A, 59B leg parts; 60 microphone hole;
61 touch panel controller; 62 magnetic sensor; 63 acceleration sensor; 64 gyro sensor; 65 user interface controller (UI controller); 66 codec LSI; 67 speaker; 68 sound IC; 69 microphone; 70 wireless module; 71 antenna; 72 infrared communication module; 73 flash memory; 74 power supply IC; 75 battery; 76 charger; 77 CPU; 78 internal memory;
90 controller operation data; 91 received data; 92 controller operation data; 93 first operation button data; 94 first acceleration data; 95 first angular velocity data; 96 marker coordinate data; 97 terminal operation data; 98 camera image data; 99 microphone sound data;
100 touch position data; 101 second acceleration data; 102 second angular velocity data; 103 azimuth data; 104 camera image data; 105 microphone sound data; 106 processing data; 107 control data; 108 controller posture data; 109 terminal posture data; 110 image recognition data; 111 sound recognition data;
121 dart; 122 control surface; 123 target; 131 cannon; 132 cannonball; 133 target; 141 batter (batter object); 142 pitcher (pitcher object); 143 cursor; 151 airplane (airplane object); 152 cannon (cannon object); 153 target (balloon object); 154 sight;
160 player; 161 player character; 162 golf club; 162a club head; 163 ball; 164 club head image; 200 network; 201 external device