TW201220109A - Controller device and information processing device

Controller device and information processing device

Info

Publication number
TW201220109A
Authority
TW
Taiwan
Prior art keywords
game
terminal device
data
outer cover
controller
Prior art date
Application number
TW100126152A
Other languages
Chinese (zh)
Other versions
TWI442963B (en)
Inventor
Ken-Ichirou Ashida
Yositomo Gotou
Takanori Okamura
Junji Takamoto
Masato Ibuki
Shinji Yamamoto
Hitoshi Tsuchiya
Fumiyoshi Suetake
Akiko Suga
Naoya Yamamoto
Daisuke Kumazaki
Original Assignee
Nintendo Co Ltd
Priority claimed from JP2010245299A (JP4798809B1)
Priority claimed from JP2011092612A (JP6103677B2)
Priority claimed from JP2011102834A (JP5837325B2)
Priority claimed from JP2011118488A (JP5936315B2)
Application filed by Nintendo Co Ltd
Publication of TW201220109A
Application granted
Publication of TWI442963B

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

A terminal device 7 is a controller device to be operated by a user. The terminal device 7 includes a generally plate-shaped housing 50, an LCD 51, and a projecting portion (an eaves portion 59). The LCD 51 is provided on the front side of the housing 50. The projecting portion projects from the back side of the housing 50, above the center of the housing, at least at left and right positions. When the user holds the housing 50 on the left and right of the LCD 51, the projecting portion rests on the user's fingers, so the terminal device 7 can be held easily.

Description

VI. Description of the invention

[Technical field to which the invention belongs]

The present invention relates to an operating device that a player can hold and operate.

[Prior art]

Operating devices that the player uses while holding them in the hand are known. For example, the portable game device described in Japanese Patent No. 3703473 is a foldable device with operation keys provided on its lower housing. With that device the user can perform game operations with the keys arranged on both sides of the screen while watching the screen, and can easily operate the game while holding the device.

[Summary of the invention]

(Problem to be solved by the invention)

In recent years portable terminal devices (operating devices) have tended to have larger screens and therefore larger bodies. When the device held in the user's hands becomes large, it may become difficult to hold. It is therefore an object of the present invention to provide an operating device that a user can hold easily.

(Means for solving the problem)

To solve the above problem, the present invention adopts the following configurations (1) to (21).

(1) One example of the present invention is an operating device to be operated by a user, comprising a generally plate-shaped housing, a display section provided on the front side of the housing, and a projecting portion provided on the back side of the housing so as to project at least at left and right positions above the center of the housing. The "operation section" referred to below may be any device the user can operate, such as an analog stick, buttons (keys), a touch panel, or a touch pad of the embodiment described later. "Positions on the left and right sides" means positions to the left and to the right of the center of the housing in the left-right direction; the projecting portion may be provided at the left and right end portions or at positions closer to the center than the ends. With this configuration, the user's fingers can catch on the projecting portion when the housing is held on the left and right of the display section, so the operating device is easy to hold; and because the projecting portion is on the upper side of the housing, the user can hold the device firmly by placing the index, middle, or ring fingers against the lower surface of the projecting portion and supporting the housing with the palms (see FIGS. 10 and 11).

(2) The operating device may further comprise a first operation section and a second operation section provided to the left and right of the display section, above the center of the housing. The user can then operate these sections with the thumbs, for example, while holding the housing on the left and right of the display section, so the device is easy to hold and easy to operate.

(3) Another example of the invention is an operating device comprising a generally plate-shaped housing, a display section provided on the front side of the housing, first and second operation sections provided to the left and right of the display section, and a projecting portion provided on the back side of the housing at a position where any finger other than the thumbs can be hooked on it while the user holds the housing so that both thumbs can operate the first and second operation sections. The device is then easy to hold, because fingers other than the thumbs catch on the projecting portion (see FIGS. 10 and 11), and easy to operate, because the operation sections lie beside the display section.

(4) The projecting portion may be provided in a region including positions on the opposite side of the first and second operation sections. "Opposite-side positions" does not require exact coincidence: it also includes the case where the region of the operation sections on the front side, projected onto the back side, partially overlaps the region of the projecting portion. The user can then hold the terminal device while supporting the projecting portion (the eaves portion 59) with the index, middle, or ring fingers when operating each operation section (see FIGS. 10 and 11), so the device is easy to hold and each operation section is easy to operate.

(5) The operating device may further comprise a third operation section and a fourth operation section provided on the left and right of the upper side surface of the housing. The user can then operate them with, for example, the index fingers while holding the housing on the left and right of the display section, which enables more operations and better operability, and the device is also easier to hold.

(6) The projecting portion may have an eaves-like shape extending in the left-right direction. The fingers supporting the projecting portion can then be laid along its lower surface, making the device easier to hold. Moreover, because the projecting portion extends in the left-right direction, fingers other than the thumbs can be placed against it wherever the device is gripped when it is held with the projecting portion oriented vertically, so the device can be held firmly even in a vertical grip.

(7) A first engagement hole with which an additional device separate from the operating device can engage may be provided in the lower surface of the projecting portion, so that the operating device and the additional device can be connected firmly. When configuration (7) is combined with configuration (6), the first engagement hole can be provided near the center of the operating device in the left-right direction, so the additional device can be connected stably with an even left-right balance.

(8) A second engagement hole with which the additional device can engage may be provided in the lower side surface of the housing. The operating device and the additional device are then connected using the first and second engagement holes at different positions, making the connection even more secure.

(9) The operating device may further comprise convex portions having a convex cross section on the left and right sides of the back surface of the housing, below the projecting portion. The user can hook fingers (for example the ring or little fingers) on the convex portions while holding the housing, so the device can be held more firmly.

(10) The projecting portion and the convex portions may be spaced apart from each other. The user can then support the projecting portion with the middle or ring fingers and hook other fingers on the convex portions without the convex portions getting in the way, so the device is easier to hold.

(11) The operating device may further comprise grip portions provided on the left and right sides of the back surface of the housing. The user can hook fingers (for example the ring or little fingers) on the grip portions while holding the housing, so the device can be held more firmly.

(12) The operating device may further comprise a fifth operation section arranged below the first operation section and a sixth operation section arranged below the second operation section on the front surface of the housing. More varied operations are then possible, and the device can still be held firmly while these sections are operated, so operability remains good.

(13) Another example of the invention is an operating device comprising a generally plate-shaped housing, a display section provided on the front side of the housing, a projecting portion provided on the back side of the housing so as to project at least at left and right positions, and an operation section provided on the upper surface of the projecting portion. The fingers can catch on the projecting portion, so the device is easy to hold, and because the operation section is on the upper surface of the projecting portion it can be operated easily with a finger hooked on the projecting portion, with the user effectively pinching the projecting portion from above and below. This configuration therefore also provides an operating device that is easy to hold and easy to operate.

(14) Another example of the invention is an operating device comprising a generally plate-shaped housing, a display section provided on the front side of the housing, and grip portions of convex cross section extending in the up-down direction on the left and right sides of the back side of the housing. The user can hook fingers (for example the ring or little fingers) on the grip portions while holding the housing, so an easily held operating device is provided.

(15) The operating device of (14) may further comprise a projecting portion provided on the back side of the housing, above the grip portions, so as to project at least at left and right positions. The fingers can then also catch on the projecting portion when the housing is held on the left and right of the display section, so the device can be held even more firmly.

(16) The operating device may further comprise a seventh operation section and an eighth operation section provided on the left and right of the upper side surface of the housing. More varied operations are then possible and, because these sections are on the upper surface, the user can wrap the hands around the housing from the front, upper, and back sides and hold it firmly.

(17) The operating device may further comprise a touch panel provided on the screen of the display section. The user can then operate on the image shown on the display section intuitively and easily; in addition, the projecting portion leaves the device resting at a slight incline when it is put down, which makes the touch panel easier to operate.

(18) The operating device may further comprise an inertial sensor inside the housing. Operations of swinging or moving the device itself then become possible, allowing more intuitive and varied operation. Because this configuration assumes that the operating device is moved, it becomes important that any additional device is connected firmly, so adopting configuration (7) or (8) together with (18) is particularly effective.

(19) The operating device may further comprise a communication section that wirelessly transmits operation data representing operations performed on the device to a game device and receives image data transmitted from the game device, and a display control section that displays the received image data on the display section. The user can then perform game operations with a device that is easy to hold and operate while watching on the display section the images transmitted from the game device.

(20) The operating device may further comprise a game processing section that executes game processing based on operations performed on the device, and a display control section that generates a game image based on the result of the game processing and displays it on the display section. This yields a portable game device that is easy to hold and offers good operability.

(21) The display section may have a screen of 5 inches or more, so that large, easy-to-view, impressive images can be displayed. Using such a large display section inevitably makes the operating device itself large, so configurations (1) to (20), which make the device easy to hold, are particularly effective in this case.

Another example of the present invention may be a tablet-type information processing device comprising the components described above (the housing, the display section, the projecting portion, and so on).

(Effect of the invention)

According to the present invention, the display section is provided on the front side of the housing and the projecting portion is provided on the back side so as to project at least at left and right positions above the center of the housing, which allows the user to hold the operating device easily.

The above and other objects, features, aspects, and effects of the present invention will become more apparent from the following detailed description taken in conjunction with the accompanying drawings.

[Embodiments]

[1. Overall configuration of the game system]

A game system 1 according to one embodiment of the present invention is described below with reference to the drawings. FIG. 1 is an external view of the game system 1. The game system 1 includes a stationary display device typified by a television receiver (hereinafter "television") 2, a stationary game device 3, an optical disc 4, a controller 5, a marker device 6, and a terminal device 7. The game system 1 executes game processing on the game device 3 in response to game operations performed with the controller 5, and displays the game images obtained by that processing on the television 2 and/or the terminal device 7.
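The flow just described, in which operation data comes in from the controller 5, game processing runs on the game device 3, and one image each goes out to the television 2 and the terminal device 7, can be summarized in a brief sketch. Nothing below comes from the patent: the types, function names, stub bodies, and resolutions are illustrative assumptions only.

```c
/* Illustrative sketch of the per-frame data flow of game system 1.
 * All names are hypothetical; the stubs stand in for real hardware blocks. */
#include <stdio.h>

typedef struct { unsigned buttons; float accel[3]; float gyro[3]; } ControllerData;
typedef struct { int width, height; /* pixel buffer omitted */ } Image;

static ControllerData receive_controller_data(void) {        /* controller 5 -> game device 3 */
    ControllerData d = {0};
    return d;
}
static void  update_game_state(const ControllerData *in) { (void)in; }  /* game processing */
static Image render_tv_image(void)       { Image i = {1920, 1080}; return i; }  /* placeholder size */
static Image render_terminal_image(void) { Image i = { 854,  480}; return i; }  /* placeholder size */
static void  output_to_tv(Image img)      { printf("TV frame %dx%d\n", img.width, img.height); }
static void  send_to_terminal(Image img)  { printf("terminal frame %dx%d\n", img.width, img.height); }

/* One iteration corresponds roughly to one 1/60 s frame of game processing. */
static void game_frame(void) {
    ControllerData in = receive_controller_data();
    update_game_state(&in);
    output_to_tv(render_tv_image());            /* shown on the television 2 */
    send_to_terminal(render_terminal_image());  /* compressed and sent to the terminal device 7 */
}

int main(void) { game_frame(); return 0; }
```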
In the game device 3, the optical disc 4, an example of an exchangeable information storage medium, is detachably inserted. The optical disc 4 stores an information processing program (typically a game program) to be executed on the game device 3. An insertion slot for the optical disc 4 is provided on the front face of the game device 3, and the game device 3 executes game processing by reading and running the information processing program stored on the inserted disc.

The television 2 is connected to the game device 3 via a connection cable and displays the game images obtained by the game processing executed on the game device 3. The television 2 has a speaker 2a (FIG. 2), which outputs the game sound obtained as a result of that processing. In other embodiments the game device 3 and the stationary display device may be integrated, and the communication between the game device 3 and the television 2 may be wireless.

The marker device 6 is installed around the screen of the television 2 (above the screen in FIG. 1). As described in detail later, the user (player) can perform game operations by moving the controller 5, and the marker device 6 is used to calculate the movement, position, attitude, and so on of the controller 5. The marker device 6 has two markers 6R and 6L at its two ends. The marker 6R (and likewise the marker 6L) is, specifically, one or more infrared LEDs (light emitting diodes) and outputs infrared light toward the front of the television 2. The marker device 6 is connected to the game device 3, which can control the lighting of each infrared LED of the marker device 6. The marker device 6 is portable and the user can install it at any position; FIG. 1 shows it placed on top of the television 2, but its position and orientation are arbitrary.

The controller 5 supplies the game device 3 with operation data representing the operations performed on it. The controller 5 and the game device 3 communicate wirelessly; in this embodiment, for example, Bluetooth (registered trademark) technology is used, although in other embodiments they may be connected by wire. In this embodiment the game system 1 includes one controller 5, but the game device 3 can communicate with a plurality of controllers, so a plurality of users can play by using a predetermined number of controllers simultaneously. The detailed configuration of the controller 5 is described later.

The terminal device 7 is small enough to be held, so the user can hold and move it or place it at any position. The terminal device 7, whose detailed configuration is described later, includes an LCD (liquid crystal display) 51 as display means and input means (a touch panel 52, a gyro sensor, and other components described later). The terminal device 7 and the game device 3 communicate wirelessly (or by wire). The terminal device 7 receives from the game device 3 data of images generated by the game device 3 (for example game images) and displays the images on the LCD 51. Although an LCD is used as the display device in this embodiment, the terminal device 7 may have any other display device, for example one using EL (electroluminescence). The terminal device 7 also transmits to the game device 3 operation data representing the operations performed on it.

[2. Internal configuration of the game device 3]

The internal configuration of the game device 3 is described with reference to FIG. 2, a block diagram of the game device 3. The game device 3 has a CPU (central processing unit) 10, a system LSI 11, an external main memory 12, a ROM/RTC 13, a disc drive 14, an AV-IC 15, and so on.

The CPU 10 executes game processing by running the game program stored on the optical disc 4 and functions as a game processor. The CPU 10 is connected to the system LSI 11, to which the external main memory 12, the ROM/RTC 13, the disc drive 14, and the AV-IC 15 are also connected. The system LSI 11 controls data transfer between the components connected to it, generates the images to be displayed, and obtains data from external devices; its internal configuration is described below. The volatile external main memory 12 stores programs such as the game program read from the optical disc 4 or from the flash memory 17, stores various data, and is used as a work area and buffer for the CPU 10. The ROM/RTC 13 includes a ROM holding the start-up program of the game device 3 (a so-called boot ROM) and a clock circuit (RTC: real-time clock) for counting time. The disc drive 14 reads program data, texture data, and the like from the optical disc 4 and writes the read data into the internal main memory 11e described later or the external main memory 12.

The system LSI 11 contains an input/output processor (I/O processor) 11a, a GPU (graphics processor unit) 11b, a DSP (digital signal processor) 11c, a VRAM (video RAM) 11d, and an internal main memory 11e, which are interconnected by an internal bus (not shown). The GPU 11b forms part of the drawing means and generates images in accordance with graphics commands from the CPU 10. The VRAM 11d stores the data (polygon data, texture data, and the like) the GPU 11b needs to execute the graphics commands; the GPU 11b creates image data using the data stored in the VRAM 11d. In this embodiment the game device 3 generates both game images displayed on the television 2 and game images displayed on the terminal device 7; the former are referred to as "television game images" and the latter as "terminal game images". The DSP 11c functions as an audio processor and generates sound data using sound data and sound waveform (tone) data stored in the internal main memory 11e or the external main memory 12. Game sound, like the game images, is generated both for output from the speaker of the television 2 and for output from the speaker of the terminal device 7; the former is referred to as "television game sound" and the latter as "terminal game sound".

Of the images and sound generated in the game device 3, the data to be output on the television 2 is read by the AV-IC 15, which outputs the image data to the television 2 via an AV connector 16 and the sound data to the speaker 2a built into the television 2. The image is thereby displayed on the television 2 and the sound is output from the speaker 2a. The data of the images and sound to be output on the terminal device 7 is transmitted to the terminal device 7 by the I/O processor 11a and related components, as described below.

The I/O processor 11a exchanges data with the components connected to it and downloads data from external devices. It is connected to the flash memory 17, a network communication module 18, a controller communication module 19, an expansion connector 20, a memory card connector 21, and a codec LSI 27. An antenna 22 is connected to the network communication module 18 and an antenna 23 to the controller communication module 19; the codec LSI 27 is connected to a terminal communication module 28, which has an antenna 29.

The game device 3 can connect to a network such as the Internet and communicate with external information processing devices (for example other game devices or various servers): the I/O processor 11a can connect to the network via the network communication module 18 and the antenna 22. The I/O processor 11a periodically accesses the flash memory 17 to check whether there is data that needs to be transmitted to the network and, if so, transmits it via the network communication module 18 and the antenna 22. It also receives data transmitted from external information processing devices or downloaded from a download server via the network, the antenna 22, and the network communication module 18, and stores the received data in the flash memory 17. The CPU 10 reads the data stored in the flash memory 17 and uses it in the game program. Besides data exchanged with external devices, the flash memory 17 may store save data of games played on the game device 3 (result data or intermediate data), and it may also store game programs.

The game device 3 can receive operation data from the controller 5: the I/O processor 11a receives the operation data transmitted from the controller 5 via the antenna 23 and the controller communication module 19 and stores it (temporarily) in a buffer area of the internal main memory 11e or the external main memory 12.

The game device 3 can also exchange data such as images and sound with the terminal device 7. When transmitting a game image (terminal game image) to the terminal device 7, the I/O processor 11a passes the image data generated by the GPU 11b to the codec LSI 27, which performs predetermined compression processing on it; the terminal communication module 28 then transmits the compressed image data to the terminal device 7 via the antenna 29. Because the image data transmitted from the game device 3 to the terminal device 7 is used in the game, a delay in the displayed image adversely affects game operability, so the transmission is preferably performed with as little delay as possible. In this embodiment the codec LSI 27 therefore compresses the image data with a highly efficient compression technique such as the H.264 standard; other compression techniques may be used, and the image data may be transmitted uncompressed if the communication speed is sufficient. The terminal communication module 28 is, for example, a Wi-Fi certified communication module and may perform high-speed wireless communication with the terminal device 7 using, for example, the MIMO (multiple input multiple output) technology adopted in the IEEE 802.11n standard, or it may use another communication scheme.

In addition to image data, the game device 3 transmits sound data to the terminal device 7. The I/O processor 11a passes the sound data generated by the DSP 11c to the terminal communication module 28 via the codec LSI 27, which compresses the sound data as well. Any compression scheme may be used for the sound data, preferably one with a high compression rate and little sound degradation; in other embodiments the sound data may be transmitted uncompressed. The terminal communication module 28 transmits the compressed image data and sound data to the terminal device 7 via the antenna 29.

Besides image and sound data, the game device 3 transmits various control data to the terminal device 7 as necessary. The control data represents instructions to components of the terminal device 7, for example an instruction to control the lighting of its marker section (the marker section 55 described later) or an instruction to control its camera. The I/O processor 11a transmits the control data to the terminal device 7 in accordance with instructions from the CPU 10. In this embodiment the codec LSI 27 does not compress the control data, although it may in other embodiments. The data transmitted from the game device 3 to the terminal device 7 may or may not be encoded as necessary.

The game device 3 can also receive various data from the terminal device 7. As described later, in this embodiment the terminal device 7 transmits operation data, image data, and sound data, which are received by the terminal communication module 28 via the antenna 29. The image data and sound data from the terminal device 7 are compressed in the same way as the image and sound data sent from the game device 3 to the terminal device 7, so they are passed from the terminal communication module 28 to the codec LSI 27, decompressed, and output to the I/O processor 11a. The operation data from the terminal device 7 is small compared with images or sound, so it need not be compressed, and it may or may not be encoded as necessary; it is received by the terminal communication module 28 and output to the I/O processor 11a via the codec LSI 27. The I/O processor 11a stores (temporarily) the data received from the terminal device 7 in a buffer area of the internal main memory 11e or the external main memory 12.
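As a rough illustration of the terminal link described above, the sketch below shows one game-device-side step that compresses a terminal game image, sends it, and then collects any operation data that has arrived from the terminal device 7. The function names and stub bodies are hypothetical stand-ins; the patent only states that an H.264-class codec and an IEEE 802.11n-class wireless link may be used.

```c
/* Minimal sketch of the game-device side of the terminal link.
 * encode_frame(), wifi_send() and wifi_poll() are hypothetical stand-ins for the
 * codec LSI 27 and the terminal communication module 28, stubbed so the sketch runs. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

typedef struct { const uint8_t *pixels; size_t size; } RawFrame;
typedef struct { uint8_t data[1024]; size_t size; } Packet;

/* Stub "encoder": a real system would use an H.264-class hardware encoder. */
static size_t encode_frame(const RawFrame *in, uint8_t *out, size_t cap) {
    size_t n = in->size < cap ? in->size : cap;   /* pretend the frame compresses to n bytes */
    memcpy(out, in->pixels, n);
    return n;
}
static void wifi_send(const uint8_t *data, size_t size) {   /* 802.11n-class link (stub) */
    (void)data;
    printf("sent %zu compressed bytes to terminal device 7\n", size);
}
static int wifi_poll(Packet *out) {                          /* terminal operation data, if any */
    out->size = 0;
    return 0;                                                /* nothing pending in this stub */
}

static void terminal_link_step(const RawFrame *terminal_game_image) {
    static uint8_t compressed[512];
    /* Downstream: compress the terminal game image to keep display latency low, then send. */
    size_t n = encode_frame(terminal_game_image, compressed, sizeof compressed);
    wifi_send(compressed, n);

    /* Upstream: operation data from the terminal is small and may arrive uncompressed. */
    Packet p;
    while (wifi_poll(&p)) { /* hand each packet to the I/O processor's buffer */ }
}

int main(void) {
    uint8_t pixels[300] = {0};
    RawFrame f = { pixels, sizeof pixels };
    terminal_link_step(&f);
    return 0;
}
```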
The game device 3 can also be connected to other devices and to external storage media. The I/O processor 11a is connected to the expansion connector 20 and the memory card connector 21. The expansion connector 20 is a connector for an interface such as USB or SCSI; by connecting a medium such as an external storage medium, a peripheral device such as another controller, or a wired communication connector to it, communication with the network can be performed in place of the network communication module 18. The memory card connector 21 is a connector for external storage media such as memory cards. The I/O processor 11a can access external storage media via the expansion connector 20 or the memory card connector 21 to save data to them or read data from them.

The game device 3 has a power button 24, a reset button 25, and an eject button 26. The power button 24 and the reset button 25 are connected to the system LSI 11. When the power button 24 is turned on, power is supplied to the components of the game device 3 from an external power source through an AC adapter (not shown). When the reset button 25 is pressed, the system LSI 11 restarts the start-up program of the game device 3. The eject button 26 is connected to the disc drive 14; when it is pressed, the optical disc 4 is ejected from the disc drive 14.

In other embodiments, some of the components of the game device 3 may be provided as an expansion device separate from the game device 3. In that case the expansion device may be connected to the game device 3 via, for example, the expansion connector 20. Specifically, the expansion device may include the codec LSI 27, the terminal communication module 28, and the antenna 29 and be attachable to and detachable from the expansion connector 20; by connecting such an expansion device to a game device that does not include these components, that game device can be made capable of communicating with the terminal device 7.

[3. Configuration of the controller 5]

The controller 5 is described with reference to FIGS. 3 to 7. FIG. 3 is a perspective view of the external configuration of the controller 5 seen from the upper rear side, and FIG. 4 is a perspective view seen from the lower front side. The controller 5 has a housing 31 formed, for example, by plastic molding. The housing 31 has a roughly rectangular parallelepiped shape whose longitudinal direction is the front-rear direction (the Z-axis direction in FIG. 3), and its overall size is such that it can be held in one hand by an adult or a child. The user can perform game operations by pressing the buttons provided on the controller 5 and by moving the controller 5 itself to change its position and attitude (tilt).

The housing 31 has a plurality of operation buttons. As shown in FIG. 3, a cross button 32a, a 1 button 32b, a 2 button 32c, an A button 32d, a minus button 32e, a home button 32f, a plus button 32g, and a power button 32h are provided on the top surface of the housing 31; in this specification this top surface is called the "button surface". As shown in FIG. 4, a recess is formed in the bottom surface of the housing 31, and a B button 32i is provided on the rear slope of the recess. Functions are assigned to the operation buttons 32a to 32i as appropriate to the information processing program executed by the game device 3. The power button 32h is for remotely turning the power of the game device 3 body on and off. The home button 32f and the power button 32h are recessed into the top surface to prevent the user from pressing them inadvertently.

A connector 33 is provided on the rear surface of the housing 31 for connecting other devices (for example another sensor unit or another controller) to the controller 5. Engagement holes 33a are provided on both sides of the connector 33 on the rear surface to prevent such devices from coming off easily.

A plurality of LEDs 34a to 34d (four in FIG. 3) are provided in the rear part of the top surface of the housing 31. A controller type (number) is assigned to the controller 5 to distinguish it from other controllers, and the LEDs 34a to 34d are used to notify the user of the controller type currently set for the controller 5 or of its remaining battery level; specifically, when game operations are performed with the controller 5, one of the LEDs 34a to 34d is lit according to the controller type.

The controller 5 has an imaging information calculation section 35 (FIG. 6), and as shown in FIG. 4 a light incident surface 35a of the imaging information calculation section 35 is provided on the front surface of the housing 31. The light incident surface 35a is made of a material that transmits at least the infrared light from the markers 6R and 6L. Sound holes 31a for emitting the sound from a speaker 47 (FIG. 5) built into the controller 5 are formed in the top surface of the housing 31 between the 1 button 32b and the home button 32f.

The internal structure of the controller 5 is shown in FIGS. 5 and 6: FIG. 5 is a perspective view with the upper casing (part of the housing 31) removed, and FIG. 6 is a perspective view with the lower casing removed, showing the reverse side of the substrate 30 shown in FIG. 5. In FIG. 5, the substrate 30 is fixed inside the housing 31, and the operation buttons 32a to 32h, the LEDs 34a to 34d, an acceleration sensor 37, an antenna 45, the speaker 47, and other components are provided on its top main surface. These are connected to a microcomputer 42 (see FIG. 6) by wiring (not shown) formed on the substrate 30 and elsewhere. In this embodiment the acceleration sensor 37 is located off the center of the controller 5 in the X-axis direction, which makes it easier to calculate the movement of the controller 5 when it is rotated about the Z axis, and it is located forward of the center of the controller 5 in the longitudinal (Z-axis) direction. A wireless module 44 (FIG. 6) and the antenna 45 give the controller 5 its function as a wireless controller.

In FIG. 6, the imaging information calculation section 35 is provided at the front edge of the bottom main surface of the substrate 30. It comprises, in order from the front of the controller 5, an infrared filter 38, a lens 39, an image sensor 40, and an image processing circuit 41, each attached to the bottom main surface of the substrate 30. The microcomputer 42 and a vibrator 46 are also provided on the bottom main surface. The vibrator 46 is, for example, a vibration motor or a solenoid, and is connected to the microcomputer 42 by wiring formed on the substrate 30 and elsewhere. The vibrator 46 is actuated on instructions from the microcomputer 42 to vibrate the controller 5; the vibration is transmitted to the hand of the user holding the controller 5, realizing a so-called vibration-enabled game. In this embodiment the vibrator 46 is placed slightly toward the front of the housing 31; placing it toward the end rather than the center of the controller 5 allows it to vibrate the whole controller 5 more strongly. The connector 33 is attached to the rear edge of the bottom main surface of the substrate 30. In addition to what is shown in FIGS. 5 and 6, the controller 5 includes a crystal oscillator that generates the basic clock of the microcomputer 42, an amplifier that outputs sound signals to the speaker 47, and so on.

The shape of the controller 5, the shapes of the operation buttons, and the numbers and positions of the acceleration sensors and vibrators shown in FIGS. 3 to 6 are merely examples; other shapes, numbers, and positions may be used. In this embodiment the imaging direction of the imaging means is the Z-axis positive direction, but it may be any direction; the imaging information calculation section 35 (its light incident surface 35a) need not be on the front surface of the housing 31 and may be provided on another surface as long as light can enter it from outside the housing.

FIG. 7 is a block diagram showing the configuration of the controller 5. The controller 5 includes an operation section 32 (the operation buttons 32a to 32i), the imaging information calculation section 35, a communication section 36, the acceleration sensor 37, and a gyro sensor 48. The controller 5 transmits to the game device 3, as operation data, data representing the operations performed on it. Below, the operation data transmitted by the controller 5 is called "controller operation data" and the operation data transmitted by the terminal device 7 is called "terminal operation data".

The operation section 32 includes the operation buttons 32a to 32i and outputs to the microcomputer 42 of the communication section 36 operation button data indicating the input state of each operation button (whether each of the buttons 32a to 32i is pressed).

The imaging information calculation section 35 is a system for analyzing the image data captured by the imaging means, identifying regions of high brightness in it, and calculating the center-of-gravity position and size of those regions. Because it samples the captured image at a high rate, it can track even relatively fast movement of the controller 5. The section includes the infrared filter 38, the lens 39, the image sensor 40, and the image processing circuit 41. The infrared filter 38 passes only infrared light out of the light entering from the front of the controller 5. The lens 39 condenses the infrared light that has passed through the filter onto the image sensor 40. The image sensor 40 is a solid-state image sensor such as a CMOS or CCD sensor; it receives the condensed infrared light and outputs an image signal. The marker section 55 of the terminal device 7 and the marker device 6, which are the imaging targets, are composed of markers that output infrared light, so providing the infrared filter 38 allows the image sensor 40 to receive only the infrared light that has passed through the filter and thus to capture the imaging targets (the marker section 55 and/or the marker device 6) more accurately. An image captured by the image sensor 40 is called a captured image below. The image data generated by the image sensor 40 is processed by the image processing circuit 41, which calculates the positions of the imaging targets in the captured image and outputs the coordinates of those positions to the microcomputer 42 of the communication section 36; the coordinate data is transmitted by the microcomputer 42 to the game device 3 as operation data. These coordinates are called "marker coordinates"; because they change according to the orientation (tilt angle) and position of the controller 5 itself, the game device 3 can use them to calculate the orientation and position of the controller 5. In other embodiments the controller 5 may be configured without the image processing circuit 41 and transmit the captured image itself to the game device 3; the game device 3 may then have a circuit or program with the same function as the image processing circuit 41 and calculate the marker coordinates.
For example, in the case of a multi-axis acceleration sensor having two or more axes, the component of acceleration along each axis is detected as the acceleration applied to the detection portion of the acceleration sensor. The acceleration sensor 37 is, for example, a capacitive MEMS (Micro Electro Mechanical Systems) acceleration sensor, but an acceleration sensor of another type may also be used.

In the present embodiment, the acceleration sensor 37 detects linear acceleration in three axial directions defined with respect to the controller 5: the up-down direction (the Y-axis direction shown in Fig. 3), the left-right direction (the X-axis direction shown in Fig. 3), and the front-rear direction (the Z-axis direction shown in Fig. 3). Since the acceleration sensor 37 detects acceleration in the straight-line direction along each axis, its output represents the value of the linear acceleration for each of the three axes. That is, the detected acceleration is expressed as a three-dimensional vector in the XYZ coordinate system (controller coordinate system) defined with respect to the controller 5.

Data representing the acceleration detected by the acceleration sensor 37 (acceleration data) is output to the communication unit 36. Since the acceleration detected by the acceleration sensor 37 changes in accordance with the orientation (tilt angle) and the movement of the controller 5 itself, the game device 3 can use the acquired acceleration data to calculate the orientation and the movement of the controller 5. In the present embodiment, the game device 3 calculates the attitude, the tilt angle, and the like of the controller 5 based on the acquired acceleration data.

A person skilled in the art will readily understand from the description of this specification that a computer such as the processor of the game device 3 (for example, the CPU 10) or the processor of the controller 5 (for example, the microcomputer 42) can perform processing based on the acceleration signals output from the acceleration sensor 37 (the same applies to the acceleration sensor 63 described later) to estimate or calculate (determine) further information about the controller 5. For example, when processing is performed on the computer side on the premise that the controller 5 equipped with the acceleration sensor 37 is stationary (that is, when processing is performed assuming that the acceleration detected by the acceleration sensor consists only of gravitational acceleration), then, as long as the controller 5 is in fact substantially stationary, it is possible to determine from the detected acceleration whether, and to what degree, the attitude of the controller 5 is tilted with respect to the direction of gravity. Specifically, with the state in which the detection axis of the acceleration sensor 37 points vertically downward as a reference, whether the controller 5 is tilted with respect to the reference can be determined from whether 1 G (gravitational acceleration) is applied, and the magnitude of the detected acceleration indicates how far it is tilted with respect to the reference. Furthermore, in the case of the multi-axis acceleration sensor 37, the degree to which the controller 5 is tilted with respect to the direction of gravity can be determined in more detail by processing the acceleration signals of the respective axes. In this case, the processor may calculate the tilt angle of the controller 5 based on the output of the acceleration sensor 37, or may calculate the tilt direction of the controller 5 without calculating the tilt angle.
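As a concrete illustration of the stationary-state estimation just described, the following is a minimal C++ sketch. The patent does not specify any particular algorithm, so the type and function names, the 1 G tolerance, and the axis conventions are assumptions made purely for explanation: a sample is accepted as a gravity measurement only when its magnitude is close to 1 G, and a tilt is derived from it.

```cpp
// Illustrative sketch only: names, thresholds, and axis conventions are assumptions,
// not details taken from the patent.
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };                  // acceleration in controller coordinates, in units of G

struct Tilt { float pitchRad; float rollRad; };  // tilt relative to the direction of gravity

// Estimate the tilt of the controller from a single accelerometer sample, on the
// premise that the controller is (nearly) stationary so that the detected
// acceleration consists almost entirely of gravitational acceleration.
std::optional<Tilt> EstimateTiltFromAcceleration(const Vec3& a)
{
    const float magnitude = std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z);
    // If the magnitude deviates strongly from 1 G, the controller is probably being
    // moved, and the stationary-state premise does not hold for this sample.
    if (magnitude < 0.8f || magnitude > 1.2f) {
        return std::nullopt;
    }
    Tilt t;
    t.pitchRad = std::atan2(a.z, std::sqrt(a.x * a.x + a.y * a.y));  // rotation about the X axis (assumed convention)
    t.rollRad  = std::atan2(a.x, a.y);                               // rotation about the Z axis (assumed convention)
    return t;
}
```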
Thus, by using the acceleration sensor 37 in combination with a processor, the tilt angle or the attitude of the controller 5 can be determined.

On the other hand, on the premise that the controller 5 is in a moving state, the acceleration sensor 37 detects the acceleration corresponding to the movement of the controller 5 in addition to the gravitational acceleration, so the direction of movement of the controller 5 can be determined by removing the gravitational acceleration component from the detected acceleration through predetermined processing. Moreover, even on the premise that the controller 5 is in a moving state, the tilt of the controller 5 with respect to the direction of gravity can be determined by removing the component of acceleration corresponding to the movement of the acceleration sensor from the detected acceleration through predetermined processing. In other embodiments, the acceleration sensor 37 may include a built-in processing device, or another type of dedicated processing device, for performing predetermined processing on the acceleration signal detected by the built-in acceleration detection means before the signal is output to the microcomputer 42. The built-in or dedicated processing device may, for example, convert the acceleration signal into a tilt angle (or another preferable parameter) when the acceleration sensor 37 is used to detect static acceleration (for example, gravitational acceleration).

The gyro sensor 48 detects angular velocities about three axes (the XYZ axes in the present embodiment). In this specification, with the imaging direction of the controller 5 (the Z-axis positive direction) as a reference, the direction of rotation about the X axis is referred to as the pitch direction, the direction of rotation about the Y axis as the yaw direction, and the direction of rotation about the Z axis as the roll direction. The gyro sensor 48 may be of any type as long as it can detect the angular velocities about the three axes, and any number and combination of gyro sensors may be used. For example, the gyro sensor 48 may be a three-axis gyro sensor, or a combination of a two-axis gyro sensor and a one-axis gyro sensor for detecting the angular velocities about the three axes. Data representing the angular velocities detected by the gyro sensor 48 is output to the communication unit 36. The gyro sensor 48 may instead detect angular velocities about only one axis or two axes.

The communication unit 36 includes the microcomputer 42, the memory 43, the wireless module 44, and the antenna 45. The microcomputer 42 controls the wireless module 44 so as to wirelessly transmit the data acquired by the microcomputer 42 to the game device 3, using the memory 43 as a storage area during processing. The data output from the operation unit 32, the imaging information computing unit 35, the acceleration sensor 37, and the gyro sensor 48 to the microcomputer 42 are temporarily stored in the memory 43. These data are transmitted to the game device 3 as operation data (controller operation data). That is, when the transmission timing for the controller communication module 19 of the game device 3 arrives, the microcomputer 42 outputs the operation data stored in the memory 43 to the wireless module 44.
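To make the buffering step above concrete, here is a hedged sketch of how the communication unit 36 might hold the most recent operation data and hand it to the radio when the transmission timing arrives. The packet layout, field widths, and names are assumptions for illustration only; the patent does not define them.

```cpp
#include <cstdint>

// Hypothetical layout of one packet of controller operation data; field names and
// widths are illustrative assumptions, not taken from the patent.
struct ControllerOperationData {
    uint16_t keyBits;            // pressed/released state of the operation keys 32a to 32i
    int16_t  accel[3];           // linear acceleration along X, Y, Z (acceleration sensor 37)
    int16_t  angularVel[3];      // angular velocity about X, Y, Z (gyro sensor 48)
    uint16_t markerCoord[2][2];  // marker coordinates from the imaging information computing unit 35
};

class CommunicationUnit {
public:
    // Called whenever the operation unit or a sensor produces new data; the memory 43
    // is modeled here as a single buffered packet.
    void Store(const ControllerOperationData& data) { latest_ = data; }

    // Called when the transmission timing of the controller communication module 19
    // arrives (for example, once every 1/200 second).
    void OnTransmissionTiming() { SendToWirelessModule(latest_); }

private:
    // Stub standing in for the wireless module 44; a real implementation would
    // modulate and broadcast the packet as described in the surrounding text.
    void SendToWirelessModule(const ControllerOperationData& /*data*/) {}

    ControllerOperationData latest_{};
};
```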
The wireless module 44 modulates a carrier wave of a predetermined frequency with the operation data, for example using Bluetooth (registered trademark) technology, and broadcasts the resulting weak radio signal from the antenna 45. That is, the operation data is modulated by the wireless module 44 into a weak radio signal and transmitted from the controller 5. The weak radio signal is received by the controller communication module 19 on the game device 3 side. By demodulating or decoding the received weak radio signal, the game device 3 can acquire the operation data. The CPU 10 of the game device 3 then performs game processing using the operation data acquired from the controller 5. Wireless transmission from the communication unit 36 to the controller communication module 19 is performed successively at predetermined intervals; since game processing is generally performed in units of 1/60 second (one frame time), transmission is preferably performed at a cycle equal to or shorter than this, and the communication unit 36 of the controller 5 outputs the operation data to the controller communication module 19 of the game device 3 at a rate of, for example, once every 1/200 second.

As described above, the controller 5 can transmit marker coordinate data, acceleration data, angular velocity data, and operation key data as operation data representing the operations performed on it. The game device 3 executes game processing using this operation data as game input. Therefore, by using the controller 5, the user can perform game operations of moving the controller 5 itself, in addition to the conventional game operation of pressing the operation keys. For example, an operation of tilting the controller 5 to an arbitrary attitude, an operation of pointing at an arbitrary position on the screen with the controller 5, and an operation of moving the controller 5 itself can be performed.

In the present embodiment, the controller 5 does not have display means for displaying a game image, but it may have display means for displaying, for example, an image indicating the remaining battery level.

[4. Configuration of the terminal device 7]

Next, the configuration of the terminal device 7 will be described with reference to Figs. 8 to 13. Fig. 8 is a plan view showing the external configuration of the terminal device 7; in Fig. 8, (a) is a front view of the terminal device 7, (b) is a top view, (c) is a right side view, and (d) is a bottom view. Fig. 9 is a rear view of the terminal device 7. Figs. 10 and 11 are views showing the user holding the terminal device 7 sideways, and Figs. 12 and 13 are views showing the user holding the terminal device 7 vertically.

As shown in Fig. 8, the terminal device 7 includes an outer cover 50 having a generally plate-like, horizontally long rectangular shape. In other words, the terminal device 7 can also be said to be a tablet-type information processing device. The outer cover 50 may have curved surfaces or partial projections as long as it is plate-shaped as a whole. Since the outer cover 50 is of a size that the user can hold, the user can hold the terminal device 7 and move it, or change its placement. The length of the terminal device 7 in the longitudinal direction (z-axis direction) is preferably 100 to 150 [mm], and is 133.5 [mm] in the present embodiment.
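Because the operation data described above arrives at roughly 1/200-second intervals while game processing runs once per 1/60-second frame, several packets can accumulate per frame. The sketch below assumes, as one possible policy not stated in the patent, that the game side simply drains everything received since the previous frame; the queue and function names are illustrative.

```cpp
// Hedged sketch of consuming the ~200 Hz operation-data stream from a 60 Hz game loop;
// the queue, names, and draining policy are illustrative assumptions.
#include <deque>

struct OperationData {
    int placeholder;  // key bits, acceleration, angular velocity, marker coordinates, ...
};

std::deque<OperationData> g_receivedOperationData;  // filled by the controller communication module 19

void ProcessGameFrame()  // executed once per 1/60-second frame on the game device 3 side
{
    // At a 1/200-second transmission rate, roughly three packets arrive per frame;
    // draining all of them means no input sample is silently skipped.
    while (!g_receivedOperationData.empty()) {
        OperationData data = g_receivedOperationData.front();
        g_receivedOperationData.pop_front();
        (void)data;  // ... use as game input: attitude calculation, pointing, key handling ...
    }
}
```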
The length of the terminal device 7 in the lateral direction (x-axis direction) is preferably 200 to 250 [mm], and is 228.26 [mm] in the present embodiment. The thickness of the terminal device 7 (the length in the y-axis direction) is preferably about 15 to 30 [mm] for the plate-shaped portion and about 30 to 50 [mm] including the thickest portion; in the present embodiment it is 23.6 [mm] (40.26 [mm] at the thickest portion). The weight of the terminal device 7 is approximately 400 to 600 [g], and is 530 [g] in the present embodiment. As will be described in detail later, even though the terminal device 7 is a relatively large terminal device (operating device) as described above, it is configured so that the user can hold and operate it easily.

The terminal device 7 has the LCD 51 on the surface (front side) of the outer cover 50. The screen size of the LCD 51 is preferably 5 inches or more, and is 6.2 inches here. Because the terminal device 7 of the present embodiment is configured to be easy to hold and easy to operate, it remains easy to operate even with a relatively large LCD. In other embodiments, a smaller LCD 51 may be provided so that the operating device 7 itself is relatively small. The LCD 51 is provided near the center of the surface of the outer cover 50. Therefore, as shown in Figs. 10 and 11, by holding the portions of the outer cover 50 on both sides of the LCD 51, the user can hold and move the terminal device 7 while viewing the screen of the LCD 51. Figs. 10 and 11 show an example in which the user holds the portions of the outer cover 50 on the left and right sides of the LCD 51 and holds the terminal device 7 sideways (in a landscape orientation), but the terminal device 7 may also be held vertically (in a portrait orientation), as shown in Figs. 12 and 13.

如第8圖的(a)圖所示,終端裝置7於1CD 51的晝面 具有觸控面板52作為操作手段。本實施形態中,觸控=板 52為電阻膜方式的觸控面板。惟觸控面板並不限於電阻膜 方式,例如可使用例如靜電電容方式等任意方式的觸控面 板。此外’觸控面板52可為單點觸控方式或多點觸控方 式。本實施形態中,觸控面板52係應用與LCD 51的解析 度為相同解析度(_精度)者。惟觸控面板52料析度並 不須與LCD 51㈣析度一致。對觸控面板52之輸入,通 常用觸控筆60來進行’但並不限於馳筆⑼,亦能夠以 使用者的手指對觸控面板52進行輸人。外罩Μ上,可設 置有用以收納用來對觸控面板52進行操作之觸控筆之 收納孔6〇a(參照第8圖⑽。在此,收納孔_係以使觸 控筆60不會落下之方式設置在外罩5()的上面,但亦可設 置在侧面或下面。如此,由於終端裝置7具有觸控面板犯, 所以使用者可-邊移動終端裝置7 —邊操作觸控面板犯。 亦即,使用者可—邊移動LQ)51的畫面,—邊對該畫面直 接(藉由觸控面板52)進行輸入。 如第8圖所不’終端裝置7具備2個類比搖桿53A及 5兆、以及複數個操作鍵(按鍵)54A至_作為操作手段(操 323330 32 201220109 作部)。各類比搖桿53A及53B為可指示方向之裝置。 比搖杯53A及53B,係構成為可使由使用者的手指所操 之可動構件(搖桿部)相對於外罩5〇的表面往任意方向^上 下左右及斜向的任意角度)滑動。亦即,是亦被稱為滑 之方向輸入裝置。各類比搖桿53A及53B❸可動構件,尹 要為可相對於外罩5〇的表面往任意方向傾倒之種類者^ 可:本實施形態中,係使用可動構件可滑動之種類的類此 搖才干因此’即使使用者不大幅移動拇指,亦可操作 比搖桿53A及53B,並可在緊緊握持外罩5〇之狀態下進_ 操作。當❹可動構件為傾狀_者料各舰搖桿^ M3B時,對使用者而言,更容易了解輸人的程度(傾斜 程度),而更容易進行詳細操作。 此外,左類比搖桿53A及右類比搖桿53β分別設置在 LCD 51晝面的左側及右側。因此,使用者可藉由左右任一 手使用類比搖桿來進行指示方向之輸人。此外,如第1〇圖 及第Π圖所示’各類比搖桿53A及53B設置在使用者可於 握持終端|置7的左右部分(LCD 51左右兩側的部分)之狀 態,下進行操作之位置上,因此,即使使用者握持終端裝置 7來移動時,亦可容易操作各類比搖桿53A及53B。、 口各刼作鍵54A至54L為用以進行預定輸入之操作手段 (操作')’為可壓下之按鍵。如以下所示,各操作鍵… 至^4L係設置在使用者可於握持終端裝置7的左右部分之 狀態下進行操作之位置上(參照第10圖及第11圖)。因此, 即使使用者握持終端裝置7來移動時,亦可容易操作此等 323330 33 201220109 操作手段。As shown in Fig. 8(a), the terminal device 7 has a touch panel 52 as an operation means on the side of the 1CD 51. In the present embodiment, the touch panel 52 is a resistive touch panel. However, the touch panel is not limited to the resistive film type, and for example, a touch panel of any type such as an electrostatic capacitance method can be used. In addition, the touch panel 52 can be a single touch method or a multi-touch method. In the present embodiment, the touch panel 52 is applied to the same resolution (_precision) as the resolution of the LCD 51. However, the resolution of the touch panel 52 does not have to be consistent with the resolution of the LCD 51 (four). The input to the touch panel 52 is performed by the stylus pen 60. However, it is not limited to the pen (9), and the touch panel 52 can be input by the user's finger. The housing cover can be provided with a receiving hole 6〇a for accommodating the stylus for operating the touch panel 52 (refer to FIG. 8(10). Here, the receiving hole _ is such that the stylus 60 does not The falling manner is disposed on the upper surface of the outer cover 5 (), but may be disposed on the side or the lower side. Thus, since the terminal device 7 has a touch panel, the user can operate the touch panel while moving the terminal device 7 That is, the user can move the screen of the LQ 51 while directly inputting the screen (by the touch panel 52). As shown in Fig. 8, the terminal device 7 is provided with two analog joysticks 53A and 5 megabytes, and a plurality of operation keys (keys) 54A to _ as operation means (operation 323330 32 201220109). The various types of rocker bars 53A and 53B are devices that indicate the direction. The swing cups 53A and 53B are configured to be slidable by the movable member (rocker portion) operated by the user's finger in any direction up and down, and at any angle in the oblique direction with respect to the surface of the outer cover 5A. That is, it is also referred to as a sliding direction input device. For all kinds of movable members of the rocker 53A and 53B, Yin is a type that can be tilted in any direction with respect to the surface of the outer cover 5 可. In this embodiment, the type of the movable member can be slidable. 
Therefore, even if the user does not move the thumb a lot, the joysticks 53A and 53B can be operated, and the operation can be performed while holding the cover 5 紧 tightly. When the movable member is tilted, the rocker is M3B, it is easier for the user to understand the degree of input (degree of inclination), and it is easier to perform detailed operations. Further, the left analog stick 53A and the right analog stick 53β are respectively disposed on the left and right sides of the face of the LCD 51. Therefore, the user can use the analog stick to control the direction of the person by either left or right. In addition, as shown in the first and second figures, the various types of rocker levers 53A and 53B are disposed in a state in which the user can hold the left and right portions of the terminal 7 (the left and right sides of the LCD 51). At the position where the operation is performed, even if the user holds the terminal device 7 to move, it is easy to operate the various types of the joysticks 53A and 53B. The port keys 54A to 54L are operation means (operation ')' for pressing the predetermined input. As will be described below, each of the operation keys ... to 4L is provided at a position where the user can operate while holding the left and right portions of the terminal device 7 (see Figs. 10 and 11). Therefore, even when the user holds the terminal device 7 to move, it is easy to operate the 323330 33 201220109 operation means.

11 圖)。 如第8圖的(a)圖所示,於外罩 十字鍵54A係设置在LCD 51的左側且為左類比搖桿 53A的下側。亦即’十字鍵54A配置在使用者的左手所能 夠操作之位置上。十字鍵54A具有十字形, 上下左右的方向之鍵。 為至少可指示 。此等3個 此外,鍵54B至54D設置在LCD51的下側 鍵54B至54D係配置在左右兩手所能夠操作之位置上。此 外,終端裝置7具有用以導通/關閉終端裝置7的電源之電 源鍵54M。藉由電源鍵54M的操作,亦能夠以遠距方式, 導通/關閉遊戲裝置3的電源。電源鍵54M,與鍵5牝至54D • 相同,設置在LCD51的下侧。電源鍵54肘設置在鍵54B至 54D的右側。因此,電源鍵54M係配置在右手所能夠操作(容 易操作)之位置上。此外,4個鍵54E至54H係設置在LCD 51 的右侧且為右類比搖桿53B的下側。亦即,4個鍵54£至 54H係配置在使用者的右手所能夠操作之位置上。再者,4 個鍵54E至54H係以(相對於4個鍵54E至54H的中心位置) 成為上下左右的位置關係之方式來配置。因此,終端裝置 7可使4個鍵54E至54H具有用以令使用者指示上下左右 的方向之鍵的功能。 323330 34 201220109 本實施形態中,各類比搖桿53A及53B係配置在較十 字鍵54A及各個鍵54E至54H為上側。在此,各類比搖桿 53A及53B在厚度方向(y軸方向)上較十字鍵54A及各個鍵 54E至54H為突出。因此,若將類比搖桿53A及十字鍵54A 的位置相反地配置,使用者以姆指操作十字鍵54A時,可 能有拇指碰到類比搖桿53A而產生錯誤操作之疑慮。將類 比搖桿53B及各個鍵54E至54H的位置相反地配置時,亦11 figure). As shown in Fig. 8(a), the outer cover cross key 54A is provided on the left side of the LCD 51 and is the lower side of the left analog rocker 53A. That is, the 'cross key 54A' is disposed at a position where the user's left hand can operate. The cross key 54A has a cross shape, a key in the up, down, left, and right directions. To at least indicate. In addition, the keys 54B to 54D are disposed on the lower side of the LCD 51. The keys 54B to 54D are disposed at positions where the left and right hands can be operated. Further, the terminal device 7 has a power source key 54M for turning on/off the power of the terminal device 7. The power of the game device 3 can also be turned on/off in a remote manner by the operation of the power button 54M. The power key 54M, like the keys 5A to 54D, is disposed on the lower side of the LCD 51. The power button 54 is disposed on the right side of the keys 54B to 54D. Therefore, the power key 54M is disposed at a position where the right hand can be operated (easy operation). Further, four keys 54E to 54H are provided on the right side of the LCD 51 and are the lower side of the right analog rocker 53B. That is, the four keys 54 £ to 54H are disposed at positions where the user's right hand can operate. Further, the four keys 54E to 54H are arranged such that they are in a positional relationship of up, down, left, and right (with respect to the center positions of the four keys 54E to 54H). Therefore, the terminal device 7 can have four functions of the keys 54E to 54H for the user to instruct the keys in the up, down, left, and right directions. 323330 34 201220109 In the present embodiment, the various types of ratio rockers 53A and 53B are disposed on the upper side of the ten-key 54A and the respective keys 54E to 54H. Here, the various types of ratio rockers 53A and 53B are protruded in the thickness direction (y-axis direction) from the cross key 54A and the respective keys 54E to 54H. Therefore, when the position of the analog rocker 53A and the cross key 54A is reversed, when the user operates the cross key 54A with the thumb, there is a possibility that the thumb hits the analog rocker 53A and an erroneous operation is caused. When the analog rocker 53B and the respective keys 54E to 54H are arranged oppositely,

會產生同樣問題。相對於此,本實施形態中,由於將各類Will produce the same problem. On the other hand, in this embodiment,

比搖桿53A及53B配置在較十字鍵54A及各個鍵54E至54H 為上侧’所以在使用者操作類比搖桿53A及53B時,手指 碰到十字鍵54A及各個鍵54E至54H之可能性較上述情況 為低。如此,本實施形態中,可降低錯誤操作的可能性, 而月b夠&升終端裝置7的操作性。惟在其他實施形態中, 可因應必要將類比搖桿53A及十字鍵54A相反地配置,或 將類比搖桿53B及各個鍵54E至54H相反地配置。 在此,本實施形態中,某些操作部(各類比搖桿53a及 53B、十字鍵54A、及3個鍵5犯至54G),在顯示部⑽ 的左右兩侧,設置在較外罩5〇中之上下方向(y轴方向)的 部時,使用者主要握持在較 Τ之上下方向的中心為上側。在此,當使 握持=罩5G的下側時’(尤其在終端裝置7如本實施形離 ,有相對較大的大小時),所握持之終端裝置7變得不;' 疋,使用者不易握持終端裝t 7。相對於此,本實施 中’ A作上述操作料’使用者主要握持在較終端 323330 35 201220109 7中之上下方向的中心為上侧’此外,可藉由手掌從橫向 來支撐外罩50。因此,使用者可在安定的狀態下握持外罩 50而容易握持終端裝置7 ’故上述操作部變得更容易操 作。其他實施形態中,在較外罩50的中央為上側,可在顯 示部的左右方分別設置至少1個操作部。例如,可僅將各 類比搖桿53A及53B設置在較外罩50的中央為上側。此 外,例如當十字鍵54A設置在較左類比搖桿53A為上側, 且4個鍵54E至54H設置在較右類比搖桿53B為上側時, 十字鍵54A及4個鍵54E至54H ’可設置在較外罩5〇的中 央為上侧。The ratio of the rocker levers 53A and 53B is set to the upper side of the cross key 54A and the respective keys 54E to 54H. Therefore, when the user operates the analog rockers 53A and 53B, the possibility that the finger touches the cross key 54A and the respective keys 54E to 54H It is lower than the above situation. As described above, in the present embodiment, the possibility of erroneous operation can be reduced, and the monthly b is sufficient to improve the operability of the terminal device 7. However, in other embodiments, the analog rocker 53A and the cross key 54A may be arranged oppositely, or the analog rocker 53B and the respective keys 54E to 54H may be arranged oppositely. Here, in the present embodiment, some of the operation portions (various types of the joysticks 53a and 53B, the cross key 54A, and the three keys 5 are broken to 54G) are disposed on the left and right sides of the display unit (10) in the outer cover 5 In the upper and lower directions (y-axis direction) of the middle, the user mainly holds the center in the upper and lower directions of the upper side as the upper side. Here, when the grip = the lower side of the cover 5G is made (especially when the terminal device 7 is separated from the present embodiment, there is a relatively large size), the held terminal device 7 becomes no; '疋, It is not easy for the user to hold the terminal to install t7. On the other hand, in the present embodiment, the user of the 'A operating material' is mainly held at the center in the upper and lower directions of the terminal 323330 35 201220109 7, and the outer cover 50 can be supported from the lateral direction by the palm. Therefore, the user can grip the cover 50 in a stable state and easily hold the terminal device 7', so that the above-described operation portion becomes easier to operate. In other embodiments, at least one of the operation portions is provided on the left and right sides of the display portion, respectively, on the upper side of the center of the outer cover 50. For example, only the analog rockers 53A and 53B may be disposed on the upper side of the center of the outer cover 50. Further, for example, when the cross key 54A is disposed on the upper side of the left analog stick 53A, and the four keys 54E to 54H are disposed on the upper side of the right analog stick 53B, the cross key 54A and the four keys 54E to 54H' can be set. The center of the outer cover 5 is the upper side.

此外,本實施形態中’在外罩5〇的背侧(與設置有LCD 51之表面相反的一侧)設置有突起部(簷部59)(;參照第8圖 的(c)圖及第9圖)。如第8圖的(c)圖所示,簷部59為從 大致呈板狀之外罩50的背面突起地設置之山狀構件。突起 部,係具有可讓握持外罩50的背面之使用者的手指鉤住之 尚度(厚度)。突起部的高度較佳為1〇至25[職],本實施 形態中,為16.66[mm]。此外,突起部的下面,較佳係以 使突起部容易讓使用者的手指鉤住之方式,相對於外罩5〇 的背面具有45。以上(尤佳為,以上)的傾斜。如第 所示’突起部的下面’可形成為傾斜角度較上面更大。如 第1〇圖及第11圖料,使用者將手指鉤住 =乘載於手指上)來簡,藉此,即使終端裝置? $ 的大小,亦不會疲勞而能夠在穩定的狀態下握持 終端裝置7。亦即1部59可作為以手指支撐外罩50之 323330 36 201220109 支撐構件’ itU卜,亦可作為手指鉤住部。 此外簷。卩59,在外罩50的上下方向上被設置在較 中央,上側層部59,係設置在與外單50的表面上所設 置之操作部i各类員比搖桿53A & 53b)的大致相反側之位 置^即大起部係設置在包含分別設置在顯示部的左右 方之操作P的相反側的位置之區域。因此,當操作上述操 作P時使用者可藉由中指或無名指來支樓詹部59之方式 •握持?端Λ置7(參照第10圖及* 11圖)。藉此,終端裝 置7炱得谷易握持,上述操作部亦變得容易操作。此外, 實 y二中犬起部具有(凸起部分)往左右延伸之詹狀 的化狀’所以使用者可使中指或無名指沿著突起部的下面 來握持終端I置7,而更容易握持終端裝置7。詹部59只 要形成為γ凸起部分)往左右延伸即可 ,並不限於第9圖所 不之彺水平方向延伸之形狀。其他實施形態中,簷部59可 在從水平方㈣微傾斜之方向上延伸 。例如,筹部5 9可設 鲁置為匕著從左右兩端朝中央而往上(或下)傾斜。 本實施形態中,係以在料59設置後述卡止孔者為理 由’而採㈣成為純的形狀之料59作為形成於外罩的 突起部’但突起部可為任意形狀。例如,其他實施 4、中’可構成為在外罩5〇的背側,2個突起部設置在左 右兩侧(左右方向的中央未設置突起部)(參照第扣圖)。此 外,其他實施形態中,突起部的剖面形狀(垂直於χ軸方向 之剖面上的形狀)’亦可以能夠讓使用者的手指 : 樓終端裝置7之方式(手指更緊緊地鉤住突起部之方式), 323330 37 201220109 形成為鉤狀(下面凹入之形狀)。 突起部(簷部59)之上下方向的寬度,可為任意寬度。 例如’突起部可形成至外罩50的上邊為止。亦即,突起部 的上面可形成於與外罩50上側的側面相同之位置。此時, 外罩50為下側較薄且上側較厚之2段構成。如此’外罩 50較佳係在背面的左右兩側,形成有朝下方之面(突起部 的下面)。藉此,使用者可將手指抵住該面而輕鬆地握持終 鲁端裝置7。上述「朝下方之面」可形成於外罩50之背面的 任意位置,但較佳係位於較外罩5〇的中央為上側。 此外’如第8圖的(a)圖、(b)圖及(c)圖所示,第1L 鍵541及第1R鍵54J分別設置在外罩5〇上側的面之左右 兩側。本實施形態中,第沈鍵541及第以鍵54J ’係設 置在外罩50的斜上方部分(左上方部分及右上方部分)。具 體而言,第1L鍵541設置在板狀外罩50之上侧的側面左 端,並從左上侧的側面露出(換言之,從上側及左侧兩者的 春 側面露出)。此外,第1 r鍵54J設置在板狀外罩之上侧 的側面右端,並從右上侧的側面露出(換言之,從上侧及右 側兩者的侧面露出)。如此,第1L鍵541配置在使用者的 左手食指所能夠操作之位置,第1R鍵54J配置在使用者的 右手食指所能夠操作之位置(參照第10圖)。其他實施形態 中,分別设置在外罩5 〇上侧的面的左右方之操作部’不需 設置在左右的端部,亦可設置在端部以外的位置。此外, 亦可分別將操作部設置在外罩50的左右側面。 此外’如第8圖的(c)圖及第9圖所示,第2L鍵54K 323330 38 201220109 及第2R鍵54L係配置在上述突起部(簷部59)。第2L鍵54K 設置在簷部59的左端附近。第2R鍵54L設置在簷部59的 右端附近。亦即,第2L鍵54K設置在外罩50之背面左侧(從 表面側觀看時之左侧)的稍微上方處,第2R鍵54L設置在 外罩50之背面右側(從表面侧觀看時之右側)的稱微上方 處。換言之,第2L鍵54K係設置在表面上所設置之左類比 搖桿53A的(大致)相反側的位置,第2R鍵54L係設置在表 面上所設置之右類比搖桿53B的(大致)相反側的位置。如 • 此,第2L鍵54K配置在使用者的左手中指或食指所能夠操 作之位置上,第2R鍵54L配置在使用者的右手中指或食指 所能夠操作之位置上(參照第10圖及第11圖)。此外,如 第8圖的(c)圖所示,第2L鍵54K及第2R鍵54L係設置在 上述簷部59的上面。因此,第2L鍵54K及第2R鍵54L具 有朝上方(斜向上方)之按鍵面。由於使用者握持終端裝置 7時中指或食指推測為朝上下方向動作,因此,藉由使按 φ 鍵面朝上方,使用者可容易壓下第2L鍵54K及第2R鍵54L。 如上所述,本實施形態中’在較外罩5〇的中央為上侧 上,於顯示部(LCD 51)的左右方分別設置有操作部(類比搖 桿53A及53B),此外,在外罩50的背側上,於該操作部 之相反側的位置上分別設置有其他操作部(第2L·鍵54K及 第2R鍵54L)。根據此,上述操作部與其他操作部配置在 外罩50的表侧與背侧之互相對向的位置上,當操作此等操 作部時,使用者可從表側與背侧夾持外罩5〇而握持。此 外,操作此等操作部時之使用者,係握持外罩50之上下方 323330 39 201220109 向的中心更上侧處’故可在上側握持終端裝置7,並且以 手4支撐終端裝置7(參照第1〇圖及第n圖)。藉由上述 内谷,使用者在可操作至少4個操作部之狀態下,穩定地 摄持終端裝置7’而能夠提供一種使用者容易握持且操作 性佳之操作裝置(終端裝置7)。 如上所述,本實施形態中,藉由在將手指抵住突起部 (簷部59)的下面之狀態下握持終端裝置7,使用者可輕鬆 % 地握持終端裝置7。此外,由於在突起部的上面設置有第 2L鍵54K及第2R鍵54L,所以使用者可在上述狀態下容易 知作此等鍵。使用者例如能夠以下列握持方式容易地握持 終端裝置7。 亦即,如第10圖所示,使用者亦可將無名指抵住簷部 59的下面(第1〇圖所示之單點鏈線),(以無名指支撐簷部 59之方式)握持終端裝置7。此時,使用者能夠以食指或中 指操作4個鍵(第1L鍵541、第ir鍵54J、第2L鍵54IC、 ® 及第2R鍵54L)。例如,當所要求之遊戲操作中,所使用 的鍵較多且較複雜時,藉由如第1〇圖所示地握持,可容易 地操作多數鍵。此外,由於各類比搖桿53A及53B設置在 十予鍵54A及鍵54E至54H的上侧’故在要求相對複雜的 操作時,使用者可藉由姆指來操作類比搖桿53A及53B, 而能夠方便地進行。此外,第10圖中,使用者將拇指抵住 外罩50的表面,將食指抵住外罩5〇的上面,將中指抵住 外罩50的背面之簷部59的上面,將無名指抵住簷部59的 下面,將小指抵住外罩50的背面來握持終端裝置7。如此, 323330 40 201220109 使用者可從四方包圍外罩5〇而緊緊地握持終端裝置?。 此外如第11圖所示,使用者亦可將中指抵住詹部 59的下面(第11圖所不之單點鏈線)來握持終端裝置了。此 時使用者月b夠以食指容易地操作2個組第&鍵祕及 第2R鍵541^)例如’當所要求之遊戲操作中,所使用的 鍵較少且較單純時,可如第u圖所示地握持。第u圖中, 由於使用者可藉由2根手指(無名指及小指)握持外罩5〇的 • 下侧,因此可緊緊地握持終端裝置7。 本實施形射,I部59的下面,係設置在位於各類比 搖桿53A及53B、與十字鍵54A及4個鍵5仙至54H之間(位 於較各類比搖桿53A及53B為下方,且較十字鍵5M及4 個鍵54E至54H為上方)。因此,當將無名指抵住筹部59 來握持終端裝置7(第1〇圖)時,容易以拇指來操作各類比 搖桿53A及53B,將中指抵住簷部59來握持終端裂置7(第 11圖)時’容易以梅指來操作十字鍵54A及4個鍵54e至 _ 54H。亦即,不論是上述2種情形的哪—種情形,使用者均 可在緊緊地握持終端裝置7之狀態下進行方向輸入操作。 此外,如上所述,使用者亦可縱向地握持終端裝置7。 
亦即’如第12圖所示’使用者以左手握持終端裝置\的上 邊,藉此可縱向地握持終端裝£7。此外,如帛㈣所示, 使用者以左手握持終端裝置7的下邊,藉此可縱向地握持 終端裝置7。第12圖及第13圖中,係顯示以左手握持终 端裝置7的情形,但亦能夠以右手握持終㈣置了。如此, 由於使用者能夠以單手握持終端裝置7,因此,例如亦可 323330 201220109 一邊以一方的手握持終端裝置7 —邊以另一方的手對觸控 面板52進行輸入來操作。 此外,以第12圖所示之握持方式來握持終端裝置7 時,使用者以拇指以外的手指(第12圖中為中指、無名指 及小指)抵住簷部59的下面(第12圖所示之單點鏈線),藉 此可緊緊地握持終端裝置7。尤其在本實施形態中,由於 簷部59往左右(第12圖中為上下)延伸而形成,因此,不 論使用者握持終端裝置7之上邊的哪個位置,均能夠將拇 指以外的手指抵住簷部59,而能夠緊緊地握持終端裝置 7。亦即,當使用者縱向地握持終端裝置7來使用時,簷部 59可用作為把手。另一方面,以第13圖所示之握持方式 來握持終端裝置7時,使用者能夠以左手對鍵54B至54D 進行操作。因此,例如能夠一邊以單手對觸控面板52進行 輸入,一邊以握持終端裝置7之手對鍵54B至541)進行操 作,而可進行更多操作。 關於本實施形態之終端裝置7,由於突起部(簷部59) 設置在背面,當以使LCD 51的晝面(外罩50的表面)朝上 之狀態來載置終端裝置7時,晝面會呈稍微傾斜之狀態。 藉此,可在載置終端裝置7之狀態下更容易觀看畫面。此 外,在載置終端裝置7之狀態下容易對觸控面板52進行輸 入操作。此外,其他實施形態中,亦可在外罩50的背面形 成具有與上述簷部59同等程度的高度之追加的突起部。根 據此,在使LCD 51的畫面朝上之狀態下使各突起部接觸於 地面,可載置終端裝置7使晝面呈水平。此外,亦可使追 42 323330 201220109 加的突起部能夠裝卸(或可折疊)。根據此,可在晝面呈稍 微傾斜之狀態與晝面呈水平之狀態兩者時,載置終端裝 置。亦即,當放置終端裝置7來使用時,簷部59可用作為 腳部。 各操作鍵54A至54L,係適當地分配有因應遊戲程式 之功能。例如,十字鍵54A及鍵54E至54H可用在方向指 示操作或選擇操作等,各鍵54B至54E可用在決定操作或 取消操作等。此外,終端裝置7亦可具有用以導通/關閉 ® LCD 51的晝面顯示之鍵,或是用以進行與遊戲裝置3的連 接設定(配對)之鍵。 如第8圖的(a)圖所示,終端裝置7,係在外罩50的 表面具備有由標示器55A及標示器55B所構成之標示部 55。標示部55設置在LCD 51的上側。各標示器55A及標 示器55B,與標示裝置6的各標示器6R及6L相同,是由1 個以上的紅外線LED所構成。構成標示器55A及55B之紅 φ 外線LED,係配置在可讓紅外線穿透之窗部的内側。標示 部55,與上述標示裝置6相同,係用以讓遊戲裝置3算出 控制器5的動作等所用。此外,遊戲裝置3可控制標示部 55所具備之各個紅外線LED的點燈。 終端裝置7,係具備作為攝像手段之攝影機56。攝影 機56係包含具有預定解析度之攝像元件(例如CMOS影像感 測器或CCD影像感測器等)及透鏡。如第8圖所示,本實施 形態中,攝影機56設置在外罩50的表面。因此,攝影機 56可將握持著終端裝置7之使用者的臉予以攝像,例如可 43 323330 201220109 將邊觀看LCD 51 —邊進行遊戲時之使用者予以攝像。在 本實施形態中,攝影機56係配置在2個標示器55A及55B 之間。 終端裝置7,係具備作為聲音輸入手段之麥克風69。 於外罩50的表面設置有麥克風用孔5〇c。麥克風69設置 在該麥克風用孔50c内之外罩50的内部。麥克風69係偵 測出使用者的聲音等、以及終端裝置7周圍的聲音。 ^ s ^竣置7,係具備作為聲音輸出手段之制σ八77。如 第8圖的(d)圖所示,於外罩50表面的下側設置有喇叭孔 57。喇叭77的輸出聲音從該喇叭孔57輸出。本實施形態 中、’、端骏置7具有2個喇叭,於左喇叭與右喇叭的各位 置有喇叭孔57。終端裝置7具備用以調整喇叭77的 二之旋叙64。此外終端褒f 7具備用以連接耳機等的 聲曰出邻之聲音輸出端子62。在此,考量到在外罩的下 側側面連接附加裝置之情形,上述聲音輪出端子62及旋紐 籲64係設置在外軍50的上側侧面,但亦可設置在左右的侧 面或下側的側面。 此外,外單50上,設置有用以讓來自紅外線通訊模組 82的紅外線訊號射出至終端裂置7的外部之窗63。在此, 窗63係以當握持1XD 51的兩側時使紅外線訊號射往使用 者的前方之方式,設置在外罩5Q的上_卜惟其他實施 形態中’窗63例如可設置在外罩5〇的背面等之任奮位置。 此外’終端裝置7係具備用以將其他裝置與終端裳置 7連接之航連接器58。擴充連接器邱為_在與終端裝 323330 44 201220109 置7上所連接之其他裝置之間進行資料(資訊)的接收傳送 之通訊端子。本實施形態中,如第8圖的(d)圖所示,擴充 連接器58係設置在外罩50的下側側面。連接於擴充連接 器58之其他裝置,可為任意裝置’例如為特定遊戲令所使 用之控制器(槍型控制器等)或是鍵盤等之輸入裝置。若無 連接附加裝置之必要性,則亦可不設置擴充連接器58。擴 充連接器58中,可包含將電力供給至附加裝置之端子,或 I 是用於充電之端子。 此外,終端裝置7,在擴充連接器58之外,另具有用 以從附加裝置取得電力之充電端子66。當將充電端子66 連接於後述支架(stand)210時,電力係從支架21〇供給至 終端裝置7。本實施形態中,充電端子66係設置在外罩5〇 的下側侧面。因此’當連接終端裝置7與附加裝置(例如第 15圖所示之輸入裝置2〇〇或第17圖所示之輸入裝置22〇)Further, in the present embodiment, 'the projections (the crotch portion 59) are provided on the back side of the outer cover 5A (the side opposite to the surface on which the LCD 51 is provided) (see (c) and ninth of Fig. 8; Figure). As shown in Fig. 8(c), the crotch portion 59 is a mountain-shaped member which is provided to protrude from the back surface of the substantially plate-shaped outer cover 50. The projection has a degree (thickness) that allows the user's finger holding the back surface of the cover 50 to be hooked. The height of the protrusion is preferably from 1 〇 to 25 [,], and in the present embodiment, it is 16.66 [mm]. Further, it is preferable that the lower surface of the projection portion has 45 with respect to the back surface of the outer cover 5A so that the projection portion is easily caught by the user's finger. The above (more preferably, above) tilt. The 'underside of the protrusion' as shown in the figure can be formed to have a larger inclination angle than the upper surface. 
For example, in Figure 1 and Figure 11, the user hooks the finger = ride on the finger to simplify it, even if the terminal device? The size of $ is also not fatigued and the terminal device 7 can be held in a stable state. That is, the one portion 59 can be used as a support member ‘itUb for supporting the outer cover 50 with fingers, and can also serve as a finger hooking portion. Also 檐. The cymbal 59 is disposed at the center in the up-and-down direction of the outer cover 50, and the upper layer portion 59 is provided on the surface of the outer unit 50, and is provided in the approximate portion of the operating portion i of the rocker 53A & 53b). The position on the opposite side, that is, the large portion is provided in a region including the position on the opposite side to the operation P of the left and right sides of the display portion. Therefore, when operating the above operation P, the user can use the middle finger or the ring finger to support the way of the Ministry of Finance 59. • Hold? End device 7 (refer to Figure 10 and *11). Thereby, the terminal device 7 is easy to hold, and the above operation portion is also easy to operate. In addition, the real y second middle dog has a chevron shape that extends (the convex portion) to the left and right. Therefore, the user can hold the middle finger or the ring finger along the lower surface of the protrusion to hold the terminal I, which is easier. The terminal device 7 is held. It is only necessary for the Zhan part 59 to be formed as a gamma convex portion to extend left and right, and is not limited to the shape extending in the horizontal direction as shown in Fig. 9. In other embodiments, the crotch portion 59 can extend in a direction that is slightly inclined from the horizontal side (four). For example, the planning unit 5 can be set to tilt upwards (or down) from the left and right ends toward the center. In the present embodiment, the material 59 which is a pure shape is taken as the rule of the material 59, and the material 59 which is a pure shape is taken as the protrusion portion of the outer cover, but the protrusion portion may have any shape. For example, in the other embodiments 4 and 5, the two protrusions may be provided on the back side of the outer cover 5, and the two protrusions may be provided on the left and right sides (the protrusions are not provided in the center in the left-right direction) (refer to the figure). Further, in other embodiments, the cross-sectional shape of the protruding portion (the shape perpendicular to the cross-sectional direction in the x-axis direction)' may also allow the user's finger: the manner of the floor terminal device 7 (the finger is more tightly hooked to the protruding portion) The way), 323330 37 201220109 is formed into a hook shape (the shape of the concave shape below). The width of the protrusion (the crotch portion 59) in the upper and lower directions may be any width. For example, the protrusion may be formed to the upper side of the outer cover 50. That is, the upper surface of the protruding portion can be formed at the same position as the side surface on the upper side of the outer cover 50. At this time, the outer cover 50 is configured in two stages in which the lower side is thin and the upper side is thick. Thus, the outer cover 50 is preferably formed on the left and right sides of the back surface, and has a downward facing surface (the lower surface of the projection). 
Thereby, the user can easily hold the final end device 7 by holding the finger against the face. The above-mentioned "face facing downward" may be formed at any position on the back surface of the outer cover 50, but is preferably located on the upper side of the center of the outer cover 5''. Further, as shown in Fig. 8 (a), (b), and (c), the first L key 541 and the first R key 54J are provided on the left and right sides of the upper surface of the outer cover 5 分别. In the present embodiment, the sinker key 541 and the first key 54J' are provided on the obliquely upper portion (the upper left portion and the upper right portion) of the outer cover 50. Specifically, the first L-key 541 is provided on the left side of the upper side of the upper surface of the plate-shaped outer cover 50, and is exposed from the side surface on the upper left side (in other words, exposed from the spring side of both the upper side and the left side). Further, the first r-key 54J is provided on the right side end of the upper side of the plate-shaped outer cover, and is exposed from the side surface on the upper right side (in other words, exposed from the side faces of both the upper side and the right side). In this manner, the first L key 541 is placed at a position where the user's left index finger can be operated, and the first R key 54J is placed at a position where the user's right index finger can be operated (see Fig. 10). In the other embodiments, the left and right operation portions ′ provided on the upper surface of the outer cover 5 不 are not required to be provided at the left and right end portions, and may be provided at positions other than the end portions. Further, the operation portions may be provided on the left and right side surfaces of the outer cover 50, respectively. Further, as shown in Fig. 8 (c) and Fig. 9, the second L key 54K 323330 38 201220109 and the second R key 54L are disposed on the protruding portion (the crotch portion 59). The second L key 54K is provided near the left end of the crotch portion 59. The 2nd R key 54L is provided near the right end of the crotch portion 59. That is, the 2nd L key 54K is provided slightly above the left side of the back cover 50 (the left side when viewed from the front side), and the 2nd R key 54L is provided on the right side of the back surface of the cover 50 (the right side when viewed from the front side) It is called above the micro. In other words, the 2nd L key 54K is disposed at a position on the (substantially) opposite side of the left analog rocker 53A provided on the surface, and the 2nd R key 54L is disposed (substantially) opposite to the right analog rocker 53B provided on the surface. Side position. For example, the second L key 54K is disposed at a position where the user's left middle finger or index finger can be operated, and the second R key 54L is disposed at a position where the user's right middle finger or index finger can operate (refer to FIG. 10 and 11 figure). Further, as shown in Fig. 8(c), the second L key 54K and the second R key 54L are provided on the upper surface of the crotch portion 59. Therefore, the second L key 54K and the second R key 54L have a key surface that faces upward (inclined upward). Since the middle finger or the index finger is estimated to move in the up and down direction when the user holds the terminal device 7, the user can easily press the second L key 54K and the second R key 54L by pressing the φ key face upward. 
As described above, in the present embodiment, the operation unit (the analog rockers 53A and 53B) is provided on the upper side of the display unit (LCD 51) on the upper side of the outer cover 5A, and the outer cover 50 is provided on the outer cover 50. On the back side, other operation portions (the second L-key 54K and the second R-key 54L) are provided at positions on the opposite side of the operation portion. According to this, the operation unit and the other operation unit are disposed at positions facing each other between the front side and the back side of the outer cover 50. When the operation unit is operated, the user can hold the outer cover 5 from the front side and the back side. Hold. Further, the user who operates the operation portions grips the upper and lower sides of the upper and lower sides of the outer cover 50, 323330 39 201220109, so that the terminal device 7 can be held on the upper side and the terminal device 7 is supported by the hand 4 ( Refer to Figure 1 and Figure n). With the above-described inner valley, the user can stably hold the terminal device 7' while operating at least four operation portions, and can provide an operation device (terminal device 7) which is easy for the user to grip and has excellent operability. As described above, in the present embodiment, the user can easily hold the terminal device 7 by holding the terminal device 7 while holding the finger against the lower surface of the projection (the crotch portion 59). Further, since the second L key 54K and the second R key 54L are provided on the upper surface of the projection, the user can easily recognize the key in the above state. The user can easily hold the terminal device 7 in the following manner, for example. That is, as shown in FIG. 10, the user can also hold the ring finger against the underside of the crotch portion 59 (the single-dot chain line shown in FIG. 1), and hold the terminal (in the manner of the ring finger supporting the crotch portion 59). Device 7. At this time, the user can operate the four keys (the first L key 541, the irth key 54J, the second L key 54IC, the ® and the second R key 54L) with the index finger or the middle finger. For example, when the required game operation is more and more complicated, the majority of the keys can be easily operated by being held as shown in Fig. 1 . In addition, since the various types of rockers 53A and 53B are disposed on the upper side of the ten-key 54A and the keys 54E to 54H, the user can operate the analog rockers 53A and 53B by means of the thumb when relatively complicated operations are required. , and can be carried out conveniently. Further, in Fig. 10, the user puts the thumb against the surface of the outer cover 50, presses the index finger against the upper surface of the outer cover 5, and presses the middle finger against the upper surface of the crotch portion 59 of the outer cover 50, and the ring finger is pressed against the crotch portion 59. Below, the little finger is pressed against the back of the outer cover 50 to hold the terminal device 7. Thus, 323330 40 201220109 Can the user hold the terminal device tightly from the four sides of the outer cover 5? . Further, as shown in Fig. 11, the user can hold the middle finger against the bottom of the Zhan 59 (the single-dot chain line not shown in Fig. 11). 
At this time, the user's month b is enough to easily operate the two groups of the & key and the second key 541^), for example, 'when the required game operation is less and simpler, Hold it as shown in Figure u. In Fig. u, since the user can hold the lower side of the cover 5〇 by two fingers (the ring finger and the little finger), the terminal device 7 can be gripped tightly. In the present embodiment, the lower surface of the I portion 59 is disposed between the various types of ratio rockers 53A and 53B, and the cross key 54A and the four keys 5 to 54H (located in the various types of joysticks 53A and 53B). Below, and the cross key 5M and the four keys 54E to 54H are upper). Therefore, when the ring finger is held against the preparation unit 59 to hold the terminal device 7 (Fig. 1), it is easy to operate the various types of ratio rockers 53A and 53B with the thumb, and the middle finger is pressed against the jaw portion 59 to hold the terminal crack. When 7 (Fig. 11) is used, it is easy to operate the cross key 54A and the four keys 54e to _ 54H with the plum fingers. That is, regardless of the above two cases, the user can perform the direction input operation while holding the terminal device 7 tightly. Further, as described above, the user can also hold the terminal device 7 vertically. That is, as shown in Fig. 12, the user holds the upper side of the terminal device \ with his left hand, whereby the terminal can be held longitudinally. Further, as shown in (d), the user holds the lower side of the terminal device 7 with the left hand, whereby the terminal device 7 can be held longitudinally. In Figs. 12 and 13, the case where the terminal device 7 is held by the left hand is shown, but it is also possible to hold the terminal (4) with the right hand. In this way, since the user can hold the terminal device 7 with one hand, for example, 323330 201220109 can be operated by holding the terminal device 7 with one hand while inputting the touch panel 52 with the other hand. Further, when the terminal device 7 is held by the gripping method shown in Fig. 12, the user touches the lower surface of the crotch portion 59 with a finger other than the thumb (the middle finger, the ring finger, and the little finger in Fig. 12) (Fig. 12) The single-point chain line shown), whereby the terminal device 7 can be held tightly. In particular, in the present embodiment, since the crotch portion 59 is formed to extend left and right (upper and lower in FIG. 12), it is possible to hold the finger other than the thumb regardless of the position on the upper side of the terminal device 7 by the user. The crotch portion 59 can hold the terminal device 7 tightly. That is, the crotch portion 59 can be used as a handle when the user holds the terminal device 7 longitudinally for use. On the other hand, when the terminal device 7 is held by the holding mode shown in Fig. 13, the user can operate the keys 54B to 54D with the left hand. Therefore, for example, the touch panel 52 can be input with one hand, and the hand-to-keys 54B to 541) of the terminal device 7 can be operated, and more operations can be performed. In the terminal device 7 of the present embodiment, when the protruding portion (the crotch portion 59) is provided on the back surface, when the terminal device 7 is placed with the crotch surface of the LCD 51 (the surface of the outer cover 50) facing upward, the surface device 7 is placed. It is slightly tilted. 
Thereby, it is possible to view the screen more easily in the state in which the terminal device 7 is placed. Further, it is easy to perform an input operation on the touch panel 52 in a state where the terminal device 7 is placed. Further, in another embodiment, an additional protrusion having a height equal to that of the crotch portion 59 may be formed on the back surface of the outer cover 50. According to this, the projections are brought into contact with the ground while the screen of the LCD 51 is facing upward, and the terminal device 7 can be placed to horizontally. In addition, the protrusions of the chasing 42 323330 201220109 can be detachable (or foldable). According to this, the terminal device can be placed when both the state in which the face is slightly inclined and the state in which the face is horizontal. That is, the crotch portion 59 can be used as a foot when the terminal device 7 is placed for use. Each of the operation keys 54A to 54L is appropriately assigned a function corresponding to the game program. For example, the cross key 54A and the keys 54E to 54H can be used for a direction indicating operation or a selection operation, etc., and the keys 54B to 54E can be used for a decision operation or a cancel operation or the like. Further, the terminal device 7 may have a button for turning on/off the face display of the LCD 51 or a button for making a setting (pairing) with the game device 3. As shown in Fig. 8(a), the terminal device 7 is provided with a indicator portion 55 composed of a marker 55A and a marker 55B on the surface of the cover 50. The indicator portion 55 is provided on the upper side of the LCD 51. Each of the marker 55A and the indicator 55B is composed of one or more infrared LEDs, similarly to the markers 6R and 6L of the indicator device 6. The red φ outer-line LEDs constituting the markers 55A and 55B are disposed inside the window portion through which infrared rays can pass. The indicator unit 55 is similar to the above-described indicator device 6, and is used to cause the game device 3 to calculate the operation of the controller 5 or the like. Further, the game device 3 can control the lighting of the respective infrared LEDs provided in the indicator portion 55. The terminal device 7 is provided with a camera 56 as an imaging means. The camera 56 includes an imaging element (for example, a CMOS image sensor or a CCD image sensor) having a predetermined resolution and a lens. As shown in Fig. 8, in the present embodiment, the camera 56 is provided on the surface of the outer cover 50. Therefore, the camera 56 can image the face of the user holding the terminal device 7, for example, 43 323330 201220109, while viewing the LCD 51 while the user is playing the game. In the present embodiment, the camera 56 is disposed between the two markers 55A and 55B. The terminal device 7 is provided with a microphone 69 as a voice input means. A microphone hole 5〇c is provided on the surface of the outer cover 50. The microphone 69 is disposed inside the outer cover 50 in the microphone hole 50c. The microphone 69 detects the sound of the user and the like, and the sound around the terminal device 7. ^ s ^ 竣 7 is equipped with σ 八 77 as a means of sound output. As shown in Fig. 8(d), a horn hole 57 is provided on the lower side of the surface of the outer cover 50. The output sound of the speaker 77 is output from the horn hole 57. 
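Returning briefly to the marker section 55 described a little earlier in this passage: the game device 3 uses the positions of the markers 55A and 55B as they appear in the controller 5's captured image (the marker coordinates) to compute the controller's pointing and attitude. The patent gives no formula, so the midpoint-based mapping below is only an assumed, simplified illustration with hypothetical names.

```cpp
// Assumed simplification: the patent states that the marker coordinates are used to
// compute the controller's orientation/position but gives no formula; this midpoint
// mapping is shown purely for illustration.
struct Point2 { float x, y; };

// Convert the two marker coordinates (in pixels of the captured image) into a
// pointed-at position in normalized screen coordinates, each in the range [0, 1].
Point2 PointedScreenPosition(Point2 markerA, Point2 markerB, float imageWidth, float imageHeight)
{
    const Point2 mid{ (markerA.x + markerB.x) * 0.5f, (markerA.y + markerB.y) * 0.5f };
    // The captured image is mirrored with respect to the screen: moving the controller
    // to the right shifts the markers to the left in the image, hence the 1 - x term.
    return Point2{ 1.0f - mid.x / imageWidth, mid.y / imageHeight };
}
```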
In the present embodiment, the terminal block 7 has two horns, and a horn hole 57 is provided in each of the left horn and the right horn. The terminal device 7 is provided with a second knob 64 for adjusting the horn 77. Further, the terminal 褒f 7 is provided with a sound output terminal 62 for connecting a headphone or the like. Here, in consideration of the case where the attachment device is attached to the lower side surface of the outer cover, the sound wheel output terminal 62 and the knob 64 are provided on the upper side of the outer army 50, but may be disposed on the left side or the lower side. . Further, on the outer sheet 50, a window 63 for injecting an infrared signal from the infrared communication module 82 to the outside of the terminal split 7 is provided. Here, the window 63 is provided on the outer cover 5Q in such a manner that the infrared signal is incident on the front side of the user when the two sides of the 1XD 51 are held. In other embodiments, the window 63 can be disposed, for example, on the outer cover 5. The back of the cockroach waits for the position of Ren. Further, the terminal device 7 is provided with a navigation connector 58 for connecting another device to the terminal device 7. Expansion connector Qiuwei_ Communication terminal for receiving and transmitting data (information) between other devices connected to terminal 323330 44 201220109. In the present embodiment, as shown in Fig. 8(d), the expansion connector 58 is provided on the lower side surface of the outer cover 50. The other device connected to the expansion connector 58 may be any device such as a controller (gun type controller or the like) used for a specific game order or an input device such as a keyboard. If there is no need to connect an add-on device, the expansion connector 58 may not be provided. The expansion connector 58 may include a terminal for supplying power to the additional device, or I is a terminal for charging. Further, the terminal device 7 has a charging terminal 66 for taking power from the additional device in addition to the expansion connector 58. When the charging terminal 66 is connected to a stand 210 to be described later, power is supplied from the holder 21 to the terminal device 7. In the present embodiment, the charging terminal 66 is provided on the lower side surface of the outer cover 5A. Therefore, when the terminal device 7 and the additional device are connected (for example, the input device 2 shown in Fig. 15 or the input device 22 shown in Fig. 17)

至終端裝置7。本實名 罩50的下侧側面者, 50的上侧側面,彳曰; 施形態中’考量到附加裝置設置在外 充電連接器(蓋部61)係設置在外罩 亦可。又置在左右侧面或下側側面。 323330 45 201220109 此外,終端裝置7具有可裝卸於外罩50之電池蓋67。 於電池蓋67的内側配置有電池(第14圖所示之電池85)。 本實施形態中,電池蓋67係設置在外罩50的背側,並設 置在突起部(簷部59)的下側。 此外,於終端裝置7的外罩50上,設置有用以讓吊帶 的帶子綁住固定之孔65a及65b。如第8圖(d)所示,本實 施形態中,孔65a及65b係設置在外罩50的下面。此外, 本實施形態中,2個孔65a及65b係在外罩50的左右兩侧 ® 分別設置1個。亦即,孔65a設置在外罩50的下面較中央 為靠左側處,孔65b設置在外罩50的下面較中央為靠右側 處。使用者可將吊帶綁住於孔65a及65b中的任一個,並 將吊帶綁住於本身的手腕。藉此,即使當使用者不小心使 終端裝置7掉落或是使終端裝置7從手中脫離時,亦可防 止終端裝置7掉落或與其他物品碰撞。本實施形態中,由 於在左右兩侧分別設置有孔,因此使用者可將吊帶綁住於 ^ 任一隻手,因而極為便利。 關於第8圖至第13圖所示之終端裝置7,各操作鍵或 外罩50的形狀,或是各構成要素的數目及設置位置等僅僅 為一例,亦可為其他形狀、數目及設置位置。 接著參照第14圖來說明終端裝置7的内部構成。第 14圖係顯示終端裝置7的内部構成之方塊圖。如第14圖 所示,終端裝置7除了第8圖所示之構成外,亦具備:觸 控面板控制器71、磁性感測器72、加速度感測器73、迴 轉感測器74、使用者介面控制器(UI控制器)75、編解碼器 46 323330 201220109 LSI 76、喇叭77、聲音IC 78、麥克風79、無線模組80、 天線81、紅外線通訊模組82、快閃記憶體83、電源1C 84、 及電池85。此等電子零件係構裝於電子電路基板上並收納 於外罩50内。 UI控制器75,為用以對各種輸出輸入部控制資料的輸 出輸入之電路。UI控制器75,係連接於觸控面板控制器 71、類比搖桿53(類比搖桿53A及53B)、操作鍵54(各操 作鍵54A至54L)、標示部55、磁性感測器72、加速度感 ® 測器73、及迴轉感測器74。此外,UI控制器75係與編解 碼器LSI 76及擴充連接器58連接。此外,電源1C 84連 接於UI控制器75,並經由UI控制器75將電力供給至各 部。内建的電池85連接於電源1C 84並供給電力。此外, 可將經由充電連接器等從外部電源取得電力之充電器77 或纜線連接於電源1C 84,終端裝置7,當使用該充電器 86或纜線從外部電源供給電力時,可進行充電。終端裝置 φ 7,亦可藉由將終端裝置7裝著於未圖示之具有充電功能的 座充來進行充電。亦即,雖然圖中未顯示,但可經由充電 端子66,將可從外部電源取得電力之座充(第20圖所示之 支架210)連接於電源1C 84,終端裝置7可使用該座充來 進行來自外部電源的電力供給以及充電。 觸控面板控制器71連接於觸控面板52,為進行觸控 面板52的控制之電路。觸控面板控制器71係根據來自觸 控面板52的訊號,生成預定形式的觸控位置資料並輸出至 UI控制器75。觸控位置資料,係顯示出在觸控面板52的 47 323330 201220109 輸入面上進行輸入後之位置的座標。觸控面板控制器71係 讀取來自觸控面板52的訊號,並以每隔預定時間為1次之 比率來生成觸控位置資料。此外,對觸控面板52所進行之 各種控制指示,係從UI控制器75輸出至觸控面板控制器 '71 ° 類比搖桿53,係將顯示出以使用者的手指進行操作之 搖桿部所滑動(或傾倒)之方向及量之搖桿資料輸出至UI 控制器75。此外,操作鍵54係將顯示出對各操作鍵54A β 至54L所進行之輸入狀況(是否壓下)之操作鍵資料輸出至 UI控制器75。 磁性感測器72,係藉由偵測磁場的大小及方向來偵測 方位。顯示出所偵測之方位的方位資料被輸出至UI控制器 75。此外,對磁性感測器72所進行之控制指示,係從UI 控制器75輸出至磁性感測器72。關於磁性感測器72,有 採用MI(磁性阻抗)元件、磁通量閘感測器、霍爾元件、GMR φ (巨量磁性電阻)元件、TMR(穿隧磁性電阻)元件、或是AMR (異向磁性電阻)元件等之感測器,但只要可偵測方位者均 可使用。嚴格來說,在除了地磁之外產生磁場之場所中, 所得之方位資料未顯示出方位,但即使在此情況下,由於 終端裝置7移動時方位資料亦產生變化,因此可算出終端 裝置7的姿勢變化。 加速度感測器73係設置在外罩50内部,並偵測出沿 著3軸(第8圖的(a)圖所示之xyz軸)方向之直線加速度的 大小。具體而言,加速度感測器73以外罩50的長邊方向 48 323330 201220109 為x軸,以垂直於外罩50的表面之方向為y軸,以外罩 5 0的短邊方向為z軸,來债測出各軸之直線加速度的值。 顯示出所偵測之加速度之加速度資料被輸出至UI控制器 7 5。此外,對加速度感測器7 3所進行之控制指示,係從 UI控制器75輸出至加速度感測器73。加速度感測器73在 本實施形態例如為靜電電容型的MEMS型加速度感測器,但 在其他實施形態中,亦可使用其他方式的加速度感測器。 此外,加速度感測器73亦可為偵測出1軸或2軸方向之加 ®速度感測器。 迴轉感測器74係設置在外罩50内部,並偵測出繞著 上述X軸、y軸、及z軸的3軸之角速度。顯示出所偵測 之角速度之角速度資料被輸出至UI控制器75。此外,對 迴轉感測器74所進行之控制指示,係從UI控制器75輸出 至迴轉感測器74。用於偵測出3軸的角速度之迴轉感測器 的數目及組合可為任意,迴轉感測器74,與迴轉感測器48 ^ 相同,可由2軸迴轉感測器與1軸迴轉感測器所構成。此 外,迴轉感測器7 4亦可為偵測出1軸或2軸方向之迴轉感 測器。 UI控制器75,係將包含從上述各構成要素所接收之觸 控位置資料、搖桿資料、操作鍵資料、方位資料、加速度 資料、及角速度資料之操作資料輸出至編解碼器LSI 76。 當經由擴充連接器58將其他裝置與終端裝置7連接時,上 述操作資料更可包含顯示出對該其他裝置所進行之操作之 資料。 49 323330 201220109 編解碼器LSI 76為對傳送至遊戲裝置3之資料進行壓 縮處理,以及對從遊戲裝置3所傳送來之資料進行解壓縮 處理的電路。於編解碼器LSI 76,連接有LCD 51、攝影機 56、聲音1C 78、無線模組80、快閃記憶體83、及紅外線 通訊模組82。此外,編解碼器LSI 76包含CPU 87及内部 記憶體88。終端裝置7雖然構成為不進行遊戲處理本身, 但必須執行用於終端裝置7的管理或通訊之最低程度的程 式。開啟電源時將儲存於快閃記憶體83之程式讀取至内部 ® 記憶體88並由CPU 87來執行,藉此來啟動終端裝置7。 此外,内部記憶體88的一部分區域係用作為LCD 51的 VRAM。 攝影機56係依循來自遊戲裝置3的指示將圖像予以攝 像,並將攝像後的圖像資料輸出至編解碼器LSI 76。此外, 圖像的攝像指示等之對攝影機56所進行之控制指示,係從 編解碼器LSI 76輸出至攝影機56。攝影機56亦可進行動 φ 晝的攝影。亦即,攝影機56可進行重複攝像並將圖像資料 重複輸出至編解碼器LSI 76。 聲音1C 78係連接於喇77及麥克風79,且為對喇 叭77及麥克風79控制聲音資料的輸出輸入之電路。亦即, 當從編解碼器LSI 76接收聲音資料時,聲音1C 78係將對 該聲音資料進行D/A轉換所得之聲音訊號輸出至喇叭77, 並從喇《八77輸出聲音。此外,麥克風79偵測出傳達至終 端裝置7之聲音(使用者的聲音等),並將顯示該聲音之聲 音訊號輸出至聲音1C 78。聲音1C 78對來自麥克風79的 50 323330 201220109 聲音訊號進行A/D轉換,並將預定形式的聲音資料輸出至 編解碼器LSI 76。 編解碼器LSI 76,係將來自攝影機56的圖像資料、 來自麥克風79的聲音資料、以及來自UI控制器75的操作 資料,作為終端操作資料經由無線模組80傳送至遊戲裝置 3。本實施形態中,編解碼器LSI 76對圖像資料及聲音資 料進行與編解碼器LSI 27相同的壓縮處理。上述終端操作 資料以及壓縮後的圖像資料及聲音資料,係作為傳送資料 ® 
被輸出至無線模組80。於無線模組80連接有天線81,無 線模組80經由該天線81將上述傳送資料傳送至遊戲裝置 3。無線模組80,具有與遊戲裝置3的終端通訊模組28同 樣功能。亦即,無線模組80係具有例如藉由依據IEEE 802. 1 In規格之方式而連接於無線LAN之功能。所傳送之 資料,可因應必要進行編碼或不進行編碼。 如上所述,從終端裝置7傳送至遊戲裝置3之傳送資 φ 料中,係包含操作資料(終端操作資料)、圖像資料、及聲 音資料。當經由擴充連接器58將其他裝置與終端裝置7連 接時,上述傳送資料更可包含從該其他裝置所接收之資 料。此外,紅外線通訊模組82,在與其他裝置之間可進行 例如依循IRM規格之紅外線通訊。編解碼器LSI 76,可 因應必要將經由紅外線通訊所接收之資料包含於上述傳送 資料而傳送至遊戲裝置3。 此外,如上所述,壓縮後的圖像資料及聲音資料係從 遊戲裝置3傳送至終端裝置7。此等資料經由天線81及無 51 323330 201220109 線模組80而被編解碼器LSI 76接收。編解碼器lsi 76將 接收的圖像資料及聲音資料解屋縮。解壓縮後的圖像資料 被輸出至LCD 5卜而在LCD 51上顯示圖像。亦即,編解 碼器LSI 76(CPU 87)將所接收之圖像資料顯示於顯示部。 此外,解壓縮後的聲音資料被輸出至聲音IC78,聲音〖CM 從喇77輸出聲音。 此外,當從遊戲裝置3所接收之資料中包含控制資料 時,編解碼器LSI 76及UI控制器75係對各部進行依循控 籲制資料之控制指示。如上所述,控制資料為表示出對終端 裝置7所具備之各構成要素(本實施形態巾,騎影機%、 觸控面板控制器71、標示部55、各感測器犯至74、及紅 外線通訊模組82)所進行之控制指示的資料。本實施形態 中,控制資料所表示之控制指示,可考量為使上述各構成 要素動作、或是使動作休止(停止)之指示。亦即,對於遊 戲中未使用之構成要素,為了抑制電力消耗可進行休止, 籲此時’從終端裝置7傳送至遊戲裝置3之傳送資料中,係 設為不包含來自休止的構成要素之資料。由於標示部巧為 紅外線LED ’所以該控制可僅設為電力供給的導通/關閉。 如上所述,終端褒置7具備觸控面板52、類比搖桿53、 及操作鍵54之操作手段,但在其他實施形態中,可構成為 具備其他操作手段來取代此等操作手段或一同具備。 此外’終端裝置7係具備磁性感測器72、加速度感測 器73、及迴轉感測器74,作為用以算出終端褒置了的動作 (包含位置或姿勢或是位置或姿勢的變化)之制器,但在 323330 52 201220109 其他實施形態中,可構成為僅具備此等感測器中的1個或 2個。此外,其他實施形態中,可構成為具備其他感測器 來取代此等感測器或一同具備。 此外,終端裝置7係具備攝影機56及麥克風79而構 成,但在其他實施形態中,亦可不具備攝影機56及麥克風 79或僅具備當中任一個。 此外,終端裝置7係具備標示器55作為用以算出終端 裝置7與控制器5之位置關係(從控制器5觀看時之終端裝 置7的位置及/或姿勢等)的構成,但在其他實施形態中, 亦可不具備標示器55而構成。此外,其他實施形態中,終 端裝置7可具備其他手段作為用以算出上述位置關係的構 成。例如,其他實施形態中,控制器5可具備標示部且終 端裝置7具備攝像元件而構成。此外,此時標示裝置6可 具備攝像元件來取代紅外線LED而構成。 (附加裝置的構成) 接著參照第15圖至第20圖,說明可裝著(連接)於終 端裝置7之附加裝置的例子。附加裝置可具有任意功能, 例如,為了進行預定操作而裝著於終端裝置7之追加的操 作裝置,或是對終端裝置7進行供電之充電器,或是用以 使終端裝置7豎立為預定姿勢之支架。 如第8圖(d)及第9圖所示,於突起部(簷部59)的下 面,設置有附加裝置所具有的爪部能夠卡止之卡止孔59a 及59b。卡止孔59a及59b係在連接其他附加裝置於終端 裝置7時所用。亦即,附加裝置具有可卡止於卡止孔59a 53 323330 201220109 P田將附加裝置連接於終端裝置7時,择由 使爪部卡止於卡止孔59a « 而政处 f 错由 5此及59b而將終端裝置7與附加梦 置固定。此外,於卡止孔59a及59b的内部更可設置= 孔,並以螺絲堅固地固定附加裝置。在此,設置在線 置7的背面之突起部,為具有詹狀的形狀之廣部59。亦^ 詹部59係往左右方向延伸設置。如第9圖所示,卡止孔 59a及59b係設置在詹部59的下面之(左右方向上)的 附近二設置在詹部59的下面之卡止孔59a及_的個數可 為任意數目’為1時,較録設£在料59的巾央,為複 數個犄,較佳配置為左右對稱。根據此,可均等地保持左 右均衡而安定地連接附加襄置。此外,當卡止孔設置在中 央附近時’與設置在左右兩端時相比,可縮小附加裝置的 大小。亦即,簷部59可用作為附加襞置的卡止構件。 此外,本實施形態中,如第8圖((1)所示,外罩5〇的 下面設置有卡止孔50a及50b。因此,當將附加裝置連接 於終端裝置7時,藉由使4個爪部分別卡止於4個各卡止 孔而將終端裝置7與附加裝置固定。藉此更能夠將附加裝 置堅固地固定在終端裝置7。於卡止孔5〇a及5〇b的内部 亦可设置螺絲孔,以將附加裝置螺合固定。此外,其他實 施形態中,設置在外罩之卡止孔可為任意配置。 第15圖及第16圖係顯示將附加裝置裝著於終端裝置 7之一例圖。第15圖為從終端裝置7的表面侧觀看終端裝 置7及輸入裝置200之圖,第16圖為從終端裝置7的背面 侧觀看終端裝置7及輸入裝置200之圖。第15圖及第16 323330 54 201220109 圖中^為附加裝置的輪人裝置钱著於終端裝置7。 剧入’置_係具備第1握把部2QQa及第2握把部 200b。各握把部2〇〇a及?nnu、 2〇〇b分別為棒狀(柱狀)形狀,使 者可早手握持。使用者可握持各振把部 200a 及 2〇〇b 中 2方來使用輸入裝置2〇0(及終端裝置7),或是握持兩方 來使用輸人裝置糊。輪人裝置亦可構成為僅具備i 個握把部。此外,輸入裂置_具備支撐部205。本實施To the terminal device 7. The lower side of the cover 50, the upper side of the cover 50, and the upper side of the cover 50 are considered to be attached to the outer charging connector (the cover portion 61). It is placed on the left and right side or the lower side. 323330 45 201220109 Further, the terminal device 7 has a battery cover 67 that is detachable from the outer cover 50. A battery (battery 85 shown in Fig. 14) is disposed inside the battery cover 67. In the present embodiment, the battery cover 67 is provided on the back side of the outer cover 50, and is provided on the lower side of the protruding portion (the crotch portion 59). Further, on the outer cover 50 of the terminal device 7, holes 65a and 65b for attaching the straps of the straps to the fixing are provided. As shown in Fig. 8(d), in the present embodiment, the holes 65a and 65b are provided on the lower surface of the outer cover 50. 
Further, in the present embodiment, the two holes 65a and 65b are provided one on the left side and one on the right side of the outer cover 50. That is, the hole 65a is provided on the lower surface of the outer cover 50 to the left of the center, and the hole 65b is provided on the lower surface of the outer cover 50 to the right of the center. The user can tie a strap to either of the holes 65a and 65b and fasten the strap to his or her own wrist. Thus, even if the user accidentally drops the terminal device 7 or the terminal device 7 slips out of the user's hand, the terminal device 7 is prevented from falling or colliding with other objects. In the present embodiment, since a hole is provided on each of the left and right sides, the user can conveniently attach the strap to either hand. Regarding the terminal device 7 shown in Figs. 8 to 13, the shapes of the operation keys and of the outer cover 50, as well as the number and arrangement of the components, are merely examples; other shapes, numbers and arrangements may be used. Next, the internal configuration of the terminal device 7 will be described with reference to Fig. 14. Fig. 14 is a block diagram showing the internal configuration of the terminal device 7. As shown in Fig. 14, the terminal device 7 includes, in addition to the configuration shown in Fig. 8, a touch panel controller 71, a magnetic sensor 72, an acceleration sensor 73, a gyro sensor 74, a user interface controller (UI controller) 75, a codec LSI 76, speakers 77, a sound IC 78, a microphone 79, a wireless module 80, an antenna 81, an infrared communication module 82, a flash memory 83, a power supply IC 84, and a battery 85. These electronic components are mounted on an electronic circuit board and housed in the outer cover 50. The UI controller 75 is a circuit for controlling the input and output of data to and from the various input/output sections. The UI controller 75 is connected to the touch panel controller 71, the analog sticks 53 (the analog sticks 53A and 53B), the operation keys 54 (the operation keys 54A to 54L), the marker section 55, the magnetic sensor 72, the acceleration sensor 73, and the gyro sensor 74. The UI controller 75 is further connected to the codec LSI 76 and the expansion connector 58. The power supply IC 84 is connected to the UI controller 75, and power is supplied to the various sections via the UI controller 75. The built-in battery 85 is connected to the power supply IC 84 and supplies power. A charger 86 or a cable capable of obtaining power from an external power source can be connected to the power supply IC 84 via a charging connector or the like, and the terminal device 7 can be charged with power supplied from the external power source using the charger 86 or the cable. The terminal device 7 can also be charged by placing it on a cradle (not shown) having a charging function. That is, although not shown in the drawings, a cradle (the stand 210 shown in Fig. 20) capable of obtaining power from an external power source can be connected to the power supply IC 84 via the charging terminal 66, so that the terminal device 7 can receive power supply and charging from the external power source through the cradle. The touch panel controller 71 is a circuit which is connected to the touch panel 52 and controls the touch panel 52. The touch panel controller 71 generates touch position data in a predetermined format based on the signals from the touch panel 52, and outputs it to the UI controller 75.
The touch position data represents the coordinates of the position at which an input has been made on the input surface of the touch panel 52. The touch panel controller 71 reads the signals from the touch panel 52 and generates touch position data at a rate of once per predetermined period of time. Various control instructions for the touch panel 52 are output from the UI controller 75 to the touch panel controller 71. Each analog stick 53 outputs, to the UI controller 75, stick data representing the direction and the amount by which the stick portion operated by the user's finger has been slid (or tilted). The operation keys 54 output, to the UI controller 75, operation key data representing the input state of each of the operation keys 54A to 54L (whether or not it is pressed). The magnetic sensor 72 detects the azimuthal direction by sensing the magnitude and direction of the magnetic field, and azimuth data representing the detected direction is output to the UI controller 75. Control instructions for the magnetic sensor 72 are output from the UI controller 75 to the magnetic sensor 72. As the magnetic sensor 72, a sensor using an MI (magnetic impedance) element, a fluxgate sensor, a Hall element, a GMR (giant magnetoresistance) element, a TMR (tunnel magnetoresistance) element, an AMR (anisotropic magnetoresistance) element, or the like may be used, as long as it can detect the azimuthal direction. Strictly speaking, in a place where a magnetic field other than geomagnetism is present, the obtained azimuth data does not indicate the true azimuth; even in such a case, however, the azimuth data changes when the terminal device 7 moves, so a change in the attitude of the terminal device 7 can still be calculated. The acceleration sensor 73 is provided inside the outer cover 50 and detects the magnitude of the linear acceleration along each of three axes (the x, y and z axes shown in Fig. 8(a)). Specifically, the acceleration sensor 73 detects the value of the linear acceleration along each axis, where the long-side direction of the outer cover 50 is the x axis, the direction perpendicular to the surface of the outer cover 50 is the y axis, and the short-side direction of the outer cover 50 is the z axis. Acceleration data representing the detected acceleration is output to the UI controller 75, and control instructions for the acceleration sensor 73 are output from the UI controller 75 to the acceleration sensor 73. In the present embodiment, the acceleration sensor 73 is, for example, a capacitive MEMS acceleration sensor, but other types of acceleration sensors may be used in other embodiments; the acceleration sensor 73 may also be one that detects acceleration in only one or two axial directions. The gyro sensor 74 is provided inside the outer cover 50 and detects the angular velocities about the three axes (the x, y and z axes described above). Angular velocity data representing the detected angular velocities is output to the UI controller 75, and control instructions for the gyro sensor 74 are output from the UI controller 75 to the gyro sensor 74. Any number and combination of gyro sensors may be used for detecting angular velocities about three axes; like the gyro sensor 48, the gyro sensor 74 may be composed of a two-axis gyro sensor and a one-axis gyro sensor. The gyro sensor 74 may also be one that detects angular velocity about only one or two axes.
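Purely as an aid to the reader, and not as part of the disclosed embodiment, one input sample as the UI controller 75 might aggregate it from the components just listed can be pictured as a small record. The following C sketch is illustrative only; the type name, field names and field widths are assumptions introduced here and are not taken from the patent.

/* Illustrative sketch: one input sample as the UI controller 75 could
 * aggregate it before handing it to the codec LSI 76.
 * All names and field widths are assumptions. */
#include <stdint.h>
#include <stdbool.h>

typedef struct {
    bool     touch_valid;               /* whether the touch panel 52 currently reports a touch */
    uint16_t touch_x, touch_y;          /* touch position data (panel coordinates)              */

    int16_t  stick_left_x, stick_left_y;    /* analog stick 53A: slide/tilt direction and amount */
    int16_t  stick_right_x, stick_right_y;  /* analog stick 53B                                   */

    uint32_t key_bits;                  /* one bit per operation key 54A..54L (pressed = 1)      */

    int16_t  mag[3];                    /* magnetic sensor 72: field vector (azimuth data)       */
    int16_t  accel[3];                  /* acceleration sensor 73: x, y, z linear acceleration   */
    int16_t  gyro[3];                   /* gyro sensor 74: angular velocity about x, y, z        */
} terminal_operation_sample_t;

Such a record corresponds to the operation data that, as described next, the UI controller 75 passes on to the codec LSI 76.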
The UI controller 75 outputs the operation data including the touch position data, the joystick data, the operation key data, the orientation data, the acceleration data, and the angular velocity data received from the respective constituent elements to the codec LSI 76. When other devices are connected to the terminal device 7 via the expansion connector 58, the operational data may further include information showing operations performed on the other devices. 49 323330 201220109 The codec LSI 76 is a circuit that compresses the data transmitted to the game device 3 and decompresses the data transmitted from the game device 3. The codec LSI 76 is connected to an LCD 51, a camera 56, a sound 1C 78, a wireless module 80, a flash memory 83, and an infrared communication module 82. Further, the codec LSI 76 includes a CPU 87 and an internal memory 88. The terminal device 7 is configured not to perform the game processing itself, but must perform the minimum degree of management or communication for the terminal device 7. When the power is turned on, the program stored in the flash memory 83 is read to the internal ® memory 88 and executed by the CPU 87, whereby the terminal device 7 is activated. Further, a part of the area of the internal memory 88 is used as the VRAM of the LCD 51. The camera 56 images the image in accordance with an instruction from the game device 3, and outputs the image data after the image capture to the codec LSI 76. Further, a control instruction for the camera 56, such as an image capturing instruction, is output from the codec LSI 76 to the camera 56. The camera 56 can also perform photography of moving φ 昼. That is, the camera 56 can perform repeated imaging and repeatedly output image data to the codec LSI 76. The sound 1C 78 is connected to the slave 77 and the microphone 79, and is a circuit for controlling the output of the sound data to the speaker 77 and the microphone 79. That is, when the sound data is received from the codec LSI 76, the sound 1C 78 outputs an audio signal obtained by D/A conversion of the sound data to the speaker 77, and outputs sound from the "eighth 77". Further, the microphone 79 detects the sound transmitted to the terminal device 7 (the user's voice or the like), and outputs the sound signal for displaying the sound to the sound 1C 78. The sound 1C 78 A/D converts the 50 323330 201220109 sound signal from the microphone 79, and outputs the sound data of a predetermined form to the codec LSI 76. The codec LSI 76 transmits image data from the camera 56, sound data from the microphone 79, and operation data from the UI controller 75 to the game device 3 via the wireless module 80 as terminal operation data. In the present embodiment, the codec LSI 76 performs the same compression processing as the codec LSI 27 on the image data and the audio data. The terminal operation data and the compressed image data and sound data are output to the wireless module 80 as transmission data. An antenna 81 is connected to the wireless module 80, and the wireless module 80 transmits the transmission data to the game device 3 via the antenna 81. The wireless module 80 has the same function as the terminal communication module 28 of the game device 3. That is, the wireless module 80 has a function of being connected to the wireless LAN by, for example, a method according to the IEEE 802.1 In specification. The transmitted data may or may not be encoded as necessary. 
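As a rough illustration of the transmission path just described, in which the compressed camera image and microphone sound are bundled with the operation data and handed to the wireless module 80, the following C sketch outlines one conceivable send routine. It is a sketch under stated assumptions, not the device's actual firmware; the helper names compress_image, compress_audio and wireless_send, and the buffer type, are hypothetical.

/* Illustrative sketch of bundling one frame of transmission data.
 * All helpers are hypothetical; this is not an actual API of the terminal device 7. */
#include <stddef.h>
#include <stdint.h>

typedef struct { const uint8_t *data; size_t len; } buffer_t;

/* Assumed helpers: compression comparable to that performed by the codec LSI,
 * and a send routine for the wireless LAN link (e.g. IEEE 802.11n-style). */
extern buffer_t compress_image(buffer_t raw_camera_frame);
extern buffer_t compress_audio(buffer_t raw_microphone_pcm);
extern int      wireless_send(const buffer_t parts[], size_t n_parts);

int send_transmission_data(buffer_t operation_data,
                           buffer_t camera_frame,
                           buffer_t microphone_pcm)
{
    buffer_t parts[3];
    parts[0] = operation_data;                 /* terminal operation data (sent as-is)        */
    parts[1] = compress_image(camera_frame);   /* camera 56 image data, compressed             */
    parts[2] = compress_audio(microphone_pcm); /* microphone 79 sound data, compressed         */
    return wireless_send(parts, 3);            /* handed to the wireless module 80 / antenna 81 */
}

Note that, as described above, only the image data and the sound data are compressed before transmission; the operation data is comparatively small and is included as is.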
As described above, the transmission information transmitted from the terminal device 7 to the game device 3 includes operation data (terminal operation data), image data, and sound data. When other devices are connected to the terminal device 7 via the expansion connector 58, the above-mentioned transmission data may further contain information received from the other devices. In addition, the infrared communication module 82 can perform infrared communication, for example, in accordance with IRM specifications, with other devices. The codec LSI 76 can transmit the data received via the infrared communication to the game device 3 by including the above-mentioned transmission data as necessary. Further, as described above, the compressed image data and sound data are transmitted from the game device 3 to the terminal device 7. These data are received by the codec LSI 76 via the antenna 81 and the line module 80 without the 51 323330 201220109. The codec lsi 76 decompresses the received image data and sound data. The decompressed image data is output to the LCD 5 to display an image on the LCD 51. That is, the codec LSI 76 (CPU 87) displays the received image data on the display unit. Further, the decompressed sound data is output to the sound IC 78, and the sound 〖CM outputs sound from the slave 77. Further, when the control data is included in the material received from the game device 3, the codec LSI 76 and the UI controller 75 instruct the respective units to follow the control information of the control data. As described above, the control data indicates that each component included in the terminal device 7 is present (in the present embodiment, the rider%, the touch panel controller 71, the indicator 55, and the sensors are 74, and The information of the control indication performed by the infrared communication module 82). In the present embodiment, the control instruction indicated by the control data may be an instruction to operate the above-described components or to stop (stop) the operation. In other words, in order to suppress the power consumption, the components that are not used in the game can be stopped. In this case, the data transmitted from the terminal device 7 to the game device 3 is set to include data from the components of the suspension. . Since the indicator portion is an infrared LED', the control can be set only to turn on/off the power supply. As described above, the terminal device 7 includes the operation means of the touch panel 52, the analog rocker 53, and the operation key 54, but in other embodiments, other operation means may be provided instead of or in addition to the operation means. . Further, the terminal device 7 includes a magnetic sensor 72, an acceleration sensor 73, and a rotation sensor 74 as calculations for calculating the position (including position or posture or position or posture change) of the terminal. However, in another embodiment, 323330 52 201220109 may be configured to include only one or two of the sensors. Further, in other embodiments, other sensors may be provided instead of or in addition to the sensors. Further, the terminal device 7 is configured to include the camera 56 and the microphone 79. However, in other embodiments, the camera 56 and the microphone 79 may not be provided or only one of them may be provided. 
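The control data described in this passage essentially tells the terminal device 7 which of its components to operate and which to suspend, so that data from suspended components is no longer included in the transmission and power consumption is reduced. One minimal way to model this, offered purely as an illustration and not as the actual format exchanged with the game device 3, is a bitmask of component enables:

/* Illustrative sketch: control data modeled as a bitmask of component enables.
 * The bit layout is an assumption, not the format used by the game device 3. */
#include <stdint.h>
#include <stdbool.h>

enum {
    COMP_CAMERA   = 1u << 0,  /* camera 56                                        */
    COMP_TOUCH    = 1u << 1,  /* touch panel controller 71                        */
    COMP_MARKER   = 1u << 2,  /* marker section 55 (infrared LEDs: power on/off)  */
    COMP_MAGNETIC = 1u << 3,  /* magnetic sensor 72                               */
    COMP_ACCEL    = 1u << 4,  /* acceleration sensor 73                           */
    COMP_GYRO     = 1u << 5,  /* gyro sensor 74                                   */
    COMP_INFRARED = 1u << 6   /* infrared communication module 82                 */
};

static uint32_t enabled_components;

void apply_control_data(uint32_t control_bits)
{
    /* suspend every component whose bit is 0, operate the others */
    enabled_components = control_bits;
}

bool component_is_enabled(uint32_t comp)
{
    /* data from suspended components is simply omitted from the transmission data */
    return (enabled_components & comp) != 0;
}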
Further, the terminal device 7 includes a marker 55 as a configuration for calculating a positional relationship between the terminal device 7 and the controller 5 (a position and/or a posture of the terminal device 7 when viewed from the controller 5, etc.), but other implementations In the form, the indicator 55 may not be provided. Further, in other embodiments, the terminal device 7 may be provided with other means as a configuration for calculating the positional relationship. For example, in another embodiment, the controller 5 may be provided with an indicator portion and the terminal device 7 may be provided with an imaging element. Further, at this time, the indicator device 6 may be provided with an image pickup element instead of the infrared LED. (Configuration of Attachment Device) Next, an example of an attachment device that can be attached (connected) to the terminal device 7 will be described with reference to Figs. 15 to 20 . The attachment device may have any function, for example, an additional operation device attached to the terminal device 7 for performing a predetermined operation, or a charger for supplying power to the terminal device 7, or for erecting the terminal device 7 to a predetermined posture. The bracket. As shown in Figs. 8(d) and 9, the locking holes 59a and 59b which the claws of the attachment device can lock are provided below the projections (the crotch portion 59). The locking holes 59a and 59b are used when connecting other attachment means to the terminal device 7. That is, the attachment device has a lockable hole 59a 53 323330 201220109. When the attachment device is connected to the terminal device 7, the claw is locked to the locking hole 59a « and the government is wrong. And the terminal device 7 and the additional dream are fixed by 59b. Further, a = hole may be provided inside the locking holes 59a and 59b, and the attachment is firmly fixed by screws. Here, the projection on the back surface of the wire 7 is provided as a wide portion 59 having a shape of a Jane shape. Also ^ Zhan Department 59 series is extended in the left and right direction. As shown in Fig. 9, the locking holes 59a and 59b are provided in the vicinity of the lower side of the Zhan portion 59 (in the left-right direction). The number of the locking holes 59a and _ provided under the Zhan portion 59 can be arbitrary. When the number ' is 1, it is more than a 犄, which is preferably placed in the center of the towel of the material 59. According to this, the additional device can be stably connected while maintaining the right and left equalization. Further, when the locking hole is disposed near the center, the size of the attachment can be reduced as compared with when it is disposed at the left and right ends. That is, the crotch portion 59 can be used as a locking member for the additional device. Further, in the present embodiment, as shown in Fig. 8 ((1), the locking holes 50a and 50b are provided on the lower surface of the outer cover 5A. Therefore, when the attachment device is connected to the terminal device 7, four are provided. The claw portions are respectively locked to the four locking holes to fix the terminal device 7 and the attachment device. Thereby, the attachment device can be firmly fixed to the terminal device 7. The inside of the locking holes 5〇a and 5〇b A screw hole may be provided to screw the attachment to the attachment. Further, in other embodiments, the locking hole provided in the cover may be arbitrarily arranged. 
Figs. 15 and 16 show an example in which an additional device is attached to the terminal device 7. Fig. 15 is a view of the terminal device 7 and the input device 200 as seen from the front side of the terminal device 7, and Fig. 16 is a view of the terminal device 7 and the input device 200 as seen from the back side of the terminal device 7. In Figs. 15 and 16, the input device 200, which is an example of the additional device, is attached to the terminal device 7. The input device 200 includes a first grip portion 200a and a second grip portion 200b. Each of the grip portions 200a and 200b has a rod-like (columnar) shape and can be held by the user with one hand. The user can use the input device 200 (and the terminal device 7) while holding only one of the grip portions 200a and 200b, or while holding both of them. The input device 200 may also be configured to include only one grip portion. The input device 200 further includes a support portion 205.

In the present embodiment, the support portion 205 supports the back surface (inner surface) of the terminal device 7. Specifically, the support portion 205 has four claw portions (protruding portions), and the four claw portions can be respectively locked into the locking holes 50a, 50b, 59a and 59b. As shown in Fig. 15, when the input device 200 is connected to the terminal device 7, the terminal device 7 and the additional device are fixed together by locking the four claw portions into the locking holes 50a, 50b, 59a and 59b, respectively. In this way, the input device 200 can be firmly fixed to the terminal device 7. In other embodiments, in addition to (or instead of) the locking between the claw portions and the locking holes, the input device 200 may be fixed to the terminal device 7 even more firmly, for example by screwing the input device 200 to the terminal device 7. The screwed position may be any position; for example, the support portion 205 of the input device 200, which abuts against the back surface of the outer cover 50, may be screwed to the eaves portion 59. Thus, in the present embodiment, the additional device can be tightly fixed to the terminal device 7 by means of the locking holes 59a and 59b. Since the terminal device 7 has sensors for detecting the motion and the inclination of the terminal device 7 (the magnetic sensor 72, the acceleration sensor 73 and the gyro sensor 74), the terminal device 7 itself may also be moved during use. For example, when the input device 200 shown in Figs. 15 and 16 is connected to the terminal device 7, the user may hold the grip portion 200a and/or the grip portion 200b of the input device 200 and move the input device 200 as if it were a gun. When, as in the present embodiment, the terminal device 7 itself is assumed to be moved during use, tightly fixing the additional device by means of the locking holes 59a and 59b is particularly effective. Further, in the present embodiment, the support portion 205, when the first grip portion 200a (or the second grip portion 200b) is oriented in the vertical direction, keeps the screen of the LCD 51 oriented substantially vertically, and in this manner detachably supports the terminal
The device 7 ^ each of the grip portions 200a and 200b is formed to be substantially parallel to the display portion (the surface of the outer cover 50) of the terminal device 7 connected to the input device 2A. In other words, each of the grip portions 200a and 200b is formed in the vertical direction of the display portion of the terminal device 7 connected to the input device 2A. As described above, the input device 200 connects the display unit of the terminal device 7 to the terminal device 7 in a posture toward the user when the user holds the input device 200. The user holds the grip portions 200a and 200b substantially at least vertically, and φ allows the face of the display portion to face toward the user'. Therefore, the input device 200 can be used while viewing the face of the display portion. operating. In the present embodiment, the second grip portion 200b is oriented substantially parallel to the first grip portion 200a. However, in other embodiments, at least one grip portion may be formed to be substantially parallel to the screen of the Lcd 51. Orientation. Thereby, the user can easily hold the input device 2 and the terminal device 7 by holding the grip portion ' toward the self. Further, in the above-described embodiment, the support portion 205 is provided in the connecting member 2〇6 that connects the first grip portion 200a and the second grip portion 200b. That is, 323330 56 201220109 Since the support portion 205 is provided between the two grip portions 2a and 200b, the terminal device γ connected to the input device 200 is disposed in the two grip portions 2a and 2 〇〇b between. At this time, the center of gravity of the operating device (operating system) constituted by the terminal device 7 and the input device 2 is located between the two grip portions 200a and 200b, so that the user can hold the two grip portions 2 by 〇〇& 

200b並握持,而輕鬆地握持操作裝置。上述實施形態中, 方的握把部(第1握把部2〇〇a),係設置在成為輸入裝置 200上所裝著之終端裝置7的畫面前側之位置上,另一方 的ί至把。卩(第2握把部20Gb)設置在該晝面後側之位置上。 因’使用者使—方的手位於晝面前方,使另-方的手位 ;旦,後方’並以持搶的握法握持2個握把部,可容易地 握^作裝置°因此’例如在將上述操作裝置用作為搶來 進行,戲操作之射擊賴㈣,上述操作裝置制適合。 此外,輸入震置2〇〇係具備第1鍵2〇卜第2鍵2〇2、 ^ 2〇3、及搖桿204作為操作部。各鍵201至203分 二使用者所壓下之鍵(按鍵)。搖桿204為可指示方向 之裝置。上述猫 .lV ,p ^ 锦作部,較佳是位於使用者握持握把部時能 2^1之手的手指所操作之位置上。本實施形態+,第 1鍵201及第9 ^ 部200a之手鏠2〇2及搖桿204,係設置在握持第1握把 係設置在握持^指所能夠操作之位置上。又’第3鍵203 置。 弟2握把部200b之手的食指所能操作之位 輸入果晋 裝置2〇〇 2〇〇可具備攝像裝置(攝像部)。例如,輸入 ^ β /、備與上述控制器5所具備之攝像資訊運算部 323330 57 201220109 35為同樣構成者。此時,攝像資訊運算部的攝像元件,可 設置為將輸入裝置200的前方(終端裝置7的畫面後方)予 以攝像之朝向。例如,係可藉由紅外線濾波器來取代第3 鍵203而配置在第3鍵203的位置上,並於該内侧配置攝 像元件。根據此,使用者藉由使輸入裝置200的前方朝電 視2(標示裝置6)來使用,可使遊戲裝置3算出輸入裝置 200的朝向或位置。因此,使用者可進行使輸入裝置200 朝期望的方向來操作,而能夠使用輸入裝置200進行直覺 ® 且容易的操作。此外,輸入裝置200亦可構成為具備與攝 影機56相同之攝影機來取代攝像資訊運算部。此時,攝影 機與上述攝像元件相同,可設置為將輸入裝置200的前方 予以攝像之朝向。根據此,使用者藉由使輸入裝置200的 前方朝電視2(標示裝置6)來使用,可在與終端裝置7的攝 影機56為相反朝向之攝像方向將圖像予以攝像。 此外,輸入裝置200係具備未圖示之連接器,連接器, φ 當終端裝置7裝著於輸入裝置200時與終端裝置7的擴充 連接器58連接。藉此,在輸入裝置200與終端裝置7之間 可進行資料的接收傳送。例如,可將顯示出對輸入裝置200 所進行之操作之資料,或顯示出上述攝像裝置的攝像結果 之資料傳送至終端裝置7。此時,終端裝置7亦能夠以無 線方式將顯示出對終端裝置7所進行之操作之資料、以及 從輸入裝置所傳送來之資料傳送至遊戲裝置3。此外,輸 入裝置200,可具備當終端裝置7裝著於輸入裝置200時 與終端裝置7的充電端子66連接之充電端子。根據此,當 58 323330 201220109 終端裝置7裝著於輸入裝置2GG _,可將電力從一方 置供給至另一方的裝置。例如,可將輸入裝置謂連胁 充電器,使終端裝置7經由輸入震置從充電器取得電 力來進行充電。 輸入裝置200 ’例如亦可構成如下。第17圖係顯示輸 入裝置的其他例之圖。此外,第18圖及第19圖係顯示將 弟Π圖所示之輸入裝置220裝著於終端裝置7之模樣之 圖。第18圖為從終财置7的背面侧觀看終端裝置7及輸 入裝置220之圖’第19圖為從終端裝置7的表面側觀看終 端,置7及輸入裝置220之圖。終端裝置7中,例如亦可 震著第17圖所不之輸人裝置22Q。以下說明輸人|置22〇。 第Π圖至第20圖中’關於對應於第15圖及第16圖所示 輸入裝置2GG的構成要素之構成要素,係附加與第圖 及第16,相同之參照圖號,並省略該詳細說明。 如第Π圖所示,輸入裝置22〇,與輸入裝置2〇〇相同, _係具備第1握把部200a及第2握把部薦。因此,使用 者可僅握持各握把部2〇〇a及2_中的一方來使用輸入裝 置220(及終端褒置7),或是握持兩方來使用輸入裝置⑽。 。立此外,輸入裝置220係具備與輸入裝置2〇〇相同之支 > P 205。支撐部205與輸入裝置2〇〇的支撐部相同,具 有=個爪。卩(第圖中僅顯示3個爪部至2的c)。各 爪°卩中,上側的2個爪部205a及205b可分別卡止於終端 裝置7的卡止孔59a及59b。剩下之下側的2個爪部可分 卡止於終^襄置7的卡止孔及祖。圖中未顯示的 323330 59 201220109 爪部,在左右方向上(裝著於支撐部205之終端裝置7的左 右方向上),被設置在與爪部205c對稱之位置。 如第18圖及第19圖所示,當將輸入裝置220連接於 終端裝置7時,藉由使4個爪部分別卡止於上述卡止孔 50a、50b、59a及59b,而將終端裝置7與輸入裝置220固 定。藉此更能夠將輸入裝置220堅固地固定在終端裝置7。 此外,其他實施形態中,除了爪部與卡止孔的卡止之外(或 是取代卡止的方式),亦可藉由將輸入裝置220與終端裝置 ® 7螺合固定等,而將輸入裝置220更堅固地固定在終端裝 置7。例如可在卡止孔50a及50b的内部設置螺絲孔,並 將上述下側的2個爪部螺合固定在卡止孔50a及50b。此 外,螺合固定的位置可為任意位置。 如上所述,關於輸入裝置220亦與輸入裝置200相同, 能夠緊緊地固定在終端裝置7。 此外,關於輸入裝置220亦與輸入裝置200相同,支 φ 撐部205,以當第1握把部200a(或第2握把部200b)朝垂 直方向時使LCD 51的晝面呈大致垂直朝向之方式,可裝卸 地支撐終端裝置7。各握把部200a及200b,係形成為與連 接於輸入裝置220之終端裝置7的顯示部(外罩50的表面) 大致平行。因此,使用者藉由大致垂直地握持各握把部200a 及200b(的至少任一個),可使顯示部的晝面朝向自己,因 此可一邊觀看顯示部的晝面一邊使用輸入裝置200進行操 作。此外,關於輸入裝置220,與輸入裝置220相同,支 撐部205亦在較握把部更上方來支撐終端裝置7,因此, 60 323330 201220109 對於握持握把部之使用者而言,為容易觀看晝面之配置。 其他實施形態中’可使至少1個握把部形成為與LCD 51的 晝面大致平行之朝向。 輸入裝置220中,連接部的形狀係與輸入裝置200不 同。第17圖所示之連接部2〇9,係連接於第1握把部2〇〇a 的上侧與下側2處’並且連接於第2握把部200b的上侧(上 端)。此外,關於輸入裝置22〇亦與輸入裝置2〇〇相同,連 φ接部209較第2握把部200b更往前方突出地形成。關於輸 入裝置220亦與輸入裝置2〇〇相同,支撐部2〇5係設置在 連接第1握把部20〇a及第2握把部200b之連接構件2〇9。 因此’使用者可藉由握住2個握把部200a及200b並握持, 而輕鬆地握持操作裝置。 ^此外,連接部209係具有從與支撐部2〇5的連接部分 在下方延伸之構件。該構件當連接於支樓部2〇5之終端裝 置7之LCD 51的晝面呈大致垂直朝向時,為在大致垂直方 #向上延伸之朝向。亦即,上述構件係成為與各握把部驗 及200b大致並行之朝向。因此,使用者在將上述構件作為 握把部來握持時’藉由將上述構件大致垂直地握持,可- 邊觀看LCD 51的晝面一邊使用輪入襄置進行操作。此 外,由於上述構件配置在支樓部2〇5的下方,因此,藉由 握持上述構件,可成為對使用者而言為容易觀看晝面之配 置。 — 關於輸入裝置220亦與輸A裝置2〇〇相同,一方的握 把部(第1握把部200a),設置在成為輸入裝置22〇上所裝 323330 61 201220109 著之終端裝置7的晝面前側之位置,另一方的握把部(第2 握把部200b)設置在該畫面後側之位置。因此,與輸入裝 置200相同,在容易以持搶的握法握持2個握把部,並將 操作裝置用作為槍來進行遊戲操作之射擊遊戲等中,輸入 裝置220特別適合。 此外,輸入裝置220作為操作部除了具備與輸入裝置 200相同之第2鍵202及搖桿204之外,更具備第4鍵207。 第2鍵202及搖桿204,與輸入裝置200相同,係設置在 ® 第1握把部200a的上側。第4鍵207為可由使用者所壓下 之鍵(按鍵)。第4鍵207設置在第2握把部200b的上側。 亦即,第4鍵207係設置在握持第2握把部200b之手的食 指所能夠操作之位置上。 輸入裝置220係具備攝像元件(攝像裝置)。在此,輸 入裝置220係具備與上述控制器5所具備之攝像資訊運算 部35為同樣構成者。此時,攝像資訊運算部的攝像元件, φ 
可設置為將輸入裝置220的前方(終端裝置7的晝面後方) 予以攝像之朝向。具體而言,於輸入裝置220的前方(連接 部206的前端部)設置窗部(紅外線濾波器)208,攝像元件 設置在窗部208的内側,並設置為從窗部208將前方予以 攝像之朝向。根據上述内容,使用者藉由使輸入裝置220 的前方朝電視2(標示裝置6)來使用,可使遊戲裝置3算出 輸入裝置220的朝向或位置。因此,使用者可進行使輸入 裝置220朝期望的方向來操作,而能夠使用輸入裝置220 直覺且容易地進行操作。 62 323330 201220109 β I+輪入裝置220亦可構成為具備與攝影機56相同 於入裝取代攝像資訊運算部。根據此,使用者藉由使 的前方朝電視2(標线置6)來額,可在與 、的攝影機56為相反朝向之攝像方向將圖像予以 攝像。 、查姑:入裝置220與輸入裝置200相同,係具備未圖示之 欢山器連接器係當終端裂i 7裝著於輸入裝i 22〇時與 % 、端裝置7的擴充連接器58連接。藉此,在輸入裝置220 與終端裝置7之間可進行資料的接收傳送。因此,可將顯 示出對輸入敦置220所進行之操作之資料,以及顯示出上 述攝像裝置的攝像結果之資料,經由終端裝置7傳送至遊 戲裝置3。此外,其他實施形態中,亦可構成為輸入裝置 220與遊戲裝置3直接進行通訊。亦即,顯示對輸入裝置 220所進行之操作之資料,例如和控制器5與遊戲裝置3 之間的無線通訊相同,採用Bluetooth(註冊商標)技術等, • 從輸入裝置220直接傳送至遊戲裝置3。此時,顯示出對 終端裝置7所進行之操作之資料,係從終端裝置7傳送至 遊戲裝置3。此外,輸入裝置220與輸入裝置200相同, 可具備當終端裝置7裝著於輸入裝置220時與終端裝置7 的充電端子66連接之充電端子。 此外’其他實施形態中,亦可提供終端裝置7與輸入 裴置200(或輸入裝置220)呈一體之操作裝置。此時,不需 具備終端裝置7中的各卡止孔50a、50b、59a及59b,和 輪入裝置200中的爪部等之用以可裝卸地連接終端裝置γ 323330 63 201220109 與輸入裝置200之機構。 第20圖係顯示將附加裝置裝著於終端裝置7之其他一 例之圖。第20圖中,終端裝置7係連接(裝著)於作為附加 裝置的一例之支架210。支架210為用以將終端裝置7豎 立為預定角度而載置(支撐)之支撐裝置。支架210係具 備:支撐構件21卜充電端子212、導引構件213a及213b。 本實施形態中,支架210亦具有充電器之功能,具有 充電端子212。充電端子212為能夠與終端裝置7的充電 ® 端子66連接之端子。本實施形態中,各充電端子66及212 為金屬端子,惟亦可是一方具有可連接於另一方之形狀的 連接器。當終端裝置7連接於支架210時,支架210的充 電端子212與終端裝置7的充電端子66接觸,可從支架 210將電力供給至終端裝置7來進行充電。 支撐構件211,係用於以預定角度來支撐終端裝置7 的背面者。支撐構件211當終端裝置7的端子(充電端子 φ 66)與支架210的端子(充電端子212)連接時,係支撐外罩 50的預定面(在此為背面)。如第20圖所示,支撐構件211 具備壁部211a與槽部211b。支撐構件211係藉由壁部 211a,以使外罩50的背面沿著預定支撐面(在此是由壁部 211a所形成之面)被載置之方式來支撐外罩50。此外,槽 部211b係當終端裝置7與支架210連接時為外罩50的一 部分(下側部分)所***之部分。因此,槽部211b係形成為 大致配合於外罩50的上述一部分形狀。槽部211b,在與 上述支撐面平行之方向上延伸。 64 323330 201220109 此外’導引構件213a及213b,為可***於終端裝置7 的第2卡止孔5〇a及50b,並將終端裝置7連接於支架210 之位置予以定位之構件。各導引構件213a及213b,係設 置在對應於終端裝置7的卡止孔5〇a及5〇b之位置上。亦 即,各導引構件213a及213b傣設置在當終端裝置7與支 架210正確地連接時被***於卡止孔50a及50b之位置 上。所謂終端裝置7與支架210正確地連接時,是指支架 210的充電端子212與終端裝置7的充電端子66連接之情 形。此外,導引構件213a及213b,其一部分從槽部211b 的底面突出地設置。亦即’導引構件213a及213b,其一 部分係從支撐構件211的表面朝上方突出地設置。當終端 裝置7連接於支架210時,係成為導引構件213a及213b 的一部分分別***於卡止孔50a及50b之狀態。 本實施形態中,各導引構件213a及213b分別為可旋 轉之車輪構件(滾輪部)。各導引構件213a及213b可在預 φ 定方向上旋轉。在此,所謂預定方向,為(水平方向且)與 上述支撑面平行之方向,換言之,為終端裝置7連接於支 架210時之終端裝置7的左右方向。導引構件,只要是可 在預定方向上旋轉之旋轉構件即可。例如,其他實施形態 中’導引構件可為藉由球狀的凹部能夠旋轉地支撐之球 體。此外,本實施形態中’導引構件的數目為2個,但可 因應設置在終端裝置7的下面之卡止孔的數目來設置該數 目的導引構件’支架210亦可具備1個或3個以上的導引 構件。 323330 '65 201220109 當終端裝置7連接於支架210時,藉由使終端裝置7 的背面抵接於支撐構件211 ’能夠以預定角度將終端裝置7 載置於支架210上。亦即,外罩50下側的一部分***於槽 部211b,使壁部211a支撐外罩50的背面,藉此能夠以預 定角度將終端裝置7載置於支架210上。因此,本實施形 態中,在垂直於上述預定方向之方向上,可藉由支樓構件 211將終端裝置7的位置定位在正確的位置。 在此,當終端裝置7連接於支架210時,若終端裝置 ® 7與支架210未處於正確的位置關係時,係藉由各導引構 件213a及213b來修正終端裝置7的位置而連接。亦即, 當在上述預定方向上卡止孔50a及50b從導引構件213a及 213b偏離時,各導引構件213a及213b接觸於卡止孔5〇a 及50b周邊的外罩50。因應於此,因導引構件213a及213b 的旋轉而使終端裝置7往預定方向滑動移動。本實施形態 中’由於2個導引構件213a及213b在預定方向上排列設 φ 置’所以可使終端裝置7的下面僅接觸於導引構件213a及 213b,而能夠順暢地移動終端裝置7。此外,若在卡止孔 50a及50b的周圍設置傾斜(凹入傾斜),則更能夠順暢地 移動終端裝置7。如上所述,由於終端裝置7的滑動移動 的結果,而成為使導引構件213a及213b的各一部分*** 於卡止孔50a及50b之狀態。藉此,使支架21〇的充電端 子212與終端裝置7的充電端子66接觸,而確實地進行充 電。 如上所述’使用者即使未將終端裝置7載置於正確位 66 323330 201220109 置,亦可容易將終端裝置7連接於支架210。根據本實施 形態,終端裝置7相對於支架210之定位,可藉由終端裝 置7的卡止孔與支架210的導引構件之簡易構成來進行, 因此可將支架210形成為小型簡易的構成。本實施形態 中,終端裝置7雖然為相對較大型的可搬運型裝置,但即 使是如此之大型可搬運型裝置,支架210本身亦可形成如 第20圖所示之小型構成。此外,由於支架210可連接各種 形狀或大小的終端裝置,因此可提供泛用性高之支撐裝置。 此外,本實施形態中,卡止孔50a及50b係用作為用 以卡止附加裝置的爪部之孔,並且可用作為讓導引構件插 入之對象。因此,可減少終端裝置7的外罩50上所設置之 孔的數目,而可簡化外罩50的形狀。 上述實施形態中,成為讓支架210的導引構件***之 對象的孔,為設置在外罩50的下側側面之孔(卡止孔50a 及50b),但孔的位置可為任意位置。例如,可在外罩50 的其他側面設置孔,或在外罩50的表面或背面設置孔。由 於導引部必須設置在因應孔的位置之位置,所以當在外罩 50的表面或背面設置孔時,支架210的導引部,例如可設 置在壁部211a的位置上。此外,可在外罩50的複數個面 設置孔,此時,能夠以種種朝向將終端裝置7載置於支架 210 上。 [5.遊戲處理] 接著詳細說明在本遊戲系統中所執行之遊戲處理的詳 細内容。首先說明遊戲處理中所使用之各種資料。第21圖 67 323330 201220109 係顯示遊戲處理中所使用之各種資料之圖。第21圖中為顯 
Fig. 21 shows the main data stored in the main memory (the external main memory 12 or the internal main memory 11e) of the game device 3. As shown in Fig. 21, the main memory of the game device 3 stores a game program 90, received data 91 and processing data 106. In addition to the data shown in Fig. 21, the main memory also stores other data needed for the game, such as image data of the various objects appearing in the game and sound data used in the game.
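For the reader's convenience, the grouping of data described here can be sketched as nested C structures. This is only an illustration; the names follow the reference numerals used in the description, while the layout, nesting and use of pointers are assumptions rather than the actual memory map of the game device 3.

/* Illustrative sketch of the main-memory contents described with Fig. 21.
 * Layout and nesting are assumptions; numerals follow the description. */
typedef struct controller_operation_data  controller_operation_data_t;  /* data 92  */
typedef struct terminal_operation_data    terminal_operation_data_t;    /* data 97  */
typedef struct camera_image_data          camera_image_data_t;          /* data 104 */
typedef struct microphone_sound_data      microphone_sound_data_t;      /* data 105 */

typedef struct {
    controller_operation_data_t *controller_operation;  /* one set per connected controller 5       */
    terminal_operation_data_t   *terminal_operation;    /* one set per connected terminal device 7  */
    camera_image_data_t         *camera_image;
    microphone_sound_data_t     *microphone_sound;
} received_data_t;                                      /* received data 91 */

typedef struct {
    const void     *game_program;     /* game program 90 (read from the optical disc 4 or elsewhere) */
    received_data_t received;         /* received data 91                                            */
    void           *processing_data;  /* processing data 106                                         */
} main_memory_contents_t;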
For example, data showing the operation performed on the input device 200 or data showing the imaging result of the imaging device can be transmitted to the terminal device 7. At this time, the terminal device 7 can also transmit the data showing the operation performed on the terminal device 7 and the data transmitted from the input device to the game device 3 in a wireless manner. Further, the input device 200 may include a charging terminal that is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 200. According to this, when the terminal device 7 is attached to the input device 2GG_, the power can be supplied from one side to the other device. For example, the input device can be connected to the charger, and the terminal device 7 can be charged by taking power from the charger via the input. The input device 200' may be configured, for example, as follows. Fig. 17 is a view showing another example of the input device. Further, Fig. 18 and Fig. 19 are views showing the appearance of the input device 220 shown in the drawing of the terminal device 7. Fig. 18 is a view of the terminal device 7 and the input device 220 viewed from the back side of the terminal device 7. Fig. 19 is a view showing the terminal 7 and the input device 220 as viewed from the front side of the terminal device 7. In the terminal device 7, for example, the input device 22Q of Fig. 17 may be struck. The following description of the input | set 22 〇. In the drawings, the components of the components corresponding to the input device 2GG shown in Figs. 15 and 16 are attached with the same reference numerals as in the drawings and the drawings, and the detailed description is omitted. Description. As shown in the figure, the input device 22A is the same as the input device 2A, and includes a first grip portion 200a and a second grip portion. Therefore, the user can use the input device 220 (and the terminal device 7) by holding only one of the grip portions 2a and 2_, or use the input device (10) while holding both. . Further, the input device 220 is provided with the same branch > P 205 as the input device 2A. The support portion 205 is the same as the support portion of the input device 2'', and has = claws.卩 (only 3 claws to 2 c are shown in the figure). In the respective claws, the upper two claw portions 205a and 205b are respectively locked to the locking holes 59a and 59b of the terminal device 7. The two claws on the lower side can be separated into the locking holes and the ancestors of the final set 7. 323330 59 201220109 The claw portion is not disposed in the left-right direction (in the left-right direction of the terminal device 7 attached to the support portion 205), and is disposed at a position symmetrical with the claw portion 205c. As shown in FIGS. 18 and 19, when the input device 220 is connected to the terminal device 7, the terminal devices are locked by locking the four claw portions to the locking holes 50a, 50b, 59a, and 59b, respectively. 7 is fixed to the input device 220. Thereby, the input device 220 can be firmly fixed to the terminal device 7. In addition, in other embodiments, in addition to the locking of the claw portion and the locking hole (or instead of locking), the input device 220 and the terminal device 7 may be screwed and fixed, and the input may be input. The device 220 is more firmly fixed to the terminal device 7. 
For example, screw holes may be provided inside the locking holes 50a and 50b, and the two lower claw portions may be screwed and fixed to the locking holes 50a and 50b. In addition, the position where the screw is fixed can be any position. As described above, the input device 220 is also similar to the input device 200, and can be tightly fixed to the terminal device 7. Further, the input device 220 is also the same as the input device 200, and supports the φ struts 205 such that the face of the LCD 51 is substantially perpendicular when the first grip portion 200a (or the second grip portion 200b) is oriented in the vertical direction. In this manner, the terminal device 7 is detachably supported. Each of the grip portions 200a and 200b is formed substantially in parallel with the display portion (the surface of the outer cover 50) of the terminal device 7 connected to the input device 220. Therefore, by holding the at least one of the grip portions 200a and 200b substantially vertically, the user can face the face of the display portion toward the user, so that the input device 200 can be used while viewing the face of the display portion. operating. Further, with respect to the input device 220, like the input device 220, the support portion 205 also supports the terminal device 7 above the grip portion, and therefore, 60 323330 201220109 is easy for the user holding the grip portion for easy viewing. The configuration of the face. In other embodiments, at least one of the grip portions may be formed to be substantially parallel to the face of the LCD 51. In the input device 220, the shape of the connecting portion is different from that of the input device 200. The connecting portion 2〇9 shown in Fig. 17 is connected to the upper side and the lower side 2' of the first grip portion 2A, and is connected to the upper side (upper end) of the second grip portion 200b. Further, the input device 22A is also formed in the same manner as the input device 2A, and the φ connecting portion 209 is formed to protrude further forward than the second grip portion 200b. The input device 220 is also the same as the input device 2A, and the support portion 2〇5 is provided in the connection member 2〇9 that connects the first grip portion 20〇a and the second grip portion 200b. Therefore, the user can easily hold the operating device by holding and holding the two grip portions 200a and 200b. Further, the connecting portion 209 has a member extending downward from the connecting portion with the supporting portion 2〇5. When the face of the LCD 51 of the terminal device 7 connected to the branch unit 2〇5 is substantially vertically oriented, the member is oriented in a substantially vertical direction. That is, the above-described members are oriented substantially in parallel with the respective grip portions 200b. Therefore, when the user holds the member as the grip portion, by holding the member substantially vertically, it is possible to operate using the wheel-engaging device while viewing the kneading surface of the LCD 51. Further, since the member is disposed below the branch portion 2〇5, by holding the member, it is possible to easily view the face of the user. - The input device 220 is also the same as the A-transfer device 2A, and one of the grip portions (the first grip portion 200a) is placed in front of the terminal device 7 of the 323330 61 201220109 which is mounted on the input device 22A. 
At the side position, the other grip portion (second grip portion 200b) is placed at the rear side of the screen. Therefore, similarly to the input device 200, the input device 220 is particularly suitable for a shooting game in which the grip portion is easily gripped by the grip and the operation device is used as a gun to perform a game operation. Further, the input device 220 as the operation unit further includes a fourth key 207 in addition to the second key 202 and the rocker 204 which are the same as the input device 200. The second key 202 and the rocker 204 are provided on the upper side of the first grip portion 200a, similarly to the input device 200. The fourth key 207 is a key (button) that can be pressed by the user. The fourth key 207 is provided on the upper side of the second grip portion 200b. That is, the fourth key 207 is provided at a position where the index finger of the hand holding the second grip portion 200b can be operated. The input device 220 is provided with an imaging element (imaging device). Here, the input device 220 is configured similarly to the imaging information computing unit 35 included in the controller 5. At this time, the imaging element φ of the imaging information computing unit can be set to face the front side of the input device 220 (the rear side of the terminal device 7). Specifically, a window portion (infrared filter) 208 is provided in front of the input device 220 (the front end portion of the connection portion 206), and an imaging element is provided inside the window portion 208, and is provided to image the front side from the window portion 208. Orientation. According to the above, the user can use the front of the input device 220 toward the television 2 (the pointing device 6), so that the game device 3 can calculate the orientation or position of the input device 220. Therefore, the user can operate the input device 220 in a desired direction, and can intuitively and easily operate using the input device 220. 62 323330 201220109 The β I+ wheel-in device 220 may be configured to be provided with the same as the camera 56 instead of the camera information computing unit. According to this, the user can image the image in the imaging direction opposite to the camera 56 with the front side facing the television 2 (the reticle 6). The check device 220 is the same as the input device 200, and is provided with a connector (not shown). When the terminal is mounted on the input device 22, the expansion connector 58 of the terminal device 7 is installed. connection. Thereby, data can be received and transmitted between the input device 220 and the terminal device 7. Therefore, the information showing the operation performed on the input device 220 and the data showing the imaging result of the above-described imaging device can be transmitted to the game device 3 via the terminal device 7. Further, in other embodiments, the input device 220 may be configured to directly communicate with the game device 3. That is, the information showing the operation performed on the input device 220 is, for example, the same as the wireless communication between the controller 5 and the game device 3, using Bluetooth (registered trademark) technology or the like, • directly transmitted from the input device 220 to the game device. 3. At this time, the information showing the operation performed on the terminal device 7 is transmitted from the terminal device 7 to the game device 3. 
Further, the input device 220 may be the same as the input device 200, and may include a charging terminal that is connected to the charging terminal 66 of the terminal device 7 when the terminal device 7 is mounted on the input device 220. Further, in other embodiments, an operation device in which the terminal device 7 is integrated with the input device 200 (or the input device 220) may be provided. At this time, it is not necessary to provide the respective locking holes 50a, 50b, 59a, and 59b in the terminal device 7, and the claws in the wheel-in device 200 to detachably connect the terminal device γ 323330 63 201220109 and the input device 200. The institution. Fig. 20 is a view showing another example in which the attachment device is attached to the terminal device 7. In Fig. 20, the terminal device 7 is attached (mounted) to the holder 210 as an example of an attachment device. The bracket 210 is a supporting device for placing (supporting) the terminal device 7 at a predetermined angle. The bracket 210 is provided with a support member 21, a charging terminal 212, and guiding members 213a and 213b. In the present embodiment, the holder 210 also has a function as a charger and has a charging terminal 212. The charging terminal 212 is a terminal that can be connected to the charging ® terminal 66 of the terminal device 7. In the present embodiment, each of the charging terminals 66 and 212 is a metal terminal, but one of the connectors may be connected to the other. When the terminal device 7 is connected to the holder 210, the charging terminal 212 of the holder 210 comes into contact with the charging terminal 66 of the terminal device 7, and electric power can be supplied from the holder 210 to the terminal device 7 for charging. The support member 211 is for supporting the back side of the terminal device 7 at a predetermined angle. The support member 211 supports the predetermined surface (here, the back surface) of the outer cover 50 when the terminal (charging terminal φ 66) of the terminal device 7 is connected to the terminal (charging terminal 212) of the holder 210. As shown in Fig. 20, the support member 211 is provided with a wall portion 211a and a groove portion 211b. The support member 211 supports the outer cover 50 by the wall portion 211a such that the rear surface of the outer cover 50 is placed along a predetermined support surface (here, the surface formed by the wall portion 211a). Further, the groove portion 211b is a portion into which a portion (lower side portion) of the outer cover 50 is inserted when the terminal device 7 is connected to the bracket 210. Therefore, the groove portion 211b is formed to substantially fit the above-described partial shape of the outer cover 50. The groove portion 211b extends in a direction parallel to the support surface. 64 323330 201220109 Further, the 'guide members 213a and 213b are members that can be inserted into the second locking holes 5a and 50b of the terminal device 7, and the terminal device 7 is connected to the position of the holder 210. Each of the guide members 213a and 213b is provided at a position corresponding to the locking holes 5a and 5b of the terminal device 7. That is, the respective guide members 213a and 213b are provided at positions where the terminal device 7 and the holder 210 are correctly connected, and are inserted into the locking holes 50a and 50b. 
When the terminal device 7 is properly connected to the bracket 210, it means that the charging terminal 212 of the bracket 210 is connected to the charging terminal 66 of the terminal device 7. Further, a part of the guiding members 213a and 213b is provided to protrude from the bottom surface of the groove portion 211b. That is, the guide members 213a and 213b are partially provided to protrude upward from the surface of the support member 211. When the terminal device 7 is connected to the holder 210, a part of the guiding members 213a and 213b is inserted into the locking holes 50a and 50b, respectively. In the present embodiment, each of the guide members 213a and 213b is a rotatable wheel member (roller portion). Each of the guiding members 213a and 213b is rotatable in a predetermined direction. Here, the predetermined direction is a direction parallel to the support surface (in the horizontal direction), in other words, the left-right direction of the terminal device 7 when the terminal device 7 is connected to the support 210. The guiding member may be any rotating member that can rotate in a predetermined direction. For example, in other embodiments, the 'guide member may be a sphere that is rotatably supported by a spherical recess. Further, in the present embodiment, the number of the guide members is two, but the number of the guide members can be provided in accordance with the number of the locking holes provided in the lower surface of the terminal device 7. The bracket 210 can also have one or three. More than one guiding member. 323330 '65 201220109 When the terminal device 7 is connected to the cradle 210, the terminal device 7 can be placed on the cradle 210 at a predetermined angle by abutting the back surface of the terminal device 7 against the support member 211'. That is, a part of the lower side of the outer cover 50 is inserted into the groove portion 211b, and the wall portion 211a supports the back surface of the outer cover 50, whereby the terminal device 7 can be placed on the holder 210 at a predetermined angle. Therefore, in the present embodiment, the position of the terminal device 7 can be positioned at the correct position by the branch member 211 in the direction perpendicular to the predetermined direction. Here, when the terminal device 7 is connected to the cradle 210, if the terminal device │ 7 and the cradle 210 are not in the correct positional relationship, the positions of the terminal device 7 are corrected by the respective guiding members 213a and 213b. That is, when the locking holes 50a and 50b are deviated from the guiding members 213a and 213b in the predetermined direction, the guiding members 213a and 213b contact the outer cover 50 around the locking holes 5a and 50b. In response to this, the terminal device 7 is slidably moved in a predetermined direction by the rotation of the guiding members 213a and 213b. In the present embodiment, since the two guide members 213a and 213b are arranged in the predetermined direction, the lower surface of the terminal device 7 can be brought into contact with only the guide members 213a and 213b, and the terminal device 7 can be smoothly moved. Further, when the inclination (concave inclination) is provided around the locking holes 50a and 50b, the terminal device 7 can be moved more smoothly. As described above, each of the guide members 213a and 213b is inserted into the locking holes 50a and 50b as a result of the sliding movement of the terminal device 7. 
Thereby, the charging terminal 212 of the holder 21 is brought into contact with the charging terminal 66 of the terminal device 7, and charging is surely performed. As described above, the user can easily connect the terminal device 7 to the cradle 210 even if the terminal device 7 is not placed in the correct position 66 323330 201220109. According to the present embodiment, the positioning of the terminal device 7 with respect to the holder 210 can be performed by the simple configuration of the locking hole of the terminal device 7 and the guiding member of the holder 210. Therefore, the holder 210 can be formed in a compact and simple configuration. In the present embodiment, the terminal device 7 is a relatively large type of transportable device. However, even if it is such a large transportable device, the holder 210 itself can have a small configuration as shown in Fig. 20. Further, since the bracket 210 can be connected to terminal devices of various shapes or sizes, it is possible to provide a support device having high versatility. Further, in the present embodiment, the locking holes 50a and 50b are used as holes for locking the claw portions of the attachment means, and can be used as objects for inserting the guide members. Therefore, the number of holes provided in the outer cover 50 of the terminal device 7 can be reduced, and the shape of the outer cover 50 can be simplified. In the above-described embodiment, the hole into which the guide member of the holder 210 is inserted is a hole (the locking holes 50a and 50b) provided on the lower side surface of the outer cover 50, but the position of the hole may be any position. For example, a hole may be provided on the other side of the outer cover 50, or a hole may be provided on the front or back surface of the outer cover 50. Since the guide portion must be disposed at the position corresponding to the position of the hole, when the hole is provided on the front surface or the back surface of the outer cover 50, the guide portion of the bracket 210 can be disposed, for example, at the position of the wall portion 211a. Further, holes may be provided in a plurality of faces of the outer cover 50. At this time, the terminal device 7 can be placed on the holder 210 in various directions. [5. Game Processing] Next, the details of the game processing executed in the game system will be described in detail. First, various materials used in game processing will be described. Figure 21 323330 201220109 A diagram showing the various materials used in game processing. Fig. 21 is a view showing main data stored in the main memory (external main memory ι2 or internal main memory lie) of the game device 3. As shown in Fig. 21, the game memory 9 is stored in the main memory of the game device 3, and the received data 91 and the processing data 106 are stored. In addition to the information shown in Fig. 21, the information required for the game such as the image data of various objects appearing in the game or the sound data used in the game is stored in the main memory.

The game program 90 is read, in whole or in part, from the optical disc 4 at an appropriate time after the power of the game device 3 is turned on, and is stored in the main memory. The game program 90 may instead be obtained from the flash memory or from an external device of the game device 3 (for example, via the Internet) rather than from the optical disc 4. A part of the game program 90 (for example, a program for calculating the posture of the controller 5 and/or the terminal device 7) may be stored in the game device 3 in advance.

The received data 91 are the various data received from the controller 5 and the terminal device 7. The received data 91 include the controller operation data 92, the terminal operation data 97, the camera image data 104, and the microphone sound data 105. When a plurality of controllers 5 are connected, there are a plurality of sets of controller operation data 92; when a plurality of terminal devices 7 are connected, there are likewise a plurality of sets of terminal operation data 97, camera image data 104, and microphone sound data 105.

The controller operation data 92 represent the operations performed by the user (player) on the controller 5. The controller operation data 92 are transmitted from the controller 5, acquired by the game device 3, and stored in the main memory. The controller operation data 92 include first operation button data 93, first acceleration data 94, first angular velocity data 95, and marker coordinate data 96. A predetermined number of sets of controller operation data may be stored in the main memory in order from the latest (most recently acquired).

The first operation button data 93 indicate the input states of the operation buttons 32a to 32i provided on the controller 5; specifically, they indicate whether each of the operation buttons 32a to 32i is pressed.

The first acceleration data 94 represent the acceleration (acceleration vector) detected by the acceleration sensor 37 of the controller 5. Here, the first acceleration data 94 represent a three-dimensional acceleration whose components are the accelerations in the three axial directions XYZ shown in Fig. 3, but in other embodiments they may represent acceleration in any one or more directions.
The first angular velocity data 95 represent the angular velocity detected by the gyro sensor 48 of the controller 5. Here, the first angular velocity data 95 represent the angular velocities about the three axes XYZ shown in Fig. 3, but in other embodiments they may represent the angular velocity about any one or more axes.

The marker coordinate data 96 represent the coordinates calculated by the image processing circuit 41 of the imaging information calculation section 35, that is, the marker coordinates described above. The marker coordinates are expressed in a two-dimensional coordinate system for representing positions on a plane corresponding to the captured image, and the marker coordinate data 96 represent coordinate values in that two-dimensional coordinate system.

The controller operation data 92 only need to represent the operations of the user operating the controller 5, and may include only some of the data 93 to 96 described above. When the controller 5 has other input means (for example, a touch panel or an analog stick), the controller operation data 92 may also include data representing operations performed on those other input means. When the motion of the controller 5 itself is used as a game operation, as in the present embodiment, the controller operation data 92 include data whose values change according to the motion of the controller 5 itself, such as the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96.
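Taken together, one sample of the controller operation data 92 described above is a small fixed-size record. The following C++ sketch is only an illustration of that layout; the type and field names are assumptions and not identifiers used in the embodiment.

```cpp
// Illustrative sketch only: names and types are assumptions, not the
// identifiers used in the embodiment.
#include <array>
#include <cstdint>

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

// One sample of "controller operation data 92" as described above.
struct ControllerOperationData {
    uint16_t buttons;                 // operation button data 93: one bit per button 32a-32i
    Vec3 acceleration;                // acceleration data 94: XYZ components (Fig. 3 axes)
    Vec3 angularVelocity;             // angular velocity data 95: rotation about the XYZ axes
    std::array<Vec2, 2> markerCoords; // marker coordinate data 96: markers 6R/6L in the captured image
    bool markersVisible;              // false when the markers are not imaged
};
```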

The terminal operation data 97 represent the operations performed by the user on the terminal device 7. The terminal operation data 97 are transmitted from the terminal device 7, acquired by the game device 3, and stored in the main memory. The terminal operation data 97 include second operation button data 98, stick data 99, touch position data 100, second acceleration data 101, second angular velocity data 102, and azimuth data 103. A predetermined number of sets of terminal operation data may be stored in the main memory in order from the latest (most recently acquired).

The second operation button data 98 indicate the input states of the operation buttons 54A to 54L provided on the terminal device 7; specifically, they indicate whether each of the operation buttons 54A to 54L is pressed.

The stick data 99 represent the direction and amount by which the stick portion of each analog stick 53 (the analog sticks 53A and 53B) is slid (or tilted). The direction and amount may be expressed, for example, as two-dimensional coordinates or a two-dimensional vector.

The touch position data 100 represent the position (touch position) at which an input is made on the input surface of the touch panel 52. In the present embodiment, the touch position data 100 represent coordinate values in a two-dimensional coordinate system for indicating positions on that input surface. When the touch panel 52 is of a multi-touch type, the touch position data 100 may represent a plurality of touch positions.

The second acceleration data 101 represent the acceleration (acceleration vector) detected by the acceleration sensor 73. In the present embodiment, the second acceleration data 101 represent a three-dimensional acceleration whose components are the accelerations in the three axial directions xyz shown in Fig. 8, but in other embodiments they may represent acceleration in any one or more directions.

The second angular velocity data 102 represent the angular velocity detected by the gyro sensor 74. In the present embodiment, the second angular velocity data 102 represent the angular velocities about the three axes xyz shown in Fig. 8, but in other embodiments they may represent the angular velocity about any one or more axes.

The azimuth data 103 represent the azimuth detected by the magnetic sensor 72. In the present embodiment, the azimuth data 103 indicate the direction of a predetermined azimuth (for example, north) with respect to the terminal device 7. In a place where a magnetic field other than geomagnetism is present, the azimuth data 103 do not, strictly speaking, indicate the absolute azimuth (north, etc.), but they do indicate the direction of the terminal device 7 relative to the direction of the magnetic field at that place, so that even in such a case a change in the posture of the terminal device 7 can be calculated.

The terminal operation data 97 only need to represent the operations of the user operating the terminal device 7, and may include only one of the data 98 to 103 described above. When the terminal device 7 has other input means (for example, a touch pad, or imaging means like that of the controller 5), the terminal operation data 97 may also include data representing operations performed on those other input means.
When the motion of the terminal device 7 itself is used as a game operation, as in the present embodiment, the terminal operation data 97 include data whose values change according to the motion of the terminal device 7 itself, such as the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103.

The camera image data 104 represent the image (camera image) captured by the camera 56 of the terminal device 7. The camera image data 104 are image data obtained by the codec LSI 27 decompressing the compressed image data coming from the terminal device 7, and are stored in the main memory by the input/output processor 11a. A predetermined number of sets of camera image data may be stored in the main memory in order from the latest (most recently acquired).

The microphone sound data 105 represent the sound (microphone sound) detected by the microphone 79 of the terminal device 7. The microphone sound data 105 are sound data obtained by the codec LSI 27 decompressing the compressed sound data transmitted from the terminal device 7, and are stored in the main memory by the input/output processor 11a.

The processing data 106 are data used in the game processing described later (Fig. 22). The processing data 106 include control data 107, controller posture data 108, terminal posture data 109, image recognition data 110, and sound recognition data 111. In addition to the data shown in Fig. 21, the processing data 106 also include various data used in the game processing, such as data representing the various parameters set for the objects appearing in the game.

The control data 107 represent control instructions for the components of the terminal device 7; for example, they indicate an instruction to control the lighting of the marker section 55 or an instruction to control image capture by the camera 56. The control data 107 are transmitted to the terminal device 7 at an appropriate time.

The controller posture data 108 represent the posture of the controller 5. In the present embodiment, the controller posture data 108 are calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96 included in the controller operation data 92; the calculation method is described in step S23.

The terminal posture data 109 represent the posture of the terminal device 7. In the present embodiment, the terminal posture data 109 are calculated from the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103 included in the terminal operation data 97; the calculation method is described in step S24.

The image recognition data 110 represent the result of predetermined image recognition processing performed on the camera image. This image recognition processing may be any processing that detects some feature from the camera image and outputs the result; for example, it may extract a predetermined subject (such as the user's face or a marker) from the camera image and calculate information about the extracted subject.

The sound recognition data 111 represent the result of predetermined sound recognition processing performed on the microphone sound.
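In the same spirit, the terminal operation data 97 described above, together with the rule that a predetermined number of sets are kept in order from the latest, can be pictured as a record plus a small history buffer. This is a hedged sketch; all names and the buffer size are assumptions rather than values from the embodiment.

```cpp
// Illustrative sketch only; names and the sample count are assumptions.
#include <array>
#include <cstdint>
#include <deque>

struct Vec3 { float x, y, z; };
struct Vec2 { float x, y; };

// One sample of "terminal operation data 97" as described above.
struct TerminalOperationData {
    uint32_t buttons;               // operation button data 98: buttons 54A-54L
    Vec2 leftStick, rightStick;     // stick data 99: analog sticks 53A/53B
    std::array<Vec2, 10> touches;   // touch position data 100 (multi-touch capable)
    int touchCount;
    Vec3 acceleration;              // acceleration data 101 (Fig. 8 xyz axes)
    Vec3 angularVelocity;           // angular velocity data 102
    Vec3 magneticField;             // azimuth data 103 from the magnetic sensor 72
};

// Keep only the newest kMaxSamples entries, newest first.
constexpr std::size_t kMaxSamples = 16;   // assumed value
void pushSample(std::deque<TerminalOperationData>& history,
                const TerminalOperationData& sample) {
    history.push_front(sample);
    if (history.size() > kMaxSamples) history.pop_back();
}
```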
This sound recognition processing may be any processing that detects some feature from the microphone sound and outputs the result; for example, it may detect the user's speech, or may simply output the sound volume.

Next, the details of the game processing performed in the game device 3 will be described with reference to Fig. 22. Fig. 22 is a main flowchart showing the flow of the game processing executed in the game device 3. When the power of the game device 3 is turned on, the CPU 10 of the game device 3 executes a boot program stored in a boot ROM (not shown), thereby initializing the main memory and the other units. The game program stored on the optical disc 4 is then read into the main memory, and the CPU 10 starts executing the game program. The game device 3 may be configured to execute the game program on the optical disc 4 immediately after power-on, or to first execute a built-in program that displays a predetermined menu screen and then execute the game program on the optical disc 4 when the user instructs the game to start. The flowchart of Fig. 22 shows the processing performed after the above processing is completed.

The processing of each step in the flowchart of Fig. 22 is merely an example, and the order of the steps may be changed as long as the same result is obtained. The values of variables and the thresholds used in the determination steps are also merely examples, and other values may be used as necessary. In the present embodiment the CPU 10 executes the processing of each step of the flowchart, but part of that processing may be executed by a processor other than the CPU 10 or by a dedicated circuit.

First, in step S1, the CPU 10 performs initial processing: for example, building the virtual game space, placing the objects appearing in the game at their initial positions, and setting the initial values of the various parameters used in the game processing.

In the present embodiment, the CPU 10 also controls, in the initial processing, the lighting of the marker device 6 and the marker section 55 according to the type of the game program. The game system 1 has two imaging targets for the imaging means (imaging information calculation section 35) of the controller 5: the marker device 6 and the marker section 55 of the terminal device 7. Depending on the content of the game (the type of the game program), either one or both of the marker device 6 and the marker section 55 are used. The game program 90 includes data indicating whether each of the marker device 6 and the marker section 55 should be lit; the CPU 10 reads these data and determines whether to light them. When the marker device 6 and/or the marker section 55 are to be lit, the following processing is executed.

That is, when the marker device 6 is to be lit, the CPU 10 transmits to the marker device 6 a control signal for lighting the infrared LEDs of the marker device 6; this transmission may simply be the supply of power. In response, the infrared LEDs of the marker device 6 are lit. When the marker section 55 is to be lit, the CPU 10 generates control data indicating an instruction to light the marker section 55 and stores it in the main memory.
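As the simplest case of the sound recognition mentioned above, outputting only the volume of the microphone sound, a sketch along the following lines would suffice. The 16-bit sample format and the threshold are assumptions, not part of the embodiment.

```cpp
// Minimal sketch of volume-only "sound recognition"; format and threshold assumed.
#include <cmath>
#include <cstdint>
#include <vector>

struct SoundRecognitionData {
    float volume;   // RMS level in [0, 1]
    bool  loud;     // true when the sound exceeds the threshold
};

SoundRecognitionData recognizeMicrophoneSound(const std::vector<int16_t>& samples,
                                              float threshold = 0.1f) {
    double sumSq = 0.0;
    for (int16_t s : samples) {
        double v = s / 32768.0;
        sumSq += v * v;
    }
    float rms = samples.empty()
                    ? 0.0f
                    : static_cast<float>(std::sqrt(sumSq / samples.size()));
    return SoundRecognitionData{rms, rms > threshold};
}
```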
The generated control data are transmitted to the terminal device 7 in step S10 described later. The control data received by the wireless module 80 of the terminal device 7 are passed to the UI controller 75 via the codec LSI 76, and the UI controller 75 instructs the marker section 55 to light; the infrared LEDs of the marker section 55 are thereby lit. Although the above description concerns lighting the marker device 6 and the marker section 55, they can be turned off by processing similar to that for lighting.

After step S1, the processing of step S2 is executed. Thereafter, a processing loop made up of the series of steps S2 to S11 is executed repeatedly at a rate of once per predetermined time (one frame time).

In step S2, the CPU 10 acquires the controller operation data transmitted from the controller 5. Since the controller 5 repeatedly transmits the controller operation data to the game device 3, the controller communication module of the game device 3 receives the data successively, and the input/output processor 11a successively stores the received controller operation data in the main memory. The transmission interval is preferably shorter than the processing time of the game (one frame time). In step S2, the CPU 10 reads the latest controller operation data 92 from the main memory. After step S2, the processing of step S3 is executed.

In step S3, the CPU 10 acquires the various data transmitted from the terminal device 7.
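The overall structure of Fig. 22, initial processing followed by a loop of steps S2 to S11 run once per frame time, can be summarized as below. Every type and function here is a placeholder standing in for the processing described in the text, not an API of the embodiment.

```cpp
// Skeleton of the per-frame loop of Fig. 22 (steps S2-S11). All types and
// function bodies are placeholders for the processing described in the text.
struct GameContext { bool quitRequested = false; /* main-memory data 90-111 ... */ };

void initialProcessing(GameContext&) {}              // S1: build game space, light markers
void acquireControllerOperationData(GameContext&) {} // S2
void acquireTerminalData(GameContext&) {}            // S3: operation data, camera image, mic sound
void gameControlProcessing(GameContext&) {}          // S4: Fig. 23 (postures, game logic)
void generateImagesAndSounds(GameContext&) {}        // S5-S8: television and terminal outputs
void outputToTelevision(GameContext&) {}             // S9: via the AV-IC 15
void transmitToTerminal(GameContext&) {}             // S10: compress, send wirelessly
bool shouldEndGame(const GameContext& c) { return c.quitRequested; } // S11

void runGameLoop(GameContext& ctx) {
    initialProcessing(ctx);
    while (true) {                      // repeated once per frame time
        acquireControllerOperationData(ctx);
        acquireTerminalData(ctx);
        gameControlProcessing(ctx);
        generateImagesAndSounds(ctx);
        outputToTelevision(ctx);
        transmitToTerminal(ctx);
        if (shouldEndGame(ctx)) break;
    }
}
```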

Since the terminal device 7 repeatedly transmits the terminal operation data, the camera image data, and the microphone sound data to the game device 3, the game device 3 receives these data successively. In the game device 3, the terminal communication module 28 receives the data successively, and the codec LSI 27 successively decompresses the camera image data and the microphone sound data. The input/output processor 11a then successively stores the terminal operation data, the camera image data, and the microphone sound data in the main memory. In step S3, the CPU 10 reads the latest terminal operation data 97 from the main memory. After step S3, the processing of step S4 is executed.

In step S4, the CPU 10 executes the game control processing. The game control processing advances the game by, for example, moving objects in the game space in accordance with the user's game operations. In the present embodiment, the user can play various games using the controller 5 and/or the terminal device 7. The game control processing is described below with reference to Fig. 23.

Fig. 23 is a flowchart showing the detailed flow of the game control processing. The series of processes shown in Fig. 23 are processes that can be executed when the controller 5 and the terminal device 7 are used as operation devices; not all of them need to be executed, and only some of them may be executed depending on the type or content of the game.

In the game control processing, first, in step S21, the CPU 10 determines whether to change the marker to be used. As described above, in the present embodiment, processing for controlling the lighting of the marker device 6 and the marker section 55 is executed at the start of the game processing (step S1). Depending on the game, however, the target to be used (lit) among the marker device 6 and the marker section 55 may change partway through the game. Also, although some games use both the marker device 6 and the marker section 55, lighting both at the same time risks erroneously detecting one marker as the other; it may therefore be preferable to light only one of them at a time and switch during the game. Step S21 determines, with such cases in mind, whether the lighting target needs to be changed partway through the game.

The determination of step S21 can be made, for example, as follows. The CPU 10 may make the determination according to whether the game situation (the stage of the game, the operation target, and so on) has changed, because when the game situation changes, the operation method may switch between operating the controller 5 pointed at the marker device 6 and operating it pointed at the marker section 55. The CPU 10 may also make the determination based on the posture of the controller 5, that is, by determining whether the controller 5 is facing the marker device 6 or the marker section 55; the posture of the controller 5 can be calculated, for example, from the detection results of the acceleration sensor 37 and the gyro sensor 48 (see step S23 described later). The CPU 10 may also make the determination based on whether the user has given a change instruction.

When the determination result of step S21 is affirmative, the processing of step S22 is executed; when it is negative, step S22 is skipped and the processing of step S23 is executed.
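One conceivable way to realize the posture-based determination of step S21 described above is to compare which imaging target the controller 5 is facing more directly; the sketch below does this with a dot product. The vectors and the switching rule are assumptions rather than the embodiment's exact criterion.

```cpp
// Hedged sketch of a facing-based choice of lighting target for step S21.
struct Vec3 { float x, y, z; };

enum class LightTarget { MarkerDevice6, MarkerSection55 };

float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// controllerForward: forward axis of the controller posture from step S23.
// toMarkerDevice / toMarkerSection: unit vectors from the controller toward the
// marker device 6 (near the television) and the marker section 55 (terminal device 7).
LightTarget chooseLightTarget(Vec3 controllerForward,
                              Vec3 toMarkerDevice, Vec3 toMarkerSection) {
    return dot(controllerForward, toMarkerDevice) >= dot(controllerForward, toMarkerSection)
               ? LightTarget::MarkerDevice6
               : LightTarget::MarkerSection55;
}
```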
In step S22, the CPU 10 controls the lighting of the marker device 6 and the marker section 55, that is, it changes their lighting state. The specific processing for lighting or turning off the marker device 6 and/or the marker section 55 can be performed in the same way as in step S1. After step S22, the processing of step S23 is executed.

As described above, according to the present embodiment, the light emission (lighting) of the marker device 6 and the marker section 55 is controlled according to the type of the game program by the processing of step S1, and according to the game situation by the processing of steps S21 and S22.

In step S23, the CPU 10 calculates the posture of the controller 5. In the present embodiment, the posture of the controller 5 is calculated from the first acceleration data 94, the first angular velocity data 95, and the marker coordinate data 96, as follows.

First, the CPU 10 calculates the posture of the controller 5 from the first angular velocity data 95 stored in the main memory. Any method may be used to calculate the posture from the angular velocity; the posture can be calculated from the previous posture (the posture calculated last time) and the current angular velocity (the angular velocity acquired in step S2 of the current processing loop). Specifically, the CPU 10 calculates the posture by rotating the previous posture by the current angular velocity for one unit time. The previous posture is given by the controller posture data 108 stored in the main memory, and the current angular velocity by the first angular velocity data 95 stored in the main memory; the CPU 10 reads these from the main memory and calculates the posture of the controller 5. Data representing the "angular-velocity-based posture" calculated in this way are stored in the main memory.

When the posture is calculated from the angular velocity, an initial posture may be determined in advance. That is, when calculating the posture of the controller 5, the CPU 10 may first calculate the initial posture of the controller 5. The initial posture may be calculated from the acceleration data, or the posture at the time the player performs a predetermined operation with the controller 5 held in a specific posture may be used as the initial posture. The initial posture should be calculated when the posture of the controller 5 is calculated as an absolute posture with respect to a predetermined direction in space; it need not be calculated when the posture is calculated as a relative posture with respect to, for example, the posture of the controller 5 at the start of the game.

Next, the CPU 10 corrects the posture calculated from the angular velocity using the first acceleration data 94. Specifically, the CPU 10 first reads the first acceleration data 94 from the main memory and calculates the posture of the controller 5 from them. When the controller 5 is nearly stationary, the acceleration applied to the controller 5 corresponds to the gravitational acceleration. In that state, therefore, the direction of the gravitational acceleration (the direction of gravity) can be calculated using the first acceleration data 94 output by the acceleration sensor 37.
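The angular-velocity step of S23 described above, rotating the previous posture by the current angular velocity for one unit time, can be written, for example, with quaternions, although the embodiment does not prescribe any particular representation. A minimal sketch, with renormalization omitted for brevity:

```cpp
// Sketch of gyro integration for step S23; representation and conventions assumed.
#include <cmath>

struct Quat { float w, x, y, z; };
struct Vec3 { float x, y, z; };

Quat multiply(const Quat& a, const Quat& b) {
    return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
             a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
             a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
             a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
}

// omega: angular velocity data 95 [rad/s], dt: one frame time [s].
Quat integrateAngularVelocity(const Quat& previous, const Vec3& omega, float dt) {
    float angle = std::sqrt(omega.x*omega.x + omega.y*omega.y + omega.z*omega.z) * dt;
    if (angle < 1e-8f) return previous;
    float s = std::sin(angle * 0.5f) / (angle / dt);   // sin(angle/2) / |omega|
    Quat dq { std::cos(angle * 0.5f), omega.x * s, omega.y * s, omega.z * s };
    return multiply(previous, dq);   // body-frame rotation applied to the previous posture
}
```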
The orientation (posture) of the controller 5 with respect to the direction of gravity can therefore be calculated from the first acceleration data 94. Data representing the "acceleration-based posture" calculated in this way are stored in the main memory.

Having calculated the acceleration-based posture, the CPU 10 then corrects the angular-velocity-based posture using it. Specifically, the CPU 10 reads from the main memory the data representing the angular-velocity-based posture and the data representing the acceleration-based posture, and corrects the former by bringing it closer to the latter at a predetermined ratio. The predetermined ratio may be a fixed value determined in advance, or may be set according to, for example, the acceleration indicated by the first acceleration data 94. For rotation about the axis of the direction of gravity, no posture can be calculated from the acceleration, so the CPU 10 does not correct that rotation direction. In the present embodiment, data representing the corrected posture obtained in this way are stored in the main memory.

After correcting the angular-velocity-based posture as described above, the CPU 10 further corrects the corrected posture using the marker coordinate data 96. First, the CPU 10 calculates the posture of the controller 5 from the marker coordinate data 96 (the marker-based posture). Since the marker coordinate data 96 indicate the positions of the markers 6R and 6L in the captured image, the posture of the controller 5 with respect to the roll direction (the rotation direction about the Z axis) can be calculated from these positions; that is, it can be calculated from the slope of the straight line connecting the position of the marker 6R and the position of the marker 6L in the captured image. When the position of the controller 5 with respect to the marker device 6 can be identified (for example, when the controller 5 can be assumed to be located in front of the marker device 6), the posture of the controller 5 with respect to the pitch direction and the yaw direction can also be calculated from the position of the marker device 6 in the captured image. For example, when the positions of the markers 6R and 6L in the captured image move to the left, the controller 5 can be determined to have turned its orientation (posture) to the right. In this way, the posture of the controller 5 can be calculated from the marker coordinate data 96.

Having calculated the marker-based posture, the CPU 10 then corrects the posture corrected above (the posture already corrected using the acceleration) by bringing it closer to the marker-based posture at a predetermined ratio, which may be a fixed value determined in advance. The correction based on the marker-based posture may be performed for only one or two of the roll, pitch, and yaw directions.
For example, since the marker coordinate data 96 allow the posture to be calculated accurately for the roll direction, the CPU 10 may correct only the roll direction using the marker-based posture. When the marker device 6 or the marker section 55 is not imaged by the image pickup element 40 of the controller 5, the marker-based posture cannot be calculated, and the correction processing using the marker coordinate data 96 may be skipped in that case.

As described above, the CPU 10 corrects the posture of the controller 5 calculated from the first angular velocity data 95 using the first acceleration data 94 and the marker coordinate data 96. Among these methods of calculating the posture of the controller 5, the method using the angular velocity can calculate the posture however the controller 5 is moving; on the other hand, because the posture is obtained by cumulatively adding the successively detected angular velocities, its accuracy can deteriorate through accumulated error or through degradation of the gyro sensor's accuracy due to so-called temperature drift. The method using the acceleration does not accumulate error, but the posture cannot be calculated accurately while the controller 5 is being moved violently (because the direction of gravity cannot then be detected correctly). The method using the marker coordinates can calculate the posture accurately (especially for the roll direction), but cannot calculate it when the marker section cannot be imaged. According to the present embodiment, since these three methods with different characteristics are used, the posture of the controller 5 can be calculated more reliably. In other embodiments, any one or two of the three methods may be used. When the lighting control of the markers is performed in step S1 or step S22, the CPU 10 preferably calculates the posture of the controller 5 using at least the marker coordinates.

After step S23, the processing of step S24 is executed. In step S24, the CPU 10 calculates the posture of the terminal device 7. Since the terminal operation data 97 acquired from the terminal device 7 include the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103, the CPU 10 can calculate the posture of the terminal device 7 from these data. From the second angular velocity data 102, the CPU 10 can know the amount of rotation (the amount of change in posture) of the terminal device 7 per unit time. When the terminal device 7 is nearly stationary, the acceleration applied to it corresponds to the gravitational acceleration, so the direction of gravity applied to the terminal device 7 (that is, the posture of the terminal device 7 with respect to the direction of gravity) can be known from the second acceleration data 101. The azimuth data 103 give a predetermined azimuth with respect to the terminal device 7 (that is, the posture of the terminal device 7 with respect to a predetermined azimuth).
Even when a magnetic field other than geomagnetism is present, the amount of rotation of the terminal device 7 can still be known. The CPU 10 can therefore calculate the posture of the terminal device 7 from the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103. In the present embodiment the posture of the terminal device 7 is calculated from these three kinds of data, but in other embodiments it may be calculated from any one or two of them.

The specific method of calculating the posture of the terminal device 7 may be arbitrary; for example, the posture calculated from the angular velocity indicated by the second angular velocity data 102 may be corrected using the second acceleration data 101 and the azimuth data 103. Specifically, the CPU 10 first calculates the posture of the terminal device 7 from the second angular velocity data 102, in the same way as in step S23. Then, at an appropriate time (for example, when the terminal device 7 is close to being stationary), the CPU 10 corrects the angular-velocity-based posture using the posture calculated from the second acceleration data 101 and/or the posture calculated from the azimuth data 103. The angular-velocity-based posture can be corrected with the acceleration-based posture by the same method as when calculating the posture of the controller 5, and when correcting it with the azimuth-based posture the CPU 10 can likewise bring the angular-velocity-based posture closer to the azimuth-based posture at a predetermined ratio. In this way the CPU 10 can calculate the posture of the terminal device 7 correctly.

Because the controller 5 has the imaging information calculation section 35 as infrared detection means, the game device 3 can acquire the marker coordinate data 96 and can therefore know the controller 5's absolute posture in real space (that is, what posture the controller 5 has in a coordinate system set in real space). The terminal device 7, on the other hand, has no infrared detection means like the imaging information calculation section 35, so from the second acceleration data 101 and the second angular velocity data 102 alone the game device 3 cannot know the absolute posture in real space for the rotation direction about the axis of gravity. In the present embodiment, therefore, the terminal device 7 is provided with the magnetic sensor 72 so that the game device 3 can acquire the azimuth data 103. With the azimuth data 103, the game device 3 can calculate the absolute posture in real space for the rotation direction about the axis of gravity, and can calculate the posture of the terminal device 7 more accurately.

As the specific processing of step S24, the CPU 10 reads the second acceleration data 101, the second angular velocity data 102, and the azimuth data 103 from the main memory, and calculates the posture of the terminal device 7 from these data.
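One way to realize the corrections "at a predetermined ratio" described for step S24 is a simple complementary blend of reference directions; the sketch below keeps the posture as a pair of body-frame direction vectors. The representation and the blending ratios are assumptions, not values prescribed by the embodiment.

```cpp
// Hedged sketch of the accelerometer/magnetometer correction for step S24.
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 normalize(Vec3 v) {
    float n = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return n > 0.0f ? Vec3{v.x/n, v.y/n, v.z/n} : v;
}
static Vec3 lerp(Vec3 a, Vec3 b, float t) {
    return { a.x + (b.x - a.x)*t, a.y + (b.y - a.y)*t, a.z + (b.z - a.z)*t };
}

// Posture kept as two body-frame reference directions: "down" and a fixed azimuth.
struct TerminalPosture { Vec3 down; Vec3 north; };

void correctPosture(TerminalPosture& p, Vec3 accel, Vec3 magneticField,
                    bool nearlyStationary,
                    float accelRatio = 0.02f, float magRatio = 0.02f) {
    if (nearlyStationary) {
        // When the device is almost still, the measured acceleration indicates
        // the direction of gravity, as described in the text.
        p.down = normalize(lerp(p.down, normalize(accel), accelRatio));
    }
    // The magnetic field fixes rotation about the gravity axis even where it
    // does not point to true north.
    p.north = normalize(lerp(p.north, normalize(magneticField), magRatio));
}
```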
Data representing the calculated posture of the terminal device 7 are stored in the main memory as the terminal posture data 109. After step S24, the processing of step S25 is executed.

In step S25, the CPU 10 performs the recognition processing on the camera image, that is, predetermined recognition processing on the camera image data 104. This recognition processing may be any processing that detects some feature from the camera image and outputs the result. For example, when the camera image includes the player's face, it may be processing that recognizes the face, such as recognizing a part of the face (eyes, nose, mouth, and so on) or detecting the facial expression. Data representing the result of the recognition processing are stored in the main memory as the image recognition data 110. After step S25, the processing of step S26 is executed.

In step S26, the CPU 10 performs the recognition processing on the microphone sound, that is, predetermined recognition processing on the microphone sound data 105. This recognition processing may be any processing that detects some feature from the microphone sound and outputs the result; for example, it may detect an instruction from the player in the microphone sound, or merely detect the volume of the microphone sound. Data representing the result of the recognition processing are stored in the main memory as the sound recognition data 111. After step S26, the processing of step S27 is executed.

In step S27, the CPU 10 executes the game processing according to the game inputs. Here, a game input may be data transmitted from the controller 5 or the terminal device 7, or data obtained from such data; specifically, in addition to the data included in the controller operation data 92 and the terminal operation data 97, the game inputs may be data obtained from them (the controller posture data 108, the terminal posture data 109, the image recognition data 110, and the sound recognition data 111). The content of the game processing in step S27 may be anything: for example, processing that moves the objects (characters) appearing in the game, processing that controls a virtual camera, or processing that moves a cursor displayed on the screen. The camera image (or part of it) may also be used as a game image, and the microphone sound may be used as a game sound. Examples of such game processing are described later. In step S27, data representing the results of the game control, such as data on the various parameters set for the objects (characters) appearing in the game, data on the parameters of the virtual camera arranged in the game space, and score data, are stored in the main memory. After step S27, the CPU 10 ends the game control processing of step S4.

Returning to Fig. 22, in step S5 the television game image to be displayed on the television 2 is generated by the CPU 10 and the GPU 11b. That is, the CPU 10 and the GPU 11b read from the main memory the data representing the result of the game control processing of step S4, read from the VRAM 11d the data needed to generate a game image, and generate the television game image.
The game image may be generated by any method as long as it represents the result of the game control processing of step S4; for example, it may be a three-dimensional CG image generated by placing a virtual camera in the virtual game space and computing the game space as seen from that virtual camera, or a two-dimensional image generated without using a virtual camera. The generated television game image is stored in the VRAM 11d. After step S5, the processing of step S6 is executed.

In step S6, the terminal game image to be displayed on the terminal device 7 is generated by the CPU 10 and the GPU 11b. Like the television game image, the terminal game image may be generated by any method as long as it represents the result of the game control processing of step S4, and it may be generated by the same method as the television game image or by a different method. The generated terminal game image is stored in the VRAM 11d. Depending on the content of the game, the television game image and the terminal game image may be identical, in which case the image generation processing need not be executed in step S6. After step S6, the processing of step S7 is executed.

In step S7, the television game sound to be output to the speaker 2a of the television 2 is generated. That is, the CPU 10 has the DSP 11c generate a game sound according to the result of the game control processing of step S4, for example sound effects, voices of characters appearing in the game, or background music. After step S7, the processing of step S8 is executed.

In step S8, the terminal game sound to be output to the speaker 77 of the terminal device 7 is generated; that is, the CPU 10 has the DSP 11c generate a game sound according to the result of the game control processing of step S4. The terminal game sound may be the same as or different from the television game sound; they may also differ only in part, for example with different sound effects but the same background music. When the television game sound and the terminal game sound are identical, the sound generation processing need not be executed in step S8. After step S8, the processing of step S9 is executed.

In step S9, the CPU 10 outputs the game image and the game sound to the television 2. Specifically, the CPU 10 sends the data of the television game image stored in the VRAM 11d and the data of the television game sound generated by the DSP 11c in step S7 to the AV-IC 15. In response, the AV-IC 15 outputs the image and sound data to the television 2 via the AV connector 16. The television game image is thereby displayed on the television 2 and the television game sound is output from the speaker 2a. After step S9, the processing of step S10 is executed.

In step S10, the CPU 10 transmits the game image and the game sound to the terminal device 7. Specifically, the image data of the terminal game image stored in the VRAM 11d and the sound data generated by the DSP 11c in step S8 are sent by the CPU 10 to the codec LSI 27, which performs predetermined compression processing on them.
The compressed image and sound data are then transmitted by the terminal communication module 28 to the terminal device 7 via the antenna 29. The terminal device 7 receives the image and sound data transmitted from the game device 3 with the wireless module 80, and the codec LSI 76 performs predetermined decompression processing on them. The decompressed image data are output to the LCD 51, and the decompressed sound data are output to the sound IC 78. The terminal game image is thereby displayed on the LCD 51, and the terminal game sound is output from the speaker 77. After step S10, the processing of step S11 is executed.

In step S11, the CPU 10 determines whether to end the game, for example according to whether the game is over or the user has instructed that the game be stopped. When the determination of step S11 is negative, the processing of step S2 is executed again; when it is affirmative, the CPU 10 ends the game processing shown in Fig. 22. Thereafter, the series of steps S2 to S11 is executed repeatedly until it is determined in step S11 that the game is to be ended.

As described above, in the present embodiment the terminal device 7 has the touch panel 52 and inertial sensors (the acceleration sensor 73 and the gyro sensor 74), and the outputs of the touch panel 52 and the inertial sensors are transmitted to the game device 3 as operation data and used as game inputs (steps S3 and S4). In addition, the terminal device 7 has a display device (the LCD 51), and the game image obtained by the game processing is displayed on the LCD 51 (steps S6 and S10). The user can therefore directly touch the game image via the touch panel 52 and, since the motion of the terminal device 7 is detected by the inertial sensors, can also perform operations that move the LCD 51 itself on which the game image is displayed. With these operations the user can play with the feel of directly manipulating the game image, so games with a novel feel of operation, such as the first and second game examples described later, can be provided.

Furthermore, in the present embodiment the terminal device 7 has the analog sticks 53 and the operation buttons 54, which can be operated while the terminal device 7 is held, and the game device 3 can use operations on them as game inputs (steps S3 and S4). Even when the game image is operated directly as described above, the user can perform finer game operations with the sticks and buttons.

Moreover, in the present embodiment the terminal device 7 has the camera 56 and the microphone 79, and the data of the camera image captured by the camera 56 and the data of the microphone sound detected by the microphone 79 are transmitted to the game device 3 (step S3). The game device 3 can therefore use the camera image and/or the microphone sound as game inputs, so the user can also perform game operations by capturing images with the camera 56 or speaking into the microphone 79. Since these operations can be performed while holding the terminal device 7, the user can perform an even greater variety of game operations when directly operating the game image as described above.
Furthermore, since the game image is displayed on the terminal device 7, which is portable, the user can place the terminal device 7 at any position and point the controller 5 in any direction, which increases the freedom with which the controller 5 can be operated (for example, when the controller 5 is used pointed toward the marker section 55 of the terminal device 7). The terminal device 7 can also be placed at a position suited to the content of the game, as in the fifth game example described later. In addition, since the game device 3 acquires operation data from both the controller 5 and the terminal device 7 (steps S2, S3), the user can use either the controller 5 or the terminal device 7 as the operation means. In the game system 1, therefore, a plurality of users can play a game with each user operating one of several devices, and a single user can also play using the two devices together. Further, according to the present embodiment, the game device 3 generates two kinds of game images (steps S5, S6), which are displayed on the television 2 and the terminal device 7 (steps S9, S10). By displaying the two kinds of game images on different devices, game images that are easier for the user to view can be provided and operability can be improved. For example, when two people play, as in the third and fourth game examples described later, a game image from a viewpoint that is easy for one user to see can be displayed on the television 2 while a game image from a viewpoint that is easy for the other user to see is displayed on the terminal device 7, so that each player can play from an easy-to-view viewpoint. Even when one person plays, as in the first, second, and fifth game examples described later, game images can be displayed from two different viewpoints, allowing the player to grasp the state of the game space more easily. [6. Game examples] Next, specific examples of games played on the game system 1 will be described. In the game examples described below, some of the components of the game system 1 may not be used, and part of the series of processing shown in Figs. 22 and 23 may not be executed; that is, the game system 1 need not have all of the configurations described above, and the game device 3 need not execute the entire series of processing shown in Figs. 22 and 23. (First game example) The first game example is a game in which an object (a dart) is thrown in the game space by operating the terminal device 7. The player can specify the direction in which the dart is thrown by changing the attitude of the terminal device 7 and by drawing a line on the touch panel 52. Fig. 24 shows the screens of the television 2 and the terminal device 7 in the first game example. In Fig. 24, game images representing the game space are displayed on the television 2 and on the LCD 51 of the terminal device 7: a dart 121, a control surface 122, and a target 123 are displayed on the television 2, and an image of the control surface 122 is displayed on the LCD 51. In the first game example, the player plays by throwing darts 121 at the target 123 through operations on the terminal device 7. To throw a dart 121, the player first changes the attitude of the control surface 122 arranged in the virtual game space to a desired attitude by changing the attitude of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and changes the attitude of the control surface 122 based on the calculated attitude (step S27).
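The embodiment does not prescribe a particular algorithm for the attitude calculation of step S24. The sketch below shows one common way it could be done, a simple complementary filter that integrates the gyro sensor 74 output and corrects drift with the acceleration sensor 73 and the magnetic sensor 72; the representation, gain value, and all names are assumptions made for illustration.

#include <cmath>

struct Vec3 { float x, y, z; };

// Attitude as roll/pitch/yaw in radians (illustrative representation only).
struct Attitude { float roll, pitch, yaw; };

Attitude UpdateAttitude(Attitude a, Vec3 gyro, Vec3 accel, Vec3 mag, float dt) {
    // 1. Propagate the attitude by integrating the gyro sensor 74 output.
    a.roll  += gyro.x * dt;
    a.pitch += gyro.y * dt;
    a.yaw   += gyro.z * dt;

    // 2. Estimate roll/pitch from the gravity direction given by the
    //    acceleration sensor 73 (valid when the device is roughly static).
    const float rollAcc  = std::atan2(accel.y, accel.z);
    const float pitchAcc = std::atan2(-accel.x,
                                      std::sqrt(accel.y * accel.y + accel.z * accel.z));
    // 3. Estimate heading from the magnetic sensor 72
    //    (simplified; assumes the device is roughly level).
    const float yawMag = std::atan2(-mag.y, mag.x);

    // 4. Blend the corrections in with a small gain (assumed value).
    const float k = 0.02f;
    a.roll  += k * (rollAcc  - a.roll);
    a.pitch += k * (pitchAcc - a.pitch);
    a.yaw   += k * (yawMag   - a.yaw);
    return a;
}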
In the first game example, the attitude of the control surface 122 is controlled so as to correspond to the attitude of the terminal device 7 in real space. That is, by changing the attitude of the terminal device 7, the player can change the attitude of the control surface 122 in the game space (and of the control surface 122 displayed on the terminal device 7). In the first game example, the position of the control surface 122 is fixed at a predetermined position in the game space. Next, the player draws a line on the touch panel 52 using the stylus 124 or the like (see the arrow shown in Fig. 24). Here, in the first game example, the control surface 122 is displayed on the LCD 51 of the terminal device 7 in such a way that the input surface of the touch panel 52 corresponds to the control surface 122. Therefore, from the line drawn on the touch panel 52, the direction on the control surface 122 (the direction the line represents) can be calculated, and the dart 121 is thrown in the direction thus determined. Accordingly, the CPU 10 calculates the direction on the control surface 122 from the touch position data 100 of the touch panel 52 and performs processing to move the dart 121 in the calculated direction (step S27). The CPU 10 may also control the speed of the dart 121 in accordance with, for example, the length of the line or the speed at which it was drawn. As described above, according to the first game example, the game device 3 can move the control surface 122 in response to the movement (attitude) of the terminal device 7 by using the output of the inertial sensors as a game input, and can specify a direction on the control surface 122 by using the output of the touch panel 52.
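The input mapping just described, a line on the touch panel converted into a direction lying on the control surface, can be sketched as follows. This is a minimal illustration only; the normalized coordinate convention, the speed rule, and all names are assumptions and not part of the embodiment.

#include <cmath>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

struct ControlSurface {
    Vec3 origin;   // position of the surface in the game space (fixed)
    Vec3 axisU;    // world direction corresponding to +x on the touch panel
    Vec3 axisV;    // world direction corresponding to +y on the touch panel
};

// Map the start and end touch positions (normalized 0..1 panel coordinates)
// to a world-space direction lying on the control surface 122.
Vec3 TouchLineToWorldDirection(const ControlSurface& s, Vec2 start, Vec2 end) {
    const float du = end.x - start.x;
    const float dv = end.y - start.y;
    Vec3 d{ s.axisU.x * du + s.axisV.x * dv,
            s.axisU.y * du + s.axisV.y * dv,
            s.axisU.z * du + s.axisV.z * dv };
    const float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (len > 0.0f) { d.x /= len; d.y /= len; d.z /= len; }
    return d;
}

// The speed of the dart 121 may reflect the length of the drawn line.
float DartSpeedFromLine(Vec2 start, Vec2 end, float maxSpeed) {
    const float dx = end.x - start.x, dy = end.y - start.y;
    const float len = std::sqrt(dx * dx + dy * dy);
    return maxSpeed * std::fmin(len, 1.0f);
}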

As a result, the player can move the game image displayed on the terminal device 7 (the image of the control surface 122) or perform touch operations on it, and can therefore play with the novel feel of operating directly on the game image. Furthermore, in the first game example, by using the outputs of the inertial sensors and the touch panel as game inputs, a direction in three-dimensional space can be specified easily. That is, the player actually adjusts the attitude of the terminal device 7 with one hand while drawing a line on the touch panel 52 with the other, and can thus specify a direction easily through the intuitive operation of actually inputting a direction in space. Moreover, since the operation of the attitude of the terminal device 7 and the input on the touch panel 52 can be performed simultaneously, the operation of specifying a direction in three-dimensional space can be performed quickly. In addition, according to the first game example, the control surface 122 is displayed over the entire screen of the terminal device 7 so that touch input on the control surface 122 is easy. On the other hand, the television 2 displays an image of the game space containing the entire control surface 122 and the target 123 (see Fig. 24), so that the attitude of the control surface 122 is easy to grasp and the target 123 is easy to aim at.
Specifically, in step S27, the first virtual camera for generating the television game image is set so that the entire control surface 122 and the target 123 are included in its field of view, and the second virtual camera for generating the terminal game image is set so that the screen of the LCD 51 (the input surface of the touch panel 52) coincides with the control surface 122 on the display. Thus, in the first game example, game operations are made easier by displaying images of the game space viewed from different viewpoints on the television 2 and the terminal device 7. (Second game example) Games that use the outputs of the inertial sensors and the touch panel 52 as game inputs are not limited to the first game example; various other game examples are conceivable. The second game example, like the first, is a game in which an object (a shell) is fired in the game space by operating the terminal device 7. The player can specify the firing direction by changing the attitude of the terminal device 7 and by designating a position on the touch panel 52. Fig. 25 shows the screens of the television 2 and the terminal device 7 in the second game example. In Fig. 25, a cannon 131, a shell 132, and a target 133 are displayed on the television 2, and the shell 132 and the target 133 are displayed on the terminal device 7. The terminal game image displayed on the terminal device 7 is an image of the game space viewed from the position of the cannon 131. In the second game example, the player can change the display range shown on the terminal device 7 as the terminal game image by changing the attitude of the terminal device 7. That is, the CPU 10 calculates the attitude of the terminal device 7 based on the outputs of the inertial sensors (the acceleration sensor 73 and the gyro sensor 74) and the magnetic sensor 72 (step S24), and controls the position and attitude of the second virtual camera used to generate the terminal game image based on the calculated attitude (step S27). Specifically, the second virtual camera is placed at the position of the cannon 131, and its orientation (attitude) is controlled in accordance with the attitude of the terminal device 7. In this way, the player can change the range of the game space displayed on the terminal device 7 by changing the attitude of the terminal device 7. Further, in the second game example, the player designates the firing direction of the shell 132 by performing an operation of inputting a point on the touch panel 52 (a touch operation). Specifically, in the processing of step S27, the CPU 10 calculates the position in the game space corresponding to the touch position (the control position), and calculates, as the firing direction, the direction from a predetermined position in the game space (for example, the position of the cannon 131) toward the control position; it then performs processing to move the shell 132 in the firing direction. Thus, whereas the player draws a line on the touch panel 52 in the first game example, in the second game example the player designates a point on the touch panel 52. The control position can be calculated by setting up a control surface in the same way as in the first game example (although the control surface is not displayed in the second game example).
That is, the control surface is arranged in accordance with the attitude of the second virtual camera so as to correspond to the display range on the terminal device 7 (specifically, the control surface rotates about the position of the cannon 131 in response to changes in the attitude of the terminal device 7), and the position on the control surface corresponding to the touch position can then be calculated as the control position. According to the second game example, by using the output of the inertial sensors as a game input, the game device 3 can change the display range of the terminal game image in response to the movement (attitude) of the terminal device 7, and by using a touch input that designates a position within that display range as a game input, it can specify a direction in the game space (the firing direction of the shell 132). Therefore, in the second game example, as in the first, the player can move the game image displayed on the terminal device 7 or touch it directly, and can play with the novel feel of operating directly on the game image. Also as in the first game example, the player actually adjusts the attitude of the terminal device 7 with one hand while making a touch input on the touch panel 52 with the other, and can thus specify a direction easily through the intuitive operation of actually inputting a direction in space; and since the attitude operation and the touch input can be performed simultaneously, a direction in three-dimensional space can be specified quickly. In the second game example, the image displayed on the television 2 may be an image from the same viewpoint as that of the terminal device 7, but in Fig. 25 the game device 3 displays an image viewed from a different viewpoint: while the second virtual camera for generating the terminal game image is placed at the position of the cannon 131, the first virtual camera for generating the television game image is placed behind the cannon 131. Here, for example, by displaying on the television 2 a range that cannot be seen on the screen of the terminal device 7, it is possible to realize a style of play in which the player watches the screen of the television 2 to aim at a target 133 that is not visible on the screen of the terminal device 7. By making the display ranges of the television 2 and the terminal device 7 different in this way, it becomes easier to grasp the state of the game space and the game becomes more entertaining. As described above, according to the present embodiment, the terminal device 7, which includes the touch panel 52 and inertial sensors, can be used as an operation device, making it possible to realize games with the feel of operating directly on the game image, as in the first and second game examples described above.
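The firing-direction calculation of the second game example can be sketched as follows: the touched point is mapped to a position on a (non-displayed) control surface that rotates with the second virtual camera, and the firing direction is the direction from the cannon 131 to that control position. The coordinate convention, surface dimensions, and all names are assumptions made for this illustration.

#include <cmath>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

struct CameraFrame {       // orientation of the second virtual camera
    Vec3 right, up, forward;
};

// 'touch' is in normalized coordinates, -1..1 on both axes of the touch panel 52.
Vec3 FiringDirection(const CameraFrame& cam, Vec3 cannonPos, Vec2 touch,
                     float surfaceDistance, float halfWidth, float halfHeight) {
    // Place the control surface at a fixed distance in front of the cannon 131,
    // oriented by the camera so that it matches the display range on the LCD 51.
    Vec3 p{
        cannonPos.x + cam.forward.x * surfaceDistance
                    + cam.right.x * touch.x * halfWidth + cam.up.x * touch.y * halfHeight,
        cannonPos.y + cam.forward.y * surfaceDistance
                    + cam.right.y * touch.x * halfWidth + cam.up.y * touch.y * halfHeight,
        cannonPos.z + cam.forward.z * surfaceDistance
                    + cam.right.z * touch.x * halfWidth + cam.up.z * touch.y * halfHeight
    };
    // The control position p determines the direction in which the shell 132 is fired.
    Vec3 d{ p.x - cannonPos.x, p.y - cannonPos.y, p.z - cannonPos.z };
    const float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (len > 0.0f) { d.x /= len; d.y /= len; d.z /= len; }
    return d;
}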

現如上述第1及第2遊戲例之以對遊戲圖像直接進行摔作 之操作減的顿。 ㈣㈣探作 (第3遊戲例) 遊戲第27圖來_3遊戲例。第3 位遊戲者传用松w對喊之形式的棒球遊戲。亦即,第1 裝置7操作投5操作打者’第2位賴者使用終端 出對各遊戲二:此外’於電視2及終端m,係顯示 第:圖易進行遊戲操作之遊戲圖像。 戲圖像Μ 丁第3遊戲例中顯示於電視2之電視用遊 於第1位遊戲者之圖:所示之電視用遊戲圖像,主要用 出從屬於第i 、圃像。亦即,電視用遊戲圖像,係顯示 側觀看屬於第=遊戲者的操作對象之打者(打者物件)141 142之遊戲空間操作對象之投手(投手物件) 影機,位耐Μ Λ生成電視用遊戲圖像之第1虛擬攝 〜 本配置在fr去1 >1 1 手14?沾+丄 1的後方位置,並從打者141朝投 叩万向而配置。 另一方面,第? 7 isi ^ 置7之終端、 圖係顯示第3遊戲例中顯示於終端裝 '' _戲圖像的—例圖。第27圖所示之終端用遊 323330 96 201220109 戲圖像,主要用於第2位遊戟者 戲圖像,係顯示出從屬於第2位之圖像。亦即,終端用遊 142側觀看為第1位遊戲者的,戲者的操作對象之投手 空間。具體而言,上述步驟S27、=對象之打者141之遊戲 置7的姿勢,來控制用以生成 cpu 10係根據終端裝 攝影機。第2虛擬攝影機的用遊戲圖像之第2虛擬 與上述第2遊戲例相同, ,以對應於料裝置7的姿勢之方式來算出。此外,第2As in the first and second game examples described above, the operation of directly dropping the game image is reduced. (4) (4) Quest (3rd game example) Game 27th _3 game example. The third player uses a baseball game in the form of a shouting. That is, the first device 7 operates the operation 5 and the second user uses the terminal to play the game 2: In addition, the television image 2 and the terminal m display the game image of the game operation. The movie image shown in the third game example is displayed on the television of the TV 2. The game player image shown in the figure is mainly used for the i-th and the image. In other words, the game image for the television is a pitcher (pitcher object) of the game space operation target of the hitter (player object) 141 142 belonging to the operation target of the player = 142 142. The first virtual shot of the game image ~ This configuration is placed at the rear position of the fr: 1 > 1 1 hand 14? dip + 丄 1, and is placed from the hitter 141 toward the cast. On the other hand, the first? 7 isi ^ The terminal of the 7th, the figure shows the example of the image displayed in the terminal game in the third game example. The terminal video shown in Fig. 27 323330 96 201220109 The play image is mainly used for the second player's play image, and the image belonging to the second place is displayed. That is, the terminal uses the side of the game 142 to view the pitcher space of the player's operation object as the first player. Specifically, in the above-described step S27, the posture of the game player 7 of the target hitter 141 is controlled to generate a cpu 10 system based on the terminal mounted camera. The second virtual game image for the second virtual camera is calculated in the same manner as the second game example described above, in accordance with the posture of the material device 7. In addition, the second

虛擬攝影機的位置,被固定在預先蚊的預定位置。終端 用遊戲圖像中,係包含用以顯示扮丰 、、 游標143。 丁•又手142投出球的方向之 第1位遊戲者所操控之打纟141㈣作方法及第2位 遊戲者所操控之投手142的操作方法,可為任意方法。例 如’ CPU 1〇係根據控制器5之慣性感測器的輸出資料,偵 ,出對控制II 5所進行之揮舞操作,並因應揮舞操作來進 2打者141揮棒之動作。此外,例如CPU1G依循對類比 中ό!箱3所進彳了之操作來移動游標143,當按下操作鍵54 行=定鍵時,使投手142朝向游標143所指示之位置進 勢來蒋2作。此外’游標143 ’亦可因應終端裝置7的姿 ,以取代對類比搖桿53所進行之操作。 中生:上所述’第3遊戲例中,係在電視2及終端展置7 者而+ ΐ為不同的視點之遊戲圖像,藉此可提供對各遊戲 〇各易觀看且容易操作之遊戲圖像。 此外, 虛擬攝影機 第3遊戲例中,係在單一遊戲空間中設定2個 ,並分別顯示出從各虛擬攝影機觀看遊戲空間 323330 97 201220109 之2種遊戲圖像(第26圖及第27圖)。因此,關於第3遊 戲例中所生成之2種遊戲圖像,對遊戲空間所進行之遊戲 處理(遊戲空間内之物件的控制等)幾乎為共通,僅須對共 通的遊戲空間進行2次描繪處理即能夠生成各遊戲圖像, 因此與分別進行該遊戲處理時相比,具有處理效率高之優 點。 此外,第3遊戲例中,顯示出投球方向之游標143僅 顯示於終端裝置7侧,因此第1位遊戲者無法看到游標143 ® 所指示之位置。因此,不會產生投球方向被第1位遊戲者 得知而對第2位遊戲者不利之遊戲上的缺失。如此,本實 施形態中,當產生某一方的遊戲者看到該遊戲圖像時會對 另一方的遊戲者不利之遊戲上的缺失時,只需將該遊戲圖 像顯示於終端裝置7即可。藉此可防止遊戲的戰略性降低 等缺失。其他實施形態中,因遊戲内容的不同(例如,即使 終端用遊戲圖像被第1位遊戲者看到亦不會產生上述缺失 φ 時),遊戲裝置3亦可將終端用遊戲圖像與電視用遊戲圖像 一同顯示於電視2。 (第4遊戲例) 以下參照第28圖及第29圖來說明第4遊戲例。第4 遊戲例為2位遊戲者彼此合作之形式的射擊遊戲。亦即, 第1位遊戲者使用控制器5來進行移動飛機之操作,第2 位遊戲者使用終端裝置7來進行控制飛機的大砲的發射方 向之操作。第4遊戲例中亦與第3遊戲例相同,於電視2 及終端裝置7係顯示出對各遊戲者而言容易進行遊戲操作 98 323330 201220109 之遊戲圖像。 第28圖係顯示第4遊戲例中顯示於電視2之電視用蠖 戲圖像的一例圖。此外,第29圖係顯示第4遊戲例中顯示 於終4裝_置7之終端用遊戲圖像的一例之圖。如第28圖戶斤 不,第4遊戲例中,飛機(飛機物件)丨η與標靶(氣球物件) I53於虛擬遊戲空間中登場。此外,飛機151具有大砲(大 砲物件)152。The position of the virtual camera is fixed at a predetermined position of the pre-mosquito. The game image for the terminal includes a cursor 143 for displaying the game. Ding • The direction in which the hand 142 throws the ball is the first player to control the snoring 141 (four) and the second player to control the pitcher 142. For example, the CPU 1 detects the swinging operation of the control II 5 according to the output data of the inertial sensor of the controller 5, and responds to the swinging operation to enter the 2 player 141 swing action. In addition, for example, the CPU 1G moves the cursor 143 according to the operation of the analogy box 3, and when the operation key 54 is pressed = the button is pressed, the pitcher 142 is brought toward the position indicated by the cursor 143. Work. Further, the 'cursor 143' may be adapted to the operation of the analog rocker 53 in response to the posture of the terminal device 7. Zhongsheng: In the above-mentioned '3rd game example, the game image is displayed on the TV 2 and the terminal and the + is a different viewpoint, thereby providing easy viewing and easy operation for each game. Game image. Further, in the third game example of the virtual camera, two game spaces are set in a single game space, and two kinds of game images (Fig. 26 and Fig. 27) of the game space 323330 97 201220109 are displayed from the respective virtual cameras. Therefore, regarding the two types of game images generated in the third game example, the game processing (the control of the objects in the game space, etc.) performed in the game space is almost common, and only the common game space has to be drawn twice. The processing can generate each game image, and therefore has an advantage of higher processing efficiency than when the game processing is performed separately. Further, in the third game example, since the cursor 143 indicating the pitching direction is displayed only on the terminal device 7, the first player cannot see the position indicated by the cursor 143 ® . Therefore, there is no loss in the game in which the pitching direction is known by the first player and which is disadvantageous to the second player. 
In this way, in the present embodiment, when displaying a certain game image to one player would put the other player at a disadvantage, it suffices to display that game image only on the terminal device 7. This prevents problems such as a loss of strategic depth in the game. In other embodiments, depending on the content of the game (for example, when no such drawback arises even if the terminal game image is seen by the first player), the game device 3 may display the terminal game image on the television 2 together with the television game image. (Fourth game example) The fourth game example will be described below with reference to Figs. 28 and 29. The fourth game example is a shooting game in which two players cooperate: the first player uses the controller 5 to move an airplane, and the second player uses the terminal device 7 to control the firing direction of the airplane's cannon. As in the third game example, game images that are easy for each player to operate with are displayed on the television 2 and the terminal device 7. Fig. 28 shows an example of the television game image displayed on the television 2 in the fourth game example, and Fig. 29 shows an example of the terminal game image displayed on the terminal device 7 in the fourth game example. As shown in Fig. 28, in the fourth game example an airplane (airplane object) 151 and targets (balloon objects) 153 appear in the virtual game space, and the airplane 151 has a cannon (cannon object) 152.

一如第找圖所示,包含飛機151之遊戲空間的圖像係_ 不作為電遊戲圖像。用以生成f視用遊_像之第1 虛擬攝影機,係以生成從後方觀看飛機151之遊戲空間的 =之方式來設定。亦即,帛1虛擬攝影機係以飛機叫 破^含於攝影範圍(視野範圍)之姿勢來配置在飛機ι5ΐ的 =方位置。此外,第1虛擬攝影機被控制為伴隨著飛機151 :動而移動。亦即,CPU 1Q在上述步驟奶的處理中, 艮,制器操作資料來控制飛機151的移動,並且控制第 影機的位置及姿勢。如此…虛擬攝影歸 置及姿勢仙應第1位遊戲者的操作而被控制。 為女t 1广面’如第29圖所* ’從飛機151(更具體而言 看之顿⑼的圖像侧示作為終端用遊 i係两二此’用以生成終端用遊戲圖像之第2虛擬攝影 置=謂的位置(更具體大砲 置)。CPU 10在上述步驟於 資料來控制飛機⑸的移動7處理中’根據控制器操作 位置。第2虔擬攝影機亦=控制第2虚擬攝影機的 J配置在飛機151或大砲152周 323330 99 201220109 邊的位置(例如大砲152稍微後方的位置)。如上所述 2虛擬攝影機的位置’是由(操作飛機151的移動之 位遊戲者的操作來控制。因此,第4遊戲例中 ^ 1 攝影機及第2虛擬攝影機係連動地移動。 虛擬 此外’朝大砲152的發射方向觀看之遊戲空間 係顯示作為終端用遊戲圖冑❶在此,大砲152的發 象 係控制為對應於終«置7的姿勢。亦即,本實施形_ ’ 第2虛_影制錢,係_為使第2虛機中 線方向與大跑152的發射方向-致。麵在上述步= 中’因應上述步驟%4_算出〜料 = 勢,^控制大砲152的朝向及第2虛擬攝影機的姿勢^ 此’第2虛擬攝影機的姿勢是藉由第2位遊戲者的操作來 控制。此外’帛2位軸者藉由改變終端裝置7的姿勢 可改變大砲152的發射方向。 町安勢 當從大砲152發射砲彈時,第2位遊戲者壓下終端穿 置7的預定鍵。壓下@ # μ * ^ 了貝疋鍵&下預疋鍵時,砲祕Μ 152_ 發射。終端用遊戲圖像中,於⑽51的晝面中央顯示出準 星154’砲彈往準星154所指示的方向發射。 如上所述,第4遊戲例中,第α遊戲者,主要一邊 觀看顯示出往飛機151的行進方向觀看之遊戲空間的電視 用遊戲圖像(第28圖)’(例如以往期望躲153的方向移 動之方式)一邊操作飛機⑸。另一方面,帛2位遊戲者, 主要一邊觀看顯不出往大砲152的發射方向觀看之遊戲空 間的終端用遊戲圖像(第29圖)’ 一邊操作大砲152。如此, 323330 100 201220109 第4遊戲例中,在2位 可分別將對各遊戲者而言容作之形式的遊戲中, 像,分別顯示於電視2及終端裝置看?且容錢作之遊戲圖 此外,第4遊戲例中,藉 制第1虛_影機及$ 2虛 ^賴者的操作來控 施形態中,因影機的姿勢。亦即,本實 的位置或姿勢產生 =戲操作而使虛擬攝影機 戲空間的顯示範圍產生㈣果示於各顯示裝置之遊 空間的顯示範圍因應二=示於顯示裝置之遊戲 各遊戲者可實地感受到自㈣而產生變化’因此, 的進行中,而能夠充分地享受作充分地反映於遊戲 所翻例中,係於電視2顯示出從飛機151的後方 戶觀看之遊戲时,於終端裝置7顯Μ從飛機151之大 ㈣看之遊_像。在此,其他遊戲例中,遊戲 、亦可於終端裝置7顯示出從飛機151的後方所觀看 之遊戲圖像,於電視2顯示出從飛機151之大袍152的位 置所觀看之遊戲圖像。此時’各遊戲者的卫作,與上述第 4遊戲例替換,可设定為第1位遊戲者使用控制器5來進 行大砲152的操作’第2位遊戲者使用終端裝置7來進行 飛機151的操作。 (第5遊戲例) 以下參照第30圖來說明第5遊戲例。第5遊戲例,為 遊戲者使用控制ϋ 5來進行操作之遊戲,終端裝置7並非 101 323330 201220109 操:裝置,、而是用作為顯示褒置。具體而言,帛5遊戲例 為同爾夫球賴’因應遊戲者將控制器5如高球桿般地揮 舞之操作(揮#操作),遊戲裝置3在虛擬遊戲空間中的遊 戲者,色巾,進行高爾夫球的揮桿動作。As shown in the first figure, the image of the game space containing the aircraft 151 is not used as an electric game image. The first virtual camera for generating the video view is set such that the game space of the aircraft 151 is viewed from the rear. That is, the 帛1 virtual camera is placed at the = square position of the airplane 以5ΐ in a posture in which the aircraft is called photographic range (field of view). Further, the first virtual camera is controlled to move along with the movement of the aircraft 151. That is, the CPU 1Q, in the processing of the above-described step milk, controls the movement of the aircraft 151 and controls the position and posture of the first camera. Thus, the virtual photography placement and posture are controlled by the operation of the first player. For the female t 1 wide face 'as shown in Fig. 29' 'from the aircraft 151 (more specifically, the image of the image (9) is shown as the terminal for the game, the two are used to generate the terminal game image. The second virtual photography is set to the position (more specific cannon). The CPU 10 controls the movement of the aircraft (5) in the above-mentioned steps in the process of controlling the movement of the aircraft (5) according to the controller operation position. The second virtual camera also controls the second virtual The J of the camera is placed at the side of the aircraft 151 or the cannon 152 weeks 323330 99 201220109 (for example, the position slightly behind the cannon 152). As described above, the position of the 2 virtual camera is determined by the operation of the player who operates the aircraft 151. Therefore, in the fourth game example, the camera 1 and the second virtual camera move in conjunction with each other. 
An image of the game space viewed in the firing direction of the cannon 152 is displayed as the terminal game image, and the firing direction of the cannon 152 is controlled so as to correspond to the attitude of the terminal device 7. That is, in the present embodiment, the attitude of the second virtual camera is controlled so that its line-of-sight direction coincides with the firing direction of the cannon 152. In the processing of step S27, the CPU 10 controls the orientation of the cannon 152 and the attitude of the second virtual camera in accordance with the attitude of the terminal device 7 calculated in step S24. In this way, the attitude of the second virtual camera is controlled by the operations of the second player, who can change the firing direction of the cannon 152 by changing the attitude of the terminal device 7. To fire a shell from the cannon 152, the second player presses a predetermined button of the terminal device 7; when the predetermined button is pressed, a shell 132 is fired from the cannon 152. In the terminal game image, a sight 154 is displayed at the center of the screen of the LCD 51, and the shell is fired in the direction indicated by the sight 154. As described above, in the fourth game example, the first player operates the airplane 151 (for example, moving it toward a desired target 153) while mainly viewing the television game image (Fig. 28), which shows the game space in the traveling direction of the airplane 151, and the second player operates the cannon 152 while mainly viewing the terminal game image (Fig. 29), which shows the game space in the firing direction of the cannon 152. Thus, in a game in which two players cooperate, game images that are easy for each player to view and to operate with can be displayed on the television 2 and the terminal device 7 respectively. Furthermore, in the fourth game example, the positions and attitudes of the first and second virtual cameras are controlled by the players' operations; that is, in the present embodiment, the display range of the game space shown on each display device changes as a result of game operations that change the positions or attitudes of the virtual cameras. Because the display range changes in response to each player's own operations, each player can feel that player's operations being reflected in the progress of the game, and can fully enjoy playing. In the fourth game example described above, the television 2 displays an image of the game space viewed from behind the airplane 151 and the terminal device 7 displays an image viewed from the position of the cannon 152 of the airplane 151. In another game example, the terminal device 7 may instead display the image viewed from behind the airplane 151 and the television 2 the image viewed from the position of the cannon 152; in that case, the players' roles are swapped relative to the fourth game example, so that the first player uses the controller 5 to operate the cannon 152 and the second player uses the terminal device 7 to operate the airplane 151. (Fifth game example) The fifth game example will be described below with reference to Fig. 30. In the fifth game example, the player plays using the controller 5, and the terminal device 7 is used not as an operation device but as a display device. Specifically, the fifth game example is a golf game: in response to the player swinging the controller 5 like a golf club (a swing operation), the game device 3 causes a player character in the virtual game space to perform a golf swing.

★第30圖係顯不第5遊戲例中使用遊戲系統上之模樣 圖。第3G圖中,於電視2的晝面顯示出包含遊戲者角色(的 物件)161及高球桿(的物件)162之遊戲空間的圖像。第 圖中,雖隱藏於高球桿162而未被顯示 ,但配置在遊戲空 間之球(的物件)163亦顯示於電視2。另〆方面,如第如 圖所示’終職置7係以使咖51的晝面#絲上之方式 配置在電視2之前方正面的地板上。於終端裝置7顯示出: :具=球163之圖像、及顯示高球桿162的〆部分(具體而言 為同球杯162的桿頭162a)之圖像、顯示遊戲空間的地面 之圖像、終端用遊戲圖像為從上方觀看球的周圍之圖像。 …遊戲時’遊戲者160站在終端裝置7的附近,來 ^丁將控制11 5如高料般地揮舞操作。此時,CPU 10, =2 Γ:處理中’因應上述步謂的處理所算 位#的文勢’來控制遊戲空間中之高球桿162的 方勹(第3圓。具體而言,高球桿162,當控制器5的前端 ⑽向)朝向於LCD5i所顯示之球 163 0 寺係控制為遊戲空間内的高球桿162擊出球 323330 102 201220109 164(參照第30圖)。關於終端用遊戲圖像,為了增加臨場 感’能夠以實物大來顯示球163的圖像,或是顯示為因應 控制器5繞著z軸的旋轉使桿頭圖像164的朝向旋轉。此 外’終端用遊戲圖像’亦可使用設置在遊戲空間之虛擬攝 影機來生成’或是使用預先準備的圖像資料來生成。使用 預先準備的圖像資料來生成時,不需詳細建構高爾夫球場 的地形模型,而能夠以較小處理負荷來生成詳細且真實的 圖像。★ Figure 30 shows the pattern of the game system used in the fifth game example. In Fig. 3G, an image of the game space including the player character (object) 161 and the high club (object) 162 is displayed on the face of the television 2. In the figure, although hidden in the high club 162 and not displayed, the ball (object) 163 placed in the game space is also displayed on the television 2. On the other hand, as shown in the figure, the final position 7 is placed on the floor on the front side of the TV 2 in such a manner as to make the face of the coffee 51. The terminal device 7 displays: an image of the ball 163, an image showing the 〆 portion of the high club 162 (specifically, the head 162a of the same cup 162), and a map showing the ground of the game space. The image for the image and the terminal is an image of the surroundings of the ball viewed from above. When the game is played, the player 160 stands in the vicinity of the terminal device 7, and the control device 11 is swiped as if it were high. At this time, the CPU 10, =2 Γ: in the process of controlling the position of the high club 162 in the game space in accordance with the "texture of the bit # calculated by the processing of the above-mentioned step" (third circle. Specifically, high The club 162, when the front end (10) of the controller 5 is directed toward the ball 163 0 displayed by the LCD 5i, controls the high club 162 in the game space to hit the ball 323330 102 201220109 164 (refer to Fig. 30). Regarding the game image for the terminal, in order to increase the sense of presence, the image of the ball 163 can be displayed in a large size, or the orientation of the head image 164 can be rotated in response to the rotation of the controller 5 about the z-axis. Further, the "terminal game image" can be generated using a virtual camera installed in the game space or generated using image materials prepared in advance. When the image data prepared in advance is used for generation, it is not necessary to construct the terrain model of the golf course in detail, and it is possible to generate a detailed and realistic image with a small processing load.

藉由遊戲者160進行上述揮桿操作來揮舞高球桿 162,結果當高球桿162擊出球163時,球163會移動(飛 出)。亦即,cpu ίο在上述步驟S27中判定高球桿162與 球163疋否接觸,接觸時則將球163移動。在此,電視用 遊戲圖像係以包含有移動後的球163之方式來生成。亦 即’ CPU 1〇係以使移動的球包含於該攝影範圍之方式,來 控制用以^電視料戲®像之第1虛_影機的位置及 姿勢另—方面,終端裝置7中當高球桿162擊出球163 時球163的圖像移動並立即消失於晝面外。因此,第5 遊戲例中’球移動的模樣主要顯示於電視2 ’遊戲者16〇 可藉由電視料戲圖縣確顧揮桿操作所㈣之球 向。 制零如Λ^Ι’Λ5賴例巾,遊戲者16Q可藉由揮舞控 =5來揮舞咖162(使遊戲者角色i6i揮舞高球桿 η二此’第5遊戲例中,當控制器5的前端方向朝向 在1所顯示之球163的圖像時,係控制為遊戲空間内 323330 103 201220109 的咼球桿162擊出球163。因此,遊戲者可藉由揮桿操作 而得到實際揮出高球桿之感覺,而能夠使揮桿操作更具臨 場感。The golfer 162 is swung by the player 160 performing the above-described swing operation, and as a result, when the high club 162 hits the ball 163, the ball 163 moves (flies out). That is, cpu ίο determines whether the high club 162 is in contact with the ball 163 in the above step S27, and moves the ball 163 in contact. Here, the television game image is generated in such a manner as to include the moved ball 163. That is, the CPU 1 is used to control the position and posture of the first virtual camera used for the video game to make the moving ball included in the shooting range, and the terminal device 7 When the high club 162 hits the ball 163, the image of the ball 163 moves and immediately disappears outside the plane. Therefore, in the fifth game example, the appearance of the 'ball movement is mainly displayed on the TV 2' player 16 〇 can be determined by the TV screen county to take care of the swing (4). If the system is zero, the player 16Q can wave the coffee 162 by waving control = 5 (making the player character i6i swing the high club η two this) in the fifth game example, when the controller 5 When the front end direction is toward the image of the ball 163 displayed, it is controlled that the croquet club 162 of the 323330 103 201220109 in the game space hits the ball 163. Therefore, the player can actually swing out by the swing operation. The feeling of a high club can make the swing operation more realistic.

第5遊戲例中,當控制器5的前端方向朝向終端裝置 7時更在LCD 51顯示桿頭圖像。因此,遊戲者可藉 由將控制器5的前端方向朝向終端裝置7,而得到虛擬空 間中之高球桿162的姿勢與實際空間中之控制器5的姿 相對應之感覺,而能夠使揮桿操作更具臨場感。 =上所述,第5遊戲例,當將終端裝置?用作為顯示 藉由適當地配ι终料置7的位置,可讓使用控 制器5之操作更具臨場感。 料’上述第_5她种,終端裝置7配置在地面, 傻7顯7^出僅顯示球163周邊的遊戲空間之圖 ^因此’於終端裝置7中,無法顯示遊戲空間中之 的位置及姿勢,此外,於終端裝置 =細桑作後球163移動的模樣。因此,第5遊戲例 ’ =163的移動前,係於電視2顯示全體高球桿脱, 在球163的移動後,於電視?翻一 此,㈣笛球163移動的模樣。如 根據第5遊戲例’可將更摔 者,並且可使用電視2及終端裝晋呆作&供給遊戲 看的遊戲®像提*於遊戲者。、、、目4面將容易觀 此外,上述第5遊戲例中,為了算出控 係使用終端m㈣示部5 的姿勢’ si的初始處理中將標示部55二,^ .·,燈(標不裝置6未點燈), 323330 104 201220109 ,且CPU 10在上述步驟S23十根據標示器座標資料9 f出控制器5的姿勢。根據此,可正確地判定控制器5的 月,J端方向是否為朝向標示部55之姿勢。上述第5遊戲例 中,雖可不執行上述步驟S21及S22,但在其他遊戲例中, 亦可藉由執行上述步驟S21 & S22的處理而在遊戲中途變 更應予點燈之標示器。例如,咖1〇在步驟切中,根 =加速度資料94來判定控制器5的前端方向是否為朝向 力方向,在步驟S22巾,係控制為當朝 ::部上5點燈,未朝向重力方向時將標示裝置6二= 據此’當控制器5的前端方向朝向重力方向時,可夢由取 ^示f5的標示器座標資料而精度佳地算出控制器5的 尸1署且控制器5的前端方向朝向電視2時,藉由取得 ;;裝置6的標示器座標資料而精度佳地算出控制器5的 ”所說明,遊戲系統1可將終端裝置 -置在自由位置並應用作為顯示裝 示器座標資料用作為遊戲輪 爆此田將心 視Z 除了將控制器5朝向電 視2來使用之外,亦可藉 * 而使控制器5朝向自由方向來=^7汉疋在期望位置 態,可使用^ έ 使用。亦即,根據本實施形 〜了使㈣.5之朝向並未受 器5的操作自由度。 所以Tk升控制 [7.遊戲系統的其他動作例] 上述遊戲系統1,可進杆卜 遊戲之動作。終端裝置7亦可用之用以進行各種 J用作為可搬運型的顯示器或 323330 105 201220109 第2顯示器,间主 輸入之㈣:。f亦可料為麟觸控輸人或依據動作之 遊戲。料康上述遊戲系統卜可實施各式各樣的 (遊戲者僅=戲以外之用途’亦可進行下列動作。 本實_ ^ ^終端裝置7來玩遊戲之動作例) 操作褒置:功夕:裝置7具有顯示裝置的功能’亦具有 端裝置7用作1此’不使用電視及控制11 5而僅將终 式遊戲〜Μ ^ 依循第29 〇 _ 在步驟S3中物Γ所示之遊戲處理來具體地說明,咖10 S4中僅將終端操7取得終端操作資料97,在步驟 操作資料97用作為遊戲輸入(不使用控制器 像,在步驟處理。然後在步驟S6中生成遊戲圖 可不執行步驟S2::遊戲圖像傳送至糧置7。此時, 端農置、n、及沾°根據上述内容,係因應對終 處理結果之遊乍來進行遊戲處理,並將顯示出遊戲In the fifth game example, the head image is displayed on the LCD 51 when the front end direction of the controller 5 faces the terminal device 7. Therefore, the player can obtain the feeling that the posture of the high club 162 in the virtual space corresponds to the posture of the controller 5 in the real space by directing the front end direction of the controller 5 toward the terminal device 7, and can make the wave The pole operation is more realistic. = On the above, the fifth game example, when will the terminal device? As a display, by properly arranging the position of the final material 7, the operation of the controller 5 can be made more realistic. The above-mentioned _5th kind, the terminal device 7 is placed on the ground, and the figure 7 shows only the game space around the ball 163. Therefore, in the terminal device 7, the position in the game space cannot be displayed. Posture, in addition, in the terminal device = fine mulberry after the ball 163 moves. Therefore, before the fifth game example ’=163 moves, the TV 2 displays the entire golf club off, after the movement of the ball 163, on the TV? Turn this one, (4) the appearance of the flute ball 163. For example, according to the fifth game example, a more smasher can be used, and the game 2 can be used for the game to be played by the TV 2 and the terminal. In addition, in the fifth game example, in the fifth game example, in order to calculate the posture of the control system using the terminal m (4), the display portion 55 is in the initial processing of the display unit 55, and the lamp is not marked. The device 6 is not lit), 323330 104 201220109, and the CPU 10 exits the posture of the controller 5 according to the marker coordinate data 9f at the above step S23. According to this, it is possible to accurately determine whether or not the month of the controller 5 is in the posture toward the indicator portion 55. In the fifth game example described above, the above steps S21 and S22 may not be executed. 
In other game examples, however, the marker to be lit may be switched during the game by executing the processing of steps S21 and S22. For example, in step S21 the CPU 10 may determine, based on the acceleration data 94, whether the front end direction of the controller 5 points in the direction of gravity, and in step S22 control may be performed so that the marker section 55 is lit when it points in the direction of gravity and the marker device 6 is lit when it does not. Then, when the front end direction of the controller 5 points in the direction of gravity, the attitude of the controller 5 can be calculated accurately by acquiring the marker coordinate data derived from the marker section 55, and when the front end direction of the controller 5 points toward the television 2, the attitude of the controller 5 can be calculated accurately by acquiring the marker coordinate data derived from the marker device 6. As described in connection with the fifth game example, the game system 1 can place the terminal device 7 at any position and use it as a display device. When the marker coordinate data is used as a game input, the controller 5 can then be used facing not only the television 2 but any desired direction, by setting the terminal device 7 at a desired position. That is, according to the present embodiment, the orientation in which the controller 5 can be used is not restricted, so the freedom of operation of the controller 5 is improved. [7. Other operation examples of the game system] The game system 1 described above can also perform operations for various applications other than playing games. The terminal device 7 can be used as a portable display or as a second display, and as an input device for touch input or motion-based input, so the game system 1 can be used for a wide variety of purposes besides games. Among others, the following operations are possible. (Operation example in which a game is played using only the terminal device 7) As described above, the terminal device 7 functions as a display device and also as an operation device. Therefore, the terminal device 7 can be used like a portable game device, without using the television 2 and the controller 5. Described concretely in terms of the game processing described above, the CPU 10 acquires only the terminal operation data 97 in step S3, and in step S4 performs game processing using only the terminal operation data 97 as the game input (without using the controller operation data). A game image is then generated in step S6 and transmitted to the terminal device 7 in step S10 (step S2 need not be executed in this case, and the same applies to the processing for outputting to the television 2). In this way, game processing is performed in response to operations on the terminal device 7, and the game image showing the processing result is displayed on the terminal device 7; although the game processing itself is executed by the game device 3, the terminal device 7 can thus be used in much the same way as a portable game device.
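The terminal-only operation just described amounts to the earlier loop with the television-related and controller-related steps skipped. The following minimal sketch mirrors the earlier loop sketch; its names and stub bodies are assumptions for illustration only.

struct GameState {};
struct Image {};
struct Sound {};

void  AcquireTerminalOperationData(GameState&) {}                   // step S3 only
void  RunGameControlProcessing(GameState&) {}                       // step S4
Image RenderTerminalImage(const GameState&) { return {}; }          // step S6
Sound MakeTerminalSound(const GameState&) { return {}; }            // step S8
void  CompressAndTransmitToTerminal(const Image&, const Sound&) {}  // step S10
bool  ShouldEndGame(const GameState&) { return true; }

void RunTerminalOnlyLoop(GameState& state) {
    do {
        // Steps S2, S5, and S9 (controller input and television output) are skipped.
        AcquireTerminalOperationData(state);
        RunGameControlProcessing(state);
        CompressAndTransmitToTerminal(RenderTerminalImage(state),
                                      MakeTerminalSound(state));
    } while (!ShouldEndGame(state));
}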

用作為以行遊戲處理,但)亦可將W 有人估田 戲裝置。因此,根據本實施形態,郎# 無法將遊= 列如其:人正在收看電視播放)等理由而 置C 2時’使用者亦可使用終端裝 上述tLG並不限於遊姻像,對於開啟電源後所顯示之 據此,遊^’亦可將圖像傳送至終端裝置7來顯示。根 戟者可從最初即不需使用電視2進行遊戲,故極 323330 106 201220109 為便利。 再者’上述中,亦 之顯示裝置從終端裝 在錢中途,將顯示出遊戲圖像 可更執行上述步驟s 變更為電視2。具體而言,CPU 10 中被輸出至電视2並將遊戲圖像輸出至電視2。步驟S9 裝置7之遊戲圖像::象’係與步驟S10中被傳送至終端 3的輸入之方。根據此,以顯示出來自遊戲裝置 同之遊戲圖像顯示於!^2的輪入,可將與終端裝置7相 顯示裝置變更為電电* 2,因此可將顯示出遊戲圖像之 可關閉終端裝晉7 ^ 2。在遊戲圖像被顯示於電視2後, 遊戲系統1 顯示。 置6、標示部55/構成為從紅外線輸出手段(標示裝 視2所射出的紅=外線通讯模組82)中,可輸出對電 由因應對終端裝置j控矾號。根據此,遊戲裝置3,藉 出上述紅外線遙控:進行的操作而從紅外線輸出手段輸 用者不需操作電视i,可對電視2進行操作。此時,使 作電視2,因此如"的遙控器而能夠使用終端裝置7來操 因此如切細換電视2的輸人時 (經由網路與其他裝置進行通訊之動作例) 如上所述,由於遊戲裝置3具有連接於 所以遊戲系統i亦可應用在經由網路與外部壯之功能, 之情形。第3!圖係顯示經由網路與外部=置進行通訊 系統1尹所包含之各裝置的連接關係之圖。、接k之遊戲 禾,遊戲«置3可經由網路〗90與外部裝置如第&圖所 如上所述,當外部裝置191與遊裝H連接。 罝3可進行通訊 323330 201220109 時,遊戲系統1中,可將終端裝置7作為介面並在與外部 裝置191之間進行通訊。例如,藉由在外部裝置191與終 端裝置7之間進行圖像及聲音的接收傳送,可將遊戲系統 1用作為電視電話。具體而言,遊戲裝置3經由網路190 接收來自外部裝置191的圖像及聲音(電話對方的圖像及 聲音),並將接收到的圖像及聲音傳送至終端裝置7。藉此, 終端裝置7可將來自外部裝置191的圖像顯示於LCD 51, 並且從喇叭77輸出來自外部裝置191的聲音。此外,遊戲 ® 裝置3從終端裝置7接收攝影機56所攝像之攝影機圖像、 及麥克風79所偵測之麥克風聲音,並經由網路190將攝影 機圖像及麥克風聲音傳送至外部裝置191。遊戲裝置3,藉 由在與外部裝置191之間重複進行上述圖像及聲音的接收 傳送,可將遊戲系統】.用作為電視電話。 本實施形態中,由於終端裝置7為可搬運型,所以使 用者可在自由位置上使用終端裝置7,並使攝影機56朝向 φ 自由方向。此外,本實施形態中,由於終端裝置7具備觸 控面板52,所以遊戲裝置3亦可將對觸控面板52所輸入 的輸入資訊(觸控位置資料100)傳送至外部裝置191。例 如,當藉由終端裝置7將來自外部裝置191的圖像及聲音 輸出,並且將使用者書寫於觸控面板52上之文字等傳送至 外部裝置191時,亦可將遊戲系統1用作為所謂e學習系 統(電子化學習系統)。 (與電視播放連動之動作例) 此外,遊戲系統1,當以電視2來觀賞電視播放時, 108 323330 201220109 亦能夠與電視播放連動而動作。亦即,遊戲系統l,當以 電視2來觀賞電視節目時,可將與該電視節目相關之資訊 等輸出至終端裝置7。以下說明遊戲系統1與電視播放連 動而動作時之動作例。 上述動作例中,遊戲裝置3可經由網路與伺服器進行 通訊(換言之,第31圖所示之外部裝置191為伺服器)。伺 服器係對電視播放的每個頻道記憶與電視播放相關的各種 資訊(電視資訊)。該電視資訊,可為字幕或演出者資訊等 _ 之與節目相關之資訊、或是EPG(電子化節目表)的資訊、 或作為資料播放而被播放之資訊。此外,電視資訊,可為 圖像、聲音、文字、或此等的組合之資訊。此外,伺服器 不一定需為1個,可對電視播放的每個頻道或每個節目設 置伺服器,遊戲裝置3亦可與各伺服器進行通訊。 當在電視2中輸出電視播放的影像及聲音時,遊戲裝 置3係令使用者使用終端裝置7將觀賞中之電視播放的頻 φ 道輸入。然後,經由網路要求伺服器傳送對應於所輸入的 頻道之電視資訊。因應於此,伺服器傳送對應於上述頻道 之電視資訊的資料。當接收到從伺服器傳送來之資料時, 遊戲裝置3將所接收之資料輸出至終端裝置7。終端裝置7 將上述資料中的圖像及文字資料顯示於LCD 51。並從喇叭 輸出聲音資料。藉由上述方式,使用者可使用終端裝置7 來享受與目前觀賞中的電視節目相關之資訊等。 如上所述,遊戲系統1係經由網路與外部裝置(伺服器) 進行通訊,藉此,亦可藉由終端裝置7將與電視播放連動 109 323330 201220109 之資訊提供至使用者。尤其在本實施形態中,由於終端裝 置7為可搬運型,所以使用者可在自由位置上使用終端裝 置7,其便利性高。 如以上所述,本實施形態中,除了使用於遊戲之外, 使用者亦能夠以各種用途及形態來使用終端裝置7。 [8.變形例] 上述實施形態為用以實施本發明之一例,其他實施形 態中,例如亦可在以下所說明之構成中實施本發明。 (具有複數個終端裝置之變形例) 上述實施形態中,遊戲系統1構成為傻具有1個終端 裝置,但遊戲系統1亦可構成為具有複數個終端裝置。亦 即,遊戲裝置3可分別與複數個終端裝置進行無線通訊, 可將遊戲圖像的資料、遊戲聲音的資料與控制資料傳送至 各終端裝置,並且從各終端裝置接收操作資料與攝影機圖 像資料與麥克風聲音資料。遊戲裝置3係與複數個終端裝 置的各個進行無線通訊,此時,遊戲裝置3可以時間分割 方式來進行與各終端裝置之無線通訊,或是分割頻率波段 來進行。 如上所述具有複數個終端裝置時,可使用遊戲系統來 進行更多種類的遊戲。例如,當遊戲系統1具有2個終端 裝置時,由於遊戲系統1具有3個顯示裝置,所以可生成 分別用於3位遊戲者的遊戲圖像並顯示於各顯示裝置。此 外,當遊戲系統1具有2個終端裝置時,在將控制器與終 端裝置作為1組來使用之遊戲(例如上述第5遊戲例)中,2 110 323330 201220109 位遊戲者可同時進行遊戲。再者,當根據從2個控制器所 輸出之標示器座標資料來進行上述步驟S27的遊戲處理 時,2位遊戲者可分別使控制器朝向標示器(標示裝置6或 標示部55)來進行遊戲操作。亦即,某一方的遊戲者使控 制器朝向標示裝置6來進行遊戲操作,另一方的遊戲者使 控制器朝向標示部55來進行遊戲操作。 (關於終端裝置的功能之變形例) 上述實施形態中,終端裝置7具有不執行遊戲處理之 ® 所謂精簡型終端的功能。在此,其他實施形態中,亦可藉 由終端裝置7等的其他裝置,來執行上述實施形態中由遊 戲裝置3所執行之一連串遊戲處理中的一部分處理。例 如,由終端裝置7來執行一部分處理(例如終端用遊戲圖像 的生成處理)。亦即,終端裝置亦可根據對操作部的操作進 行遊戲處理,且依據遊戲處理生成遊戲圖像並顯示於顯示 部之具有作為可攜型遊戲裝置的功能者。此外,例如在具 φ 有可相互進行通訊之複數個資訊處理裝置(遊戲裝置)之遊 戲系統中,該複數個資訊處理裝置可分擔執行遊戲裝置。 (關於終端裝置的構成之變形例) 上述實施形態中的終端裝置為一例,終端裝置7的各 操作鍵或外罩50的形狀,或是各構成要素的數目及設置位 置等僅僅為一例,亦可為其他形狀、數目及設置位置。例 如,終端裝置可為以下所示之構成。以下係參照第32圖至 第35圖,說明終端裝置的變形例。 第32圖係顯示上述實施形態的變形例之終端裝置的 111 323330 201220109 外觀構成之圖。第32圖中的⑷圖為終端裝置的前視圖, ⑹圖為俯視圖,(c)圖為右侧視圖,⑷圖為仰視圖。此外, 第33圖係顯示使用者握持終端裝置之模樣之圖。第犯圖 及第33圖中,關於對應於上述實施形態之終端裝置7的構 成要素之構成要素,係附加與第8圖相同之參照圖號,但 不一定須以同一者來構成。 如第32圖所示’終端裝置8具備大致為橫向較長之長 _转板狀形狀的外罩5〇。外罩5G為使用者所能夠握持之 程度的大小’因此,使用者能夠握持終端裝置8來移動, 或是改變終端裝置.8的配置位置。 終端裝置8於外罩50的表面具有LCD 51。LCD 51設 置在外罩50表面的十央附近。因此,如第9圖所示,使用 者藉由握持LCD 51兩側部分的外罩5〇,可一邊觀看LCD51 
的畫面-邊握持終端裝置來移動。第9圖中,係顯示使用 者握持LCD 51左右兩側的部分之外罩5〇,而以橫握方式(橫 籲向較長的朝向)握持終端裝置8之例子,但亦能夠以縱握方 式(縱向較長的朝向)握持終端裝置8。 如第32圖的(a)圖所示,終端裝置8於ICD51的晝面 具有觸控面板52作為操作手段(操作部)。本變形例中,觸 面板52為電阻膜方式的觸控面板。惟觸控面板並不限於 電阻膜方式,例如可使用例如靜電電容方式等之任意方式 的觸控面板。此外.,觸控面板52可為單點觸控方式或多點 觸控方式。本變形例中,觸控面板52係應用與LCD 51的 解析度為相同解析度(偵測精度)者。惟觸控面板52的解析 323330 112 201220109 度並不定/頁與LCD 51的解析度一致。對觸控面板52之 輸入,通承用觸控筆來進行,但並不限於觸控筆,亦能夠 則吏用者的手指對觸控面板52進行輸人。外罩50上,可 "又置有用以收納用來對觸控面板52進行操作之觸控筆之 收納孔。如此,由於終端裝置8具有觸控商板52,所以使 用者可一邊移動終端裝置8 -邊操作觸控面板52 。亦即, 使用者可it移動Lcd 51的晝面,一邊對該晝面直接(藉 • 由觸控面板52)進行輪入。 如第32圖所不,終端裝置8具備2個類比搖桿53A及 ,、以及複數個操作鍵54A至54L作為操作手段(操作 7 )。各類比搖桿53A及53B為指示方向之裝置。各類比搖 2 53A及53B,係構成為可使由使用者的手指所操作之搖 目對於外罩5G的表面往任意方向(上下左右及斜向的 任思、角度)滑動或傾倒。此外,左類比搖桿53A及右類比搖 才干53B分別設置在LCD 51畫面的左侧及右側。因此,使用 鲁者可藉由左右任-手使用類比搖桿來進行指示方向之輸 入。此外,如第33圖所示,各類比搖桿53A及53B設置在 吏用者可於握持終端裝置8的左右部分之狀態下進行操作 之位置上,因此,即使使用者握持終端裝置8來移動時, '亦$容易操作各類比搖桿53A及53B。 ^各操作鍵54Α至54L為用以進行預定輸入之操作手 1又。如以下所示,各操作鍵54Α至54L係設置在使用者可 ^握持終端裝置S的左右部分之狀態τ進行操作之位置上 圖)。因此,即使使用者握持終端裝置8來移動 323330 113 201220109 時,亦可容易操作此等操作手段。 。如第32圖的(a)圖所示,於外罩5〇的表面,設置有各 &作鍵54A至54L中之十字鍵(方向輸入鍵)54A及鍵54β 至54H。亦即’此等鍵54A至54G係配置在使用者的拇指 所能夠操作之位置上(參照第33圖)。 十子鍵54A係設置在LCD 51的左側且在左類比搖桿 53A的下侧。亦即,十字鍵54A配置在使用者的左手所能 夠操作之位置上。十字鍵54A具有十字形狀,為可指示上 下左右的方向之鍵。此外’鍵54B至54D設置在LCD 51的 下側。此等3個鍵54B至54D,係配置在左右兩手所能夠 操作之位置上。此外,4個鍵54E至54H係設置在LCD51 的右側且在右類比搖桿53B的下側。亦即,4個鍵54E至 54H係配置在使用者的右手所能夠操作之位置上。再者,4 個鍵54E至54H係以(相對於4個鍵54E至54H的中心位置) 成為上下左右的位置關係之方式來配置。因此,終端裝置 φ 8可使4個鍵54E至54H具有用以將上下左右的方向指示 於使用者之鍵的功能。Used as a line game, but) can also be used to estimate the field. Therefore, according to the present embodiment, when Lang # can't set the navigation = such as: the person is watching the TV broadcast, etc., the user can also use the terminal to install the tLG, which is not limited to the marriage image, and after the power is turned on. According to this, the image can also be transmitted to the terminal device 7 for display. The root can be played without the need to use the TV 2 from the beginning, so the pole 323330 106 201220109 is convenient. Further, in the above, the display device is also installed in the middle of the money from the terminal, and the display of the game image can be performed by changing the above-described step s to the television 2. Specifically, the CPU 10 is output to the television 2 and outputs the game image to the television 2. Step S9 The game image of the device 7: the image is transmitted to the input of the terminal 3 in step S10. According to this, it is possible to change the display device with the terminal device 7 to the electric power* 2 by displaying the round-up of the game image displayed on the game device from the game device, so that the display of the game image can be turned off. Terminal installed Jin 7 ^ 2. After the game image is displayed on the TV 2, the game system 1 is displayed. In the case of the infrared ray output means (red = external communication module 82 which is emitted from the indication device 2), the display unit 55/ is configured to output a aligning signal to the terminal device. According to this, the game device 3 can operate the television 2 without the need to operate the television i from the infrared output means by the operation of the infrared remote control. At this time, since the television 2 is used, the terminal device 7 can be used as the remote controller of the ", so that when the input of the television 2 is changed, the operation example of communicating with other devices via the network is as follows. 
As described above, since the game device 3 has a function of connecting to a network, the game system 1 can also be used in cases where it communicates with an external device via the network. Fig. 31 shows the connection relationships among the devices included in the game system 1 when it is connected to an external device via a network. As shown in Fig. 31, the game device 3 can communicate with an external device 191 via a network 190. When the external device 191 and the game device 3 can communicate in this way, the game system 1 can communicate with the external device 191 using the terminal device 7 as an interface. For example, the game system 1 can be used as a videophone by exchanging images and sound between the external device 191 and the terminal device 7. Specifically, the game device 3 receives images and sound from the external device 191 (the image and voice of the other party) via the network 190 and transmits the received images and sound to the terminal device 7, and the terminal device 7 then displays the image from the external device 191 on the LCD 51 and outputs the sound from the external device 191 from the speaker 77. In addition, the game device 3 receives from the terminal device 7 the camera image captured by the camera 56 and the microphone sound detected by the microphone 79, and transmits the camera image and microphone sound to the external device 191 via the network 190. By repeating this exchange of images and sound with the external device 191, the game device 3 can use the game system 1 as a videophone. In the present embodiment, since the terminal device 7 is portable, the user can use the terminal device 7 at any position and point the camera 56 in any direction. Moreover, since the terminal device 7 includes the touch panel 52, the game device 3 can also transmit input information made on the touch panel 52 (the touch position data 100) to the external device 191. For example, by outputting images and sound from the external device 191 on the terminal device 7 and transmitting characters and the like written by the user on the touch panel 52 to the external device 191, the game system 1 can also be used as a so-called e-learning system. (Operation example linked with television broadcasting) The game system 1 can also operate in conjunction with television broadcasting when a television broadcast is being watched on the television 2. That is, when a television program is being watched on the television 2, the game system 1 can output information related to that television program, and the like, to the terminal device 7. An example of such operation is described below. In this operation example, the game device 3 can communicate with a server via the network (in other words, the external device 191 shown in Fig. 31 is a server). The server stores, for each channel of television broadcasting, various information related to the broadcast (television information). The television information may be program-related information such as subtitles or cast information, electronic program guide (EPG) information, or information broadcast as a data broadcast, and may consist of images, sound, text, or a combination of these.
There need not be a single server: a server may be provided for each channel or each program of the television broadcast, and the game device 3 may communicate with each server. While the video and audio of a television broadcast are being output on the television 2, the game device 3 has the user input, using the terminal device 7, the channel of the broadcast being watched. The game device 3 then requests the server, via the network, to transmit the television information corresponding to the input channel. In response, the server transmits the data of the television information for that channel. On receiving the data transmitted from the server, the game device 3 outputs the received data to the terminal device 7; the terminal device 7 displays the image and text data contained in it on the LCD 51 and outputs the sound data from the speaker. In this way, the user can use the terminal device 7 to enjoy information related to the television program currently being watched. As described above, by communicating with an external device (a server) via a network, the game system 1 can also provide the user, via the terminal device 7, with information linked to television broadcasts. In particular, since the terminal device 7 of the present embodiment is portable, the user can use it at any position, which is highly convenient. As described above, in the present embodiment, the user can use the terminal device 7 in a variety of applications and forms in addition to games. [8. Modifications] The embodiment described above is one example of carrying out the present invention; in other embodiments, the invention may be implemented, for example, with the configurations described below. (Modification having a plurality of terminal devices) In the above embodiment, the game system 1 has one terminal device, but the game system 1 may be configured to have a plurality of terminal devices. That is, the game device 3 may communicate wirelessly with each of a plurality of terminal devices, transmitting game image data, game sound data, and control data to each terminal device and receiving operation data, camera image data, and microphone sound data from each terminal device. When the game device 3 communicates wirelessly with each of the plurality of terminal devices, it may perform the wireless communication with the terminal devices by time division or by dividing the frequency band. With a plurality of terminal devices, more kinds of games can be played using the game system. For example, if the game system 1 has two terminal devices, it has three display devices, so game images for each of three players can be generated and displayed on the respective display devices. Also, if the game system 1 has two terminal devices, two players can play simultaneously in a game that uses a controller and a terminal device as one set (for example, the fifth game example described above). Furthermore, when the game processing of step S27 is performed based on the marker coordinate data output from two controllers, the two players can each perform game operations by pointing their controller at a marker (the marker device 6 or the marker section 55): one player performs game operations with the controller pointed at the marker device 6, and the other with the controller pointed at the marker section 55.
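The time-division approach mentioned above for serving several terminal devices could, for example, look like the following sketch, in which the game device cycles through the connected terminals and serves one terminal per time slot. The data structures, the one-terminal-per-slot simplification, and all names are assumptions made for illustration.

#include <cstddef>
#include <vector>

struct TerminalLink {};
struct Image {};
struct OperationData {};

Image RenderImageFor(std::size_t /*terminalIndex*/) { return {}; }
void Transmit(TerminalLink&, const Image&) {}
OperationData Receive(TerminalLink&) { return {}; }

void ServiceTerminalsByTimeDivision(std::vector<TerminalLink>& terminals,
                                    std::vector<OperationData>& latestInputs,
                                    std::size_t& slot) {
    if (terminals.empty()) return;
    slot = (slot + 1) % terminals.size();          // advance to the next time slot
    Transmit(terminals[slot], RenderImageFor(slot));
    latestInputs[slot] = Receive(terminals[slot]); // keep that terminal's newest input
}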
(Modification concerning the function of the terminal device) In the above embodiment, the terminal device 7 functions as a so-called thin-client terminal that does not execute game processing. In other embodiments, however, part of the series of game processing executed by the game device 3 in the above embodiment may be executed by another device such as the terminal device 7. For example, the terminal device 7 may execute part of the processing (for example, the generation of the terminal game image). That is, the terminal device may function as a portable game device that performs game processing in response to operations on its operation section, generates a game image based on that game processing, and displays it on its display section. Also, for example, in a game system having a plurality of information processing devices (game devices) capable of communicating with one another, those information processing devices may share the execution of the game processing. (Modification concerning the configuration of the terminal device) The terminal device of the above embodiment is one example; the shape of each operation button and of the housing 50 of the terminal device 7, the number of components, their placement, and so on are merely examples, and other shapes, numbers, and placements may be used. For example, the terminal device may be configured as described below. A modification of the terminal device is described with reference to Figs. 32 to 35. Fig. 32 shows the external configuration of a terminal device according to a modification of the above embodiment: part (a) of Fig. 32 is a front view of the terminal device, (b) a top view, (c) a right side view, and (d) a bottom view. Fig. 33 shows a user holding the terminal device. In Figs. 32 and 33, components corresponding to those of the terminal device 7 of the above embodiment are given the same reference numerals as in Fig. 8, but they need not be configured identically. As shown in Fig. 32, the terminal device 8 includes a housing 50 that is roughly plate-shaped and elongated in the landscape orientation. The housing 50 is small enough for the user to hold, so the user can hold the terminal device 8 and move it, or change where the terminal device 8 is placed. The terminal device 8 has an LCD 51 on the front surface of the housing 50, provided near the center of that surface. Therefore, by holding the housing 50 on both sides of the LCD 51, the user can hold and move the terminal device while viewing the screen of the LCD 51. Fig. 33 shows an example in which the user holds the terminal device 8 sideways (in a landscape orientation) by gripping the housing 50 to the left and right of the LCD 51, but the terminal device 8 can also be held vertically (in a portrait orientation). As shown in Fig. 32(a), the terminal device 8 has a touch panel 52 on the screen of the LCD 51 as an operation means (operation section). In this modification, the touch panel 52 is a resistive touch panel.
(Modification regarding the configuration of the terminal device)
The terminal device of the embodiment above is merely one example; the shapes of the operation buttons and of the housing 50 of the terminal device 7, the numbers of components and their positions are merely illustrative, and other shapes, numbers and positions may be used. For example, the terminal device may be configured as in the modification described below with reference to Figs. 32 to 35. Fig. 32 shows the external configuration of a terminal device according to a modification of the embodiment: Fig. 32(a) is a front view, (b) a top view, (c) a right side view and (d) a bottom view of the terminal device. Fig. 33 shows a user holding this terminal device. In Figs. 32 and 33, components corresponding to components of the terminal device 7 of the embodiment are denoted by the same reference numerals as in Fig. 8, but they do not have to be configured in exactly the same way.
As shown in Fig. 32, the terminal device 8 includes a generally plate-shaped housing 50 that is elongated sideways (horizontally long). The housing 50 is small enough to be held by the user, so the user can hold and move the terminal device 8 and can change where it is placed. The terminal device 8 has an LCD 51 on the front surface of the housing 50, provided near the center of that surface. The user can therefore hold and move the terminal device while viewing the screen of the LCD 51 by gripping the housing 50 on both sides of the LCD 51. As shown in Fig. 33, the user can hold the terminal device 8 sideways (in a landscape orientation) by gripping the housing 50 on the left and right of the LCD 51, or can hold it lengthwise (in a portrait orientation).
As shown in Fig. 32(a), the terminal device 8 has a touch panel 52 on the screen of the LCD 51 as an operation means (operation section). In this modification the touch panel 52 is a resistive touch panel, but it is not limited to the resistive type; a touch panel of any type, such as a capacitive type, may be used. The touch panel 52 may be of a single-touch type or a multi-touch type. In this modification, a touch panel having the same resolution (detection accuracy) as the LCD 51 is used as the touch panel 52; however, the resolution of the touch panel 52 does not have to match the resolution of the LCD 51. Input on the touch panel 52 is usually performed with a stylus, but input is not limited to the stylus and the user may operate the touch panel 52 with a finger. The housing 50 may be provided with an accommodation hole for storing the stylus used to operate the touch panel 52. Because the terminal device 8 has the touch panel 52 in this way, the user can operate the touch panel 52 while moving the terminal device 8 around; that is, the user can make inputs directly on the screen of the LCD 51 (through the touch panel 52) while moving that screen.
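Because the detection resolution of the touch panel 52 and the resolution of the LCD 51 do not have to match, a raw touch reading generally has to be scaled into screen coordinates before it is used. A minimal sketch of that scaling follows; the resolutions used are illustrative assumptions, not values from the embodiment.

```python
def touch_to_pixels(raw_x, raw_y, panel_res=(4096, 4096), lcd_res=(854, 480)):
    """Scale a raw touch-panel reading to LCD pixel coordinates.

    panel_res and lcd_res are illustrative values only; the point is that
    the two resolutions are independent and a simple proportional mapping
    relates them.
    """
    px = raw_x * (lcd_res[0] - 1) / (panel_res[0] - 1)
    py = raw_y * (lcd_res[1] - 1) / (panel_res[1] - 1)
    # Clamp so slightly out-of-range readings still land on the screen.
    px = min(max(px, 0), lcd_res[0] - 1)
    py = min(max(py, 0), lcd_res[1] - 1)
    return round(px), round(py)

if __name__ == "__main__":
    print(touch_to_pixels(2048, 1024))   # roughly the horizontal centre
```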
As shown in Fig. 32, the terminal device 8 includes two analog sticks 53A and 53B and operation buttons 54A to 54L as operation means (an operation section). The analog sticks 53A and 53B are devices for indicating directions. Each analog stick is configured so that its stick portion, operated with the user's finger, can be slid or tilted in any direction (at any angle in the up, down, left, right and oblique directions) with respect to the front surface of the housing 50. The left analog stick 53A and the right analog stick 53B are provided on the left and right sides of the screen of the LCD 51, respectively, so the user can make a direction input with an analog stick using either hand. In addition, as shown in Fig. 33, the analog sticks 53A and 53B are located where the user can operate them while holding the left and right portions of the terminal device 8, so the user can easily operate them even while holding and moving the terminal device 8.
The operation buttons 54A to 54L are operation means for making predetermined inputs. As described below, the operation buttons 54A to 54L are located where the user can operate them while holding the left and right portions of the terminal device 8, so the user can easily operate these operation means even while holding and moving the terminal device 8.
As shown in Fig. 32(a), among the operation buttons 54A to 54L, the cross button (direction input button) 54A and the buttons 54B to 54H are provided on the front surface of the housing 50. That is, these buttons 54A to 54H are located where they can be operated with the user's thumbs (see Fig. 33). The cross button 54A is provided on the left side of the LCD 51 and below the left analog stick 53A, that is, where it can be operated with the user's left hand. The cross button 54A has a cross shape and can be used to indicate the up, down, left and right directions. The buttons 54B to 54D are provided below the LCD 51; these three buttons are located where they can be operated with either hand. The four buttons 54E to 54H are provided on the right side of the LCD 51 and below the right analog stick 53B, that is, where they can be operated with the user's right hand. Furthermore, the four buttons 54E to 54H are placed so as to be in an up, down, left and right relationship with respect to their center position. Accordingly, the terminal device 8 can also use the four buttons 54E to 54H as buttons with which the user indicates the up, down, left and right directions.
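Since the four buttons 54E to 54H sit in an up/down/left/right diamond, a game program can also read them as a second direction input. The sketch below shows one way to do so; the assignment of individual buttons to directions is an assumption made only for this example.

```python
# Assumed assignment of the four buttons to directions (illustrative only):
# 54E = up, 54F = right, 54G = down, 54H = left.
DIRECTIONS = {"54E": (0, 1), "54F": (1, 0), "54G": (0, -1), "54H": (-1, 0)}

def diamond_to_vector(pressed):
    """Combine the currently pressed buttons into one direction vector, so
    the right-hand button diamond can double as a direction input."""
    dx = sum(DIRECTIONS[k][0] for k in pressed if k in DIRECTIONS)
    dy = sum(DIRECTIONS[k][1] for k in pressed if k in DIRECTIONS)
    # Opposite buttons cancel out; adjacent buttons give a diagonal.
    return (max(-1, min(1, dx)), max(-1, min(1, dy)))

if __name__ == "__main__":
    print(diamond_to_vector({"54E", "54F"}))   # up + right -> (1, 1)
```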

Further, as shown in Figs. 32(a), 32(b) and 32(c), a first L button 54I and a first R button 54J are provided on the obliquely upper portions (the upper left portion and the upper right portion) of the housing 50. Specifically, the first L button 54I is provided at the left end of the upper side surface of the plate-shaped housing 50 so as to be exposed on the upper and left side surfaces, and the first R button 54J is provided at the right end of the upper side surface of the housing 50 so as to be exposed on the upper and right side surfaces. Thus, the first L button 54I is located where it can be operated with the user's left index finger, and the first R button 54J is located where it can be operated with the user's right index finger (see Fig. 33).
As shown in Figs. 32(b) and 32(c), a second L button 54K and a second R button 54L are placed on leg portions 59A and 59B which project from the back surface of the plate-shaped housing 50 (that is, the surface opposite to the surface on which the LCD 51 is provided). Like the eaves portion 59 of the embodiment described above, each of the leg portions 59A and 59B is provided in a region that includes positions on the opposite side from the operation sections (the analog sticks 53A and 53B) provided on the left and right of the display section. The second L button 54K is provided slightly toward the top on the left side of the back surface of the housing 50 (the left side as viewed from the front), and the second R button 54L is provided slightly toward the top on the right side of the back surface of the housing 50 (the right side as viewed from the front). In other words, the second L button 54K is located roughly on the opposite side from the left analog stick 53A provided on the front surface, and the second R button 54L is located roughly on the opposite side from the right analog stick 53B provided on the front surface. Thus, the second L button 54K is located where it can be operated with the user's left middle finger, and the second R button 54L is located where it can be operated with the user's right middle finger (see Fig. 33). Further, as shown in Fig. 32(c), the second L button 54K and the second R button 54L are provided on the obliquely upward-facing surfaces of the leg portions 59A and 59B, and have button surfaces that face obliquely upward.
Since the middle fingers are expected to move in the up-down direction when the user holds the terminal device 8, directing the button surfaces upward makes it easy for the user to press the second L button 54K and the second R button 54L. Providing the leg portions on the back surface of the housing 50 also makes the housing easier to hold, and providing buttons on the leg portions makes the buttons easy to operate while the housing 50 is held.
Regarding the terminal device 8 shown in Fig. 32, because the second L button 54K and the second R button 54L are provided on the back surface, the screen (the front surface of the housing 50) may not lie completely horizontal when the terminal device 8 is put down with the screen of the LCD 51 facing up. Therefore, in other embodiments, three or more leg portions may be formed on the back surface of the housing 50. In that case the terminal device 8 can be placed on a floor surface with the leg portions in contact with the floor while the screen of the LCD 51 faces up, so that the terminal device 8 rests with the screen horizontal. Detachable leg portions may also be added so that the terminal device 8 can be placed horizontally.
Functions corresponding to the game program are assigned to the operation buttons 54A to 54L as appropriate. For example, the cross button 54A and the buttons 54E to 54H may be used for direction-indicating operations, selection operations and the like, and the buttons 54B to 54E may be used for confirmation operations, cancellation operations and the like.
Although not shown, the terminal device 8 has a power button for turning the power of the terminal device 8 on and off. The terminal device 8 may also have a button for turning the screen display of the LCD 51 on and off, a button for performing connection setting (pairing) with the game device 3, and a button for adjusting the volume of the speakers 77.
As shown in Fig. 32(a), the terminal device 8 includes, on the front surface of the housing 50, a marker section (the marker section 55 shown in Fig. 10) made up of a marker 55A and a marker 55B. The marker section 55 is provided above the LCD 51. Like the markers 6R and 6L of the marker device 6, each of the markers 55A and 55B is made up of one or more infrared LEDs. As with the marker device 6 described above, the marker section 55 is used, for example, for the game device 3 to calculate the movement and the like of the controller 5. The game device 3 can also control the lighting of each infrared LED of the marker section 55.
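One conceivable use of such per-LED control, together with the two-player modification described earlier (one controller facing the marker device 6 and the other facing the marker section 55), is to light only the marker group that a given controller is meant to track. The following sketch is speculative and purely illustrative; the names and the notion of a per-group switch are assumptions of this example, not an interface of the actual system.

```python
def marker_states(assignments):
    """Decide which marker group should be lit.

    assignments maps a controller name to the marker group it is aimed at,
    either "marker_device_6" or "marker_section_55" (names invented here).
    A group is lit if at least one controller is supposed to track it.
    """
    lit = {"marker_device_6": False, "marker_section_55": False}
    for _controller, group in assignments.items():
        if group in lit:
            lit[group] = True
    return lit

if __name__ == "__main__":
    print(marker_states({"controller-1": "marker_device_6",
                         "controller-2": "marker_section_55"}))
```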
The terminal device 8 includes a camera 56 as an imaging means. The camera 56 includes an image sensor having a predetermined resolution (for example, a CCD image sensor or a CMOS image sensor) and a lens. As shown in Fig. 32, in this modification the camera 56 is provided on the front surface of the housing 50. The camera 56 can therefore capture an image of the face of the user holding the terminal device 8; for example, it can capture an image of the user playing a game while viewing the LCD 51.
The terminal device 8 includes a microphone 79 as a sound input means. A microphone hole 50c is provided in the front surface of the housing 50, and the microphone 79 is provided inside the housing 50 behind the microphone hole 50c. The microphone detects sounds around the terminal device 8, such as the user's voice.
The terminal device 8 includes speakers 77 as a sound output means. As shown in Fig. 32(d), speaker holes 57 are provided in the lower side surface of the housing 50, and the output sound of the speakers 77 is emitted through the speaker holes 57. In this modification the terminal device 8 has two speakers, and speaker holes 57 are provided at the positions of the left speaker and the right speaker.
The terminal device 8 also includes an extension connector 58 for connecting another device to the terminal device 8. In this modification, as shown in Fig. 32(d), the extension connector 58 is provided in the lower side surface of the housing 50. The device connected to the extension connector 58 may be any device, for example a controller used for a particular game (such as a gun-shaped controller) or an input device such as a keyboard. If there is no need to connect another device, the extension connector 58 may be omitted.
Regarding the terminal device 8 shown in Fig. 32, the shapes of the operation buttons and the housing 50, the numbers of components and their positions are merely examples, and other shapes, numbers and positions may be used.
As described above, in this modification the two leg portions 59A and 59B provided at left and right positions on the back surface of the housing 50 serve as the projecting portion. In this case, as in the embodiment described above, the user can hold the terminal device 8 comfortably by holding it with the ring fingers or middle fingers hooked under the lower surfaces of the projecting portions (see Fig. 33). Also, as in the embodiment, since the second L button 54K and the second R button 54L are provided on the upper surfaces of the projecting portions, the user can easily operate these buttons while holding the device in this way.
As in the embodiment and the modification above, the projecting portion is preferably provided on the back side of the housing so as to project above the center of the housing and at least at left and right positions. Then, when the user grips the left and right sides of the housing, the projecting portion catches on the fingers, and the terminal device can be held comfortably. Moreover, because the projecting portion is located toward the top, the user can support the housing with the palms (see Fig. 10 and elsewhere) and grip the operating device firmly.
The projecting portion does not have to be provided above the center of the housing. For example, where operation sections are provided on the left and right of the display section, the projecting portion may be provided at any position that can be hooked by a finger other than the thumbs while the user holds the housing so as to be able to operate each operation section with the thumbs of both hands. In this way, too, the user can hold the terminal device comfortably by hooking fingers on the projecting portion.
Figs. 34 and 35 show the external configuration of a terminal device according to another modification of the embodiment; Fig. 34 is a right side view and Fig. 35 is a top view. The terminal device 9 shown in Figs. 34 and 35 is the same as the terminal device 7 of the embodiment except that it includes convex portions 230a and 230b. The configuration of the terminal device 9 of this modification is described below, focusing on the differences from the embodiment.
The convex portions 230a and 230b each have a convex cross section and are provided on the left and right sides of the back surface of the housing 50: the convex portion 230a on the left side (the left side as viewed from the front) and the convex portion 230b on the right side (the right side as viewed from the front). As shown in Fig. 35, the convex portions 230a and 230b are provided along the left and right edges (the two end portions) of the housing 50. Each convex portion is located below the projecting portion (the eaves portion 59), and each is spaced apart from the projecting portion; that is, the part of the housing 50 between the convex portions 230a and 230b and the projecting portion is thinner than these portions. Each of the convex portions 230a and 230b has a projecting part that extends in the up-down direction, with a convex cross section perpendicular to the up-down direction.
In this modification, the user can hold the terminal device 9 more firmly by gripping it with the little fingers (and the ring fingers) wrapped around the convex portions 230a and 230b; in other words, the convex portions 230a and 230b function as grip portions. The grip portions may have any shape, but forming them so as to extend in the up-down direction makes the terminal device 9 easy to hold and is therefore preferable. The height of the convex portions 230a and 230b may also be chosen freely, and they may be made lower than the projecting portion. In that case, when the terminal device 9 is put down with the screen of the LCD 51 facing up, the lower side of the screen lies lower than the upper side, so the terminal device 9 rests in a position that is easy to view. Also, since the convex portions 230a and 230b are spaced apart from the projecting portion, the user can hold the terminal device 9 with the fingers against the lower surface of the projecting portion, and the convex portions do not get in the way of those fingers. As described above, according to this modification, providing the convex portions below the projecting portion allows the user to hold the terminal device even more firmly. In other embodiments, the projecting portion may be omitted from the back surface of the housing 50; in that case the user can still grip the housing 50 firmly by means of the convex portions (grip portions). The surfaces of the convex portions (grip portions) may be made of a non-slip material to further improve the grip, and even without the convex portions, a non-slip material may be used on the back surface of the housing.
(Modification regarding devices to which this configuration is applied)
The embodiment above has been described using, as an example, a terminal device used together with a stationary game device, but the configuration of the operating device described in this specification can be applied to any device that the user holds while using it. For example, the operating device may be realized as an information terminal such as a portable game machine, a mobile phone, a smartphone or an e-book reader.
While the present invention has been described in detail above, the foregoing description is in all respects merely illustrative of the invention and is not intended to limit its scope.
It will be understood that various improvements and modifications can of course be made without departing from the scope of the invention.
(Industrial Applicability)
As described above, the present invention has the object of providing an operating device or the like that a user can hold easily, and can be applied, for example, to an operating device (terminal device) of a game system.
[Brief Description of the Drawings]
Fig. 1 is an external view of the game system 1.
Fig. 2 is a block diagram showing the internal configuration of the game device 3.
Fig. 3 is a perspective view showing the external configuration of the controller 5.
Fig. 4 is a perspective view showing the external configuration of the controller 5.
Fig. 5 is a diagram showing the internal structure of the controller 5.
Fig. 6 is a diagram showing the internal structure of the controller 5.
Fig. 7 is a block diagram showing the configuration of the controller 5.
Fig. 8 is a diagram showing the external configuration of the terminal device 7.
Fig. 9 is a diagram showing the external configuration of the terminal device 7.
Fig. 10 is a diagram showing the terminal device 7 held sideways by the user.
Fig. 11 is a diagram showing the terminal device 7 held sideways by the user.
Fig. 12 is a diagram showing the terminal device 7 held lengthwise by the user.
Fig. 13 is a diagram showing the terminal device 7 held lengthwise by the user.
Fig. 14 is a block diagram showing the internal configuration of the terminal device 7.
Fig. 15 is a diagram showing an example in which an additional device (the input device 200) is attached to the terminal device 7.
Fig. 16 is a diagram showing an example in which an additional device (the input device 200) is attached to the terminal device 7.
Fig. 17 is a diagram showing another example of the input device.
Fig. 18 is a diagram showing the input device 220 of Fig. 17 attached to the terminal device 7.
Fig. 19 is a diagram showing the input device 220 of Fig. 17 attached to the terminal device 7.
Fig. 20 is a diagram showing another example in which an additional device (the stand 210) is attached to the terminal device 7.
Fig. 21 is a diagram showing various data used in the game processing.
Fig. 22 is a main flowchart showing the flow of the game processing executed by the game device 3.
Fig. 23 is a flowchart showing the detailed flow of the game control processing.
Fig. 24 is a diagram showing the screen of the television 2 and the terminal device 7 in the first game example.

Fig. 25 is a diagram showing the screen of the television 2 and the terminal device 7 in the second game example.
Fig. 26 is a diagram showing an example of the television game image displayed on the television 2 in the third game example.
Fig. 27 is a diagram showing an example of the terminal game image displayed on the terminal device 7 in the third game example.
Fig. 28 is a diagram showing an example of the television game image displayed on the television 2 in the fourth game example.
Fig. 29 is a diagram showing an example of the terminal game image displayed on the terminal device 7 in the fourth game example.
Fig. 30 is a diagram showing how the game system 1 is used in the fifth game example.
Fig. 31 is a diagram showing the connection relationship among the devices included in the game system 1 when it is connected to external devices via a network.
Fig. 32 is a diagram showing the external configuration of a terminal device according to a modification of the embodiment.
Fig. 33 is a diagram showing the user holding the terminal device shown in Fig. 32.
Fig. 34 is a diagram showing the external configuration of a terminal device according to another modification of the embodiment.
Fig. 35 is a diagram showing the external configuration of a terminal device according to another modification of the embodiment.
[Description of Main Reference Numerals]

1 Game system
2 Television
2a Speaker
3 Game device
4 Optical disc
5 Controller
6 Marker device
6L Marker
6R Marker
7 Terminal device
8 Terminal device
9 Terminal device
10 CPU
11 System LSI
11a Input/output processor
11b GPU
11c DSP
11d VRAM
11e Internal main memory
12 External main memory
13 ROM/RTC
14 Disc drive
15 AV-IC
16 AV connector
17 Flash memory
18 Network communication module
19 Controller communication module
20 Extension connector
21 Memory card connector
22 Antenna
23 Antenna
24 Power button
25 Reset button
26 Eject button

27 Codec LSI
28 Terminal communication module
29 Antenna
30 Substrate
31 Housing
31a Sound holes
32 Operation section
32a Cross button
32b Button 1
32c Button 2
32d A button
32e Minus button
32f Home button
32g Plus button
32h Power button
32i B button
33 Connector
33a Locking hole
34a LED
34b LED
34c LED
34d LED
35 Imaging information calculation section
35a Light incident surface
36 Communication section
37 Acceleration sensor
38 Infrared filter
39 Lens
40 Image sensor
41 Image processing circuit
42 Microcomputer
43 Memory
44 Wireless module
45 Antenna
46 Vibrator
47 Speaker
48 Gyro sensor
50 Housing
50a Locking hole
50b Locking hole
50c Microphone hole
51 LCD
52 Touch panel
53 Analog stick
53A Analog stick
53B Analog stick
54 Operation buttons
54A to 54M Operation buttons (keys)

55 Marker section
55A Marker
55B Marker
56 Camera
57 Speaker holes
58 Extension connector
59 Eaves portion
59a Locking hole
59b Locking hole
60 Stylus
60a Accommodation hole
61 Cover portion
62 Magnetic sensor
63 Acceleration sensor
63 Window
64 Gyro sensor
65a Hole
65b Hole
66 Charging terminal
67 Battery cover
69 Microphone
71 Touch panel controller
72 Magnetic sensor
73 Acceleration sensor
74 Gyro sensor
75 User interface controller
76 Codec LSI
77 Speaker
78 Sound IC
79 Microphone
80 Wireless module
81 Antenna
82 Infrared communication module
83 Flash memory
84 Power supply IC
85 Battery
86 Charger
87 CPU
88 Internal memory
90 Game program
91 Received data
92 Controller operation data
93 First operation button data
94 First acceleration data
95 First angular velocity data
96 Marker coordinate data
97 Terminal operation data
98 Second operation button data

99 Stick data
100 Touch position data
101 Second acceleration data
102 Second angular velocity data
103 Azimuth data
104 Camera image data
105 Microphone sound data
106 Processing data
107 Control data
108 Controller attitude data
109 Terminal attitude data
110 Image recognition data
111 Sound recognition data
121 Dart (shuriken)
122 Control surface
123 Target
124 Stylus
131 Cannon
132 Cannonball
133 Target
141 Batter (batter object)
142 Pitcher (pitcher object)
143 Cursor
151 Airplane (airplane object)
152 Cannon
153 Target
154 Sight (crosshair)
160 Player
161 Character
162 Golf club
162a Club head
163 Ball
164 Image (club head image)
190 Network
191 External device
200 Input device
200a First grip portion
200b Second grip portion
201 First button
202 Second button
203 Third button
204 Stick
205 Support portion
205a Claw portion
205b Claw portion
205c Claw portion
206 Connection member
207 Fourth button
208 Window portion
209 Connection portion
210 Stand
211 Support member
211a Wall portion
211b Groove portion
212 Charging terminal
213a Guide member
213b Guide member
220 Input device
230 Convex portion
230a Convex portion
230b Convex portion

Claims (1)

VII. Claims:
1. An operating device to be operated by a user, the operating device comprising: a housing having a generally plate-like shape; a display section provided on a front side of the housing; and a projecting portion provided on a back side of the housing so as to project above the center of the housing and at least at left and right positions.
2. The operating device according to claim 1, further comprising a first operation section and a second operation section which are provided above the center of the housing, on the left and right of the display section, respectively.
3. The operating device according to claim 2, wherein the projecting portion is provided in a region that includes positions on the opposite side from the first operation section and the second operation section.
4. The operating device according to claim 2, further comprising: a fifth operation section arranged on the front-side surface of the housing below the first operation section; and a sixth operation section arranged on the front-side surface of the housing below the second operation section.
5. An operating device comprising: a housing having a generally plate-like shape; a display section provided on a front side of the housing; a first operation section and a second operation section provided on the left and right of the display section, respectively; and a projecting portion provided, on the back side of the housing, at a position where it can be hooked by a finger other than the thumbs when the user holds the housing so as to be able to operate the first operation section and the second operation section with the thumbs of both hands.
6. The operating device according to claim 5, wherein the projecting portion is provided in a region that includes positions on the opposite side from the first operation section and the second operation section.
7. The operating device according to claim 5, further comprising: a fifth operation section arranged on the front-side surface of the housing below the first operation section; and a sixth operation section arranged on the front-side surface of the housing below the second operation section.
8. An operating device comprising: a housing having a generally plate-like shape; a display section provided on a front side of the housing; a projecting portion provided on a back side of the housing so as to project at least at left and right positions; and an operation section provided on an upper surface of the projecting portion.
9. An operating device to be operated by a user, the operating device comprising: a housing having a generally plate-like shape; a display section provided on a front side of the housing; and grip portions which are provided on the left and right sides of the back surface of the housing so as to extend in the up-down direction and which have a convex cross section.
10. The operating device according to claim 9, further comprising a projecting portion provided on the back side of the housing so as to project above the grip portions and at least at left and right positions.
11. An information processing device of a tablet type, comprising: a housing having a generally plate-like shape; a display section provided on a front side of the housing; and a projecting portion provided on a back side of the housing so as to project above the center of the housing and at least at left and right positions.
12. The operating device according to claim 1 or 5, further comprising a third operation section and a fourth operation section provided on an upper surface of the projecting portion, on the left and right sides of the housing, respectively.
13. The operating device according to claim 1 or 5, wherein the projecting portion has an eaves-like shape extending in the left-right direction.
14. The operating device according to claim 1 or 5, wherein a first locking hole, with which an additional device separate from the operating device can be engaged, is provided in a lower surface of the projecting portion.
15. The operating device according to claim 14, wherein a second locking hole, with which the additional device can be engaged, is provided in a lower side surface of the housing.
16. The operating device according to claim 1 or 5, further comprising, below the projecting portion and on the left and right sides of the back surface of the housing, convex portions having a convex cross section.
17. The operating device according to claim 16, wherein the projecting portion and the convex portions are spaced apart from each other.
18. The operating device according to claim 1 or 5, further comprising grip portions provided on the left and right sides of the back surface of the housing.
19. The operating device according to claim 1, 5, 8 or 9, further comprising a seventh operation section and an eighth operation section provided on the upper side surface of the housing, on the left and right sides, respectively.
20. The operating device according to claim 1, 5, 8 or 9, further comprising a touch panel provided on the screen of the display section.
21. The operating device according to claim 1, 5, 8 or 9, further comprising an inertial sensor inside the housing.
22. The operating device according to claim 1, 5, 8 or 9, further comprising: a communication section which wirelessly transmits, to a game device, operation data representing operations performed on the operating device itself, and which receives image data transmitted from the game device; and a display control section which displays the received image data on the display section.
23. The operating device according to claim 1, 5, 8 or 9, further comprising: a game processing section which executes game processing based on operations performed on the operating device itself; and a display control section which generates a game image based on the game processing and displays it on the display section.
24. The operating device according to claim 1, 5, 8 or 9, wherein the display section has a screen of 5 inches or larger.
TW100126152A 2010-11-01 2011-07-25 Controller device and information processing device TWI442963B (en)

Applications Claiming Priority (9)

Application Number Priority Date Filing Date Title
JP2010245299A JP4798809B1 (en) 2010-11-01 2010-11-01 Display device, game system, and game processing method
JP2010245298 2010-11-01
JP2011092506 2011-04-18
JP2011092612A JP6103677B2 (en) 2010-11-01 2011-04-19 GAME SYSTEM, OPERATION DEVICE, AND GAME PROCESSING METHOD
JP2011102834A JP5837325B2 (en) 2010-11-01 2011-05-02 Operating device and operating system
JP2011103705 2011-05-06
JP2011103706A JP6005908B2 (en) 2010-11-01 2011-05-06 Equipment support system and support device
JP2011103704A JP6005907B2 (en) 2010-11-01 2011-05-06 Operating device and operating system
JP2011118488A JP5936315B2 (en) 2010-11-01 2011-05-26 Information processing system and information processing apparatus

Publications (2)

Publication Number Publication Date
TW201220109A true TW201220109A (en) 2012-05-16
TWI442963B TWI442963B (en) 2014-07-01

Family

ID=46518614

Family Applications (2)

Application Number Title Priority Date Filing Date
TW100126151A TWI440496B (en) 2010-11-01 2011-07-25 Controller device and controller system
TW100126152A TWI442963B (en) 2010-11-01 2011-07-25 Controller device and information processing device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
TW100126151A TWI440496B (en) 2010-11-01 2011-07-25 Controller device and controller system

Country Status (3)

Country Link
CN (7) CN102600614B (en)
AU (2) AU2011213765B2 (en)
TW (2) TWI440496B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI502355B (en) * 2012-06-01 2015-10-01 Nvidia Corp Methodology for using smartphone and mobile computer in a mobile compute environment
TWI679575B (en) * 2013-10-11 2019-12-11 日商半導體能源研究所股份有限公司 A driving method of a portable data-processing device

Families Citing this family (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2392391B1 (en) 2010-02-03 2017-06-28 Nintendo Co. Ltd. Display device, game system, and game processing method
US8913009B2 (en) 2010-02-03 2014-12-16 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US8339364B2 (en) 2010-02-03 2012-12-25 Nintendo Co., Ltd. Spatially-correlated multi-display human-machine interface
US8814686B2 (en) 2010-02-03 2014-08-26 Nintendo Co., Ltd. Display device, game system, and game method
JP6243586B2 (en) 2010-08-06 2017-12-06 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
US10150033B2 (en) 2010-08-20 2018-12-11 Nintendo Co., Ltd. Position calculation system, position calculation device, storage medium storing position calculation program, and position calculation method
JP5840386B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
JP5840385B2 (en) 2010-08-30 2016-01-06 任天堂株式会社 GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system
KR101492310B1 (en) 2010-11-01 2015-02-11 닌텐도가부시키가이샤 Operating apparatus and information processing apparatus
JP5689014B2 (en) 2011-04-07 2015-03-25 任天堂株式会社 Input system, information processing apparatus, information processing program, and three-dimensional position calculation method
TWI487554B (en) * 2013-02-06 2015-06-11 Univ Southern Taiwan Sci & Tec Game machine control method
JP2014212479A (en) 2013-04-19 2014-11-13 ソニー株式会社 Control device, control method, and computer program
CN105302232A (en) * 2014-06-04 2016-02-03 振桦电子股份有限公司 Tablet computer with detachable handle
JP6341568B2 (en) * 2014-08-05 2018-06-13 アルプス電気株式会社 Coordinate input device
US10610776B2 (en) 2015-06-12 2020-04-07 Nintendo Co., Ltd. Supporting device, charging device and controller system
JP6635597B2 (en) 2015-06-12 2020-01-29 任天堂株式会社 Information processing system and operation device
US10712835B2 (en) 2016-10-06 2020-07-14 Htc Corporation System and method for detecting hand gesture
DE102018100122A1 (en) 2017-01-04 2018-07-05 Htc Corporation Finger gesture recognition control unit and method for detecting a finger gesture
US10579151B2 (en) * 2017-01-04 2020-03-03 Htc Corporation Controller for finger gesture recognition and method for recognizing finger gesture
CN108031111A (en) * 2017-12-29 2018-05-15 安徽科创智慧知识产权服务有限公司 Have wireless and wired connection handle system concurrently
CN108579073A (en) * 2018-06-04 2018-09-28 东莞市卫童智能科技有限公司 A kind of mobile-phone game handle
KR20200097012A (en) * 2019-02-07 2020-08-18 주식회사 엔씨소프트 System and method for terminal device control
CN110072041B (en) * 2019-04-26 2021-03-02 维沃移动通信(杭州)有限公司 Mobile terminal

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1332778B1 (en) * 1996-03-05 2004-12-08 Sega Enterprises, Ltd. Controller and expansion unit for controller
US6966837B1 (en) * 2001-05-10 2005-11-22 Best Robert M Linked portable and video game systems
AU2002354677A1 (en) * 2001-07-12 2003-01-29 Gary L. Friedman Portable, hand-held electronic input device and combination with a personal digital device
US6773349B2 (en) * 2002-07-31 2004-08-10 Intec, Inc. Video game controller with integrated video display
US20060252537A1 (en) * 2005-04-21 2006-11-09 Wen-An Wu Portable wireless control apparatus
JP4778267B2 (en) * 2005-05-16 2011-09-21 任天堂株式会社 Game machine operating device and portable game machine
TWM278452U (en) * 2005-06-03 2005-10-21 Weistech Technology Co Ltd Game controlling handle having a display device
GB2470327B (en) * 2008-03-07 2012-03-21 Milwaukee Electric Tool Corp Visual inspection device
US8384680B2 (en) * 2008-12-23 2013-02-26 Research In Motion Limited Portable electronic device and method of control
CN201572520U (en) * 2009-12-23 2010-09-08 周建正 Three-in-one support for game consoles
TWI440496B (en) * 2010-11-01 2014-06-11 Nintendo Co Ltd Controller device and controller system

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI502355B (en) * 2012-06-01 2015-10-01 Nvidia Corp Methodology for using smartphone and mobile computer in a mobile compute environment
TWI679575B (en) * 2013-10-11 2019-12-11 日商半導體能源研究所股份有限公司 A driving method of a portable data-processing device
TWI742471B (en) * 2013-10-11 2021-10-11 日商半導體能源研究所股份有限公司 A driving method of a portable data-processing device
TWI811799B (en) * 2013-10-11 2023-08-11 日商半導體能源研究所股份有限公司 A driving method of a portable data-processing device

Also Published As

Publication number Publication date
TWI440496B (en) 2014-06-11
TW201219093A (en) 2012-05-16
CN102600611B (en) 2015-03-11
CN102600611A (en) 2012-07-25
CN102600614A (en) 2012-07-25
AU2011213764B2 (en) 2013-10-24
CN102600612A (en) 2012-07-25
CN202398092U (en) 2012-08-29
CN202355829U (en) 2012-08-01
CN202355827U (en) 2012-08-01
CN202398095U (en) 2012-08-29
TWI442963B (en) 2014-07-01
AU2011213764A1 (en) 2012-05-17
CN102600614B (en) 2015-11-25
CN102600612B (en) 2015-12-02
AU2011213765A1 (en) 2012-05-17
AU2011213765B2 (en) 2013-07-11

Similar Documents

Publication Publication Date Title
TW201220109A (en) Controller device and information processing device
KR101492310B1 (en) Operating apparatus and information processing apparatus
JP6184658B2 (en) GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
KR101154636B1 (en) Display device, game system, and game method
TWI434717B (en) Display device, game system, and game process method
JP5840386B2 (en) GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
JP5840385B2 (en) GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND GAME PROCESSING METHOD
JP6188766B2 (en) Operating device and operating system
JP5800526B2 (en) GAME DEVICE, GAME PROGRAM, GAME SYSTEM, AND GAME PROCESSING METHOD
JP4798809B1 (en) Display device, game system, and game processing method
JP2012249867A (en) Information processing program, information processing apparatus, information processing system, and information processing method
JP5829040B2 (en) GAME SYSTEM, GAME DEVICE, GAME PROGRAM, AND IMAGE GENERATION METHOD
JP6103677B2 (en) GAME SYSTEM, OPERATION DEVICE, AND GAME PROCESSING METHOD
JP5937792B2 (en) GAME PROGRAM, GAME DEVICE, GAME SYSTEM, AND GAME PROCESSING METHOD
JP5936315B2 (en) Information processing system and information processing apparatus
JP2012096005A (en) Display device, game system and game processing method
KR20130020715A (en) Operating apparatus and operating system

Legal Events

Date Code Title Description
MM4A Annulment or lapse of patent due to non-payment of fees