
HK1181135B - Content system with secondary touch controller - Google Patents

Content system with secondary touch controller

Info

Publication number
HK1181135B
HK1181135B
Authority
HK
Hong Kong
Prior art keywords
controller
user
interface
content
input
Prior art date
Application number
HK13108181.7A
Other languages
Chinese (zh)
Other versions
HK1181135A1 (en)
Inventor
J. Clavin
K. A. Lobb
C. M. Novak
K. Geisner
C. Klein
Original Assignee
Microsoft Technology Licensing, LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/331,726, published as US 2013/0154958 A1
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1181135A1
Publication of HK1181135B

Description

Content system with secondary touch controller
Technical Field
The present invention relates to a content system with a secondary touch controller.
Background
Users of content services have many options for controlling content presentation devices. Television remote controls have become more complex and have the ability to control multiple devices. A game controller for use with a gaming platform not only allows a user to participate in playing a game, but also allows the user to consume content provided on the gaming device.
New control options have been provided by so-called "smart" or tablet computing devices having a touch screen. For example, content providers allow users to install an application on a smart phone that will stream content from a remote source (such as Netflix) or even change channels on a television (for example, by using the XfinityTV application from Comcast). Although these different control options are useful in some situations, haptic devices are preferred in others.
Disclosure of Invention
Techniques are provided that allow for a secondary media or control experience on a touch-enabled controller while a user consumes interactive or shared content using a primary processing system and a primary haptic controller. The secondary experience is provided in a controller of a content presentation and interaction system that includes a primary content presentation device. The controller includes a haptic control input and a touch screen control input. The haptic control input is responsive to input by a first user and communicatively coupled to the content presentation device; it includes a plurality of tactile input mechanisms and provides a first set of control inputs to manipulate the content. The touch screen control input is likewise responsive to input by the first user and communicatively coupled to the content presentation device; it acts as a second controller proximate to the first controller and provides a second set of control inputs. The second set of control inputs includes alternative inputs for at least some of the controls, as well as additional inputs not available using the tactile input mechanisms.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Drawings
FIG. 1 illustrates an exemplary gaming and media system.
Fig. 2 illustrates an exemplary use case of the present technology.
FIG. 3 shows a block diagram of an overview of components used to implement the present technology.
FIG. 4 is a block diagram of an exemplary system for implementing the present technology.
Fig. 5 is a flow chart illustrating an example of the present technology.
FIGS. 6A-10C are plan and side views of various embodiments for integrating a haptic controller with a touch screen interface controller.
FIGS. 11-16 illustrate various embodiments of primary content and auxiliary environments provided on a touch screen interface controller as discussed herein.
FIG. 17 is a flow chart illustrating a number of interfaces that may be provided.
FIG. 18 is a block diagram of an exemplary processing device.
FIG. 19 is a block diagram of an exemplary touch screen interface device.
FIG. 20 is a block diagram of an exemplary console device.
Detailed Description
Techniques are provided that allow for a secondary media or control experience on a touch-enabled controller while a user consumes interactive or shared content using a primary processing system and a primary haptic controller. The secondary controller may be provided through the use of an integrated, connected, or communicating processing device that adapts the secondary interface to the content being consumed. One aspect includes providing a secondary controller for a gaming experience or streaming media. An entertainment service provides content and tracks users' online activities. Based on the content the user selects for consumption in the entertainment system, the service determines an appropriate auxiliary experience for the touch screen interface and provides an experience that is integrated with the content. The content may also be provided from a third-party source, in which case the processing device or console may provide feedback to the entertainment service regarding the nature of the content.
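As an illustrative sketch only (the patent does not specify an algorithm), the following Python models this selection step; the `ContentType` and `AuxExperience` categories and the `select_auxiliary_experience` function are hypothetical names, not terms from the specification:

```python
# Hypothetical model of how a service might pick an auxiliary experience
# from the type of content being consumed. All names are illustrative.
from enum import Enum, auto

class ContentType(Enum):
    GAME = auto()
    STREAMING_MEDIA = auto()
    THIRD_PARTY = auto()

class AuxExperience(Enum):
    SECONDARY_CONTROLLER = auto()  # alternate control surface for a game
    GUIDE_INFO = auto()            # guidance info for streaming media
    NONE = auto()

def select_auxiliary_experience(content_type: ContentType,
                                reported_by_console: bool = False) -> AuxExperience:
    """Choose an auxiliary experience for the touch interface device.

    For third-party content, the console must first report the nature of
    the content back to the service (modeled here by reported_by_console);
    until then no auxiliary experience is selected.
    """
    if content_type is ContentType.THIRD_PARTY and not reported_by_console:
        return AuxExperience.NONE
    if content_type is ContentType.GAME:
        return AuxExperience.SECONDARY_CONTROLLER
    return AuxExperience.GUIDE_INFO
```

For instance, `select_auxiliary_experience(ContentType.GAME)` yields `SECONDARY_CONTROLLER`, while third-party content yields `NONE` until the console has reported it.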
The present technique may be used in conjunction with the main processing device shown in fig. 1, 18 and 20. FIG. 1 illustrates an exemplary gaming and media system. As shown in FIG. 1, gaming and media system 200 includes a gaming and media console (hereinafter collectively referred to as a "console") 202. Generally, console 202 is one type of computing system, as will be described further below. The console 202 is configured to accommodate one or more wireless controllers, as represented by controllers 204(1) and 204 (2). The console 202 is equipped with an internal hard disk drive (not shown) and a portable media drive 206 that supports various forms of portable storage media, as represented by an optical storage disc 208. Examples of suitable portable storage media include DVDs, CD-ROMs, game discs, and the like. Console 202 also includes two memory unit card sockets 225(1) and 225(2) for receiving removable flash-type memory units 240. Command buttons 235 on console 202 enable and disable wireless peripheral support.
Console 202 also includes an optical port 230 for wireless communication with one or more devices and two USB (Universal Serial bus) ports 210(1) and 210(2) to support wired connections for additional controllers or other peripheral devices. In some implementations, the number and arrangement of additional ports may be modified. A power button 212 and an eject button 214 are also located on the front face of the game console 202. The power button 212 is selected to power the game console and may also provide access to other features and controls, while the eject button 214 alternately opens and closes the tray of the portable media drive 206 to allow insertion and removal of the storage disc 208.
Console 202 is connected to a television or other display (e.g., monitor 250) via A/V interface cable 220. In one implementation, console 202 is equipped with a dedicated A/V port (not shown) configured for content-protected digital communication using A/V cable 220 (e.g., an A/V cable adapted for coupling to a high-definition multimedia interface "HDMI" port on high-definition display 16 or other display device). The power cable 222 provides power to the game console. Console 202 may further be configured with broadband capabilities, as represented by a cable or modem connector 224 to facilitate access to a network, such as the Internet. The broadband capability may also be provided wirelessly over a broadband network, such as a wireless fidelity (Wi-Fi) network.
Each controller 100 is coupled to a console 202 via a wired or wireless interface. In the illustrated implementation, the controller 100 is coupled to a console 202 via a wireless connection. Console 202 may be equipped with any of a variety of user interaction mechanisms. In the example shown in fig. 2, each controller 100 is equipped with two thumb sticks 112(a) and 112(b), a D-pad 116, a button 106, and two triggers 110.
These controllers 100 are merely representative; additional embodiments of the controllers 100 are discussed herein. Because the various controllers share several common elements, they are collectively labeled 100, with suffixes that vary according to the embodiment described herein.
In one implementation, a Memory Unit (MU) 240 may also be inserted into the controller 204 to provide additional and portable storage. Portable MUs allow users to store game parameters for use when playing on other consoles. In this implementation, each controller is configured to accommodate two MUs 240, but more or fewer than two MUs may also be employed.
The gaming and media system 200 is generally configured to play games stored on a memory medium, as well as to download and play games, and to reproduce pre-recorded music and video from electronic and hard media sources. Using different storage offerings, items may be played from a hard disk drive, from an optical disk medium (e.g., 208), from an online source, or from the MU 240. Examples of the types of media that the gaming and media system 200 is capable of playing include:
game items played from CD and DVD disks, from hard drives, or from online sources.
Digital music played from a CD in the portable media drive 206, from a file on a hard drive (e.g., music using Windows Media Audio (WMA) format), or from an online streaming source.
Digital audio/video played from a DVD disc in the portable media drive 206, from a file on a hard drive (e.g., active streaming format), or from an online streaming source.
During operation, the console 202 is configured to receive input from the controller 100 and display information on the display 16. For example, console 202 may display a user interface on display 250 to allow a user to select and display games using controller 100.
FIG. 2 illustrates a common user scenario that may be employed using the techniques described herein. In accordance with the present technology, a touch display controller is used in conjunction with a haptic controller to provide an ancillary experience alongside content 14 provided by entertainment system 200.
In fig. 2, two users 50 and 52 are shown sitting in front of a display device 16, on which display device 16 a segment 14 of shared content (in this example a tennis match) is displayed. Each user 50, 52 has an associated process controller 60, 62. Each controller 60, 62 has a respective associated touch assembly 64, 65. In fig. 2, the controller is shown as being integrated with the touch device, but the controllers 60, 62 may comprise any of the various controllers discussed herein. Also shown in FIG. 2 is an entertainment system 200, which entertainment system 200 may include a game console 202, a display device 16, and a capture device 20, all of which are discussed below with respect to FIGS. 18-20.
Fig. 2 also shows a second controller comprising a target recognition and tracking device 20. The target recognition and tracking device 20 may include, for example, a Microsoft Kinect® controller or the like, various embodiments of which are described in the following co-pending patent applications, all of which are specifically incorporated herein by reference: U.S. patent application serial No. 12/475,094, "Environment and/or Target Segmentation," filed May 29, 2009; U.S. patent application serial No. 12/511,850, "Auto Generating a Visual Representation," filed July 29, 2009; U.S. patent application serial No. 12/474,655, "Gesture Tool," filed May 29, 2009; U.S. patent application serial No. 12/603,437, "Pose Tracking Pipeline," filed October 21, 2009; U.S. patent application serial No. 12/475,308, "Device for Identifying and Tracking Multiple Humans Over Time," filed May 29, 2009; U.S. patent application serial No. 12/575,388, "Human Tracking System," filed October 7, 2009; U.S. patent application serial No. 12/422,661, "Gesture Recognizer System Architecture," filed April 13, 2009; and U.S. patent application serial No. 12/391,150, "Standard Gestures," filed February 23, 2009.
As shown in fig. 2, each user has their own controller equipped with a touch sensitive component 64, 65. The touch sensitive assembly is used in conjunction with the master controllers 60 and 62 to provide a secondary media control experience on the touch-enabled controllers.
Fig. 3 illustrates an exemplary embodiment of a haptic controller 100 with a touch sensitive device 400 to provide a secondary media control experience. As shown in FIG. 3, a user may view content on display 16 using console 202. The controller 100 may comprise a controller for an "Xbox" device.
Fig. 3 is a top view of the controller 100 with tactile or manual input. Although a particular controller is described, it is not intended to be limiting as many types of controllers can be used. The controller 100 includes a housing or body 102 that forms most of the exterior surface of the controller, having a shape that interfaces with a user's hand. A pair of handles 104 extend from a lower portion of the body. A set of input or action buttons 106 is located at the upper right portion of the body. These input buttons may be referred to as face buttons because of their orientation on the top surface of the body 102 of the controller. The input button may be a simple switch that generates a signal having a binary output to indicate the user's selection. In other examples, the input buttons may be pressure sensitive switches that generate signals indicative of different levels of selection by the user. Additional input buttons 108 are provided at an upper central location of the body and may provide additional functionality, such as for navigating a graphical user interface menu. The input buttons 108 may also provide binary or multi-level response signals. A set of input buttons 110 are provided on the top surface of the controller body 102, commonly referred to as triggers for their intended actuation by a finger. In many examples, these types of triggers are pressure sensitive, but this is not necessarily so.
A first analog thumb stick 112a is provided at the upper left portion of the face of the main body 102, and a second analog thumb stick 112b is provided at the lower right portion of the face of the main body 102. Each analog thumb stick allows so-called analog input by determining the exact angle of the thumb stick relative to a fixed base part. Furthermore, the analog thumb stick measures the amount of movement of the stick at that angle, generating a signal in response to different amounts of input in any direction.
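The angle-plus-magnitude behavior described above can be sketched in Python. This is a minimal sketch under assumed conventions: raw per-axis deflection in [-1, 1], and a deadzone value chosen for illustration (neither appears in the patent):

```python
import math

def read_thumbstick(raw_x: float, raw_y: float, deadzone: float = 0.15):
    """Convert raw stick deflection on each axis into the (angle, magnitude)
    pair an analog thumb stick reports: the direction of tilt in degrees and
    how far the stick is pushed. Small deflections inside the deadzone are
    treated as a centered stick."""
    magnitude = min(math.hypot(raw_x, raw_y), 1.0)
    if magnitude < deadzone:
        return 0.0, 0.0
    # Rescale so magnitude ramps from 0 at the deadzone edge to 1 at full tilt.
    magnitude = (magnitude - deadzone) / (1.0 - deadzone)
    angle = math.degrees(math.atan2(raw_y, raw_x)) % 360.0
    return angle, magnitude
```

A full rightward tilt, for example, reads back as angle 0 with magnitude 1, while a slight jitter near center reads as no input at all.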
A directional pad (D-pad) 114 is formed in a recess 116 at the central left portion of the face of the body 102. In other examples, the D-pad may be formed on the controller surface without a recess. The D-pad contains an actuation surface that includes a cross-shaped input pad 120 and four filler pieces 152. In this example, the input pad includes four input arms 128; in other examples, the input pad may include more or fewer than four input arms. In one example, the D-pad allows a user to provide directional input control corresponding to four different coordinate directions (e.g., north, south, east, west) for the four input arms 128.
The actuation surface layout of the D-pad 114 may be user configurable. In one example, the filler pieces 152 may be moved relative to the input pad 120 to change the distance between the upper surface of the input pad 120 and the upper surface of the filler pieces. In this way, the actuation surface layout of the D-pad may be altered by the user. With the filler pieces 152 in a raised position relative to the input pad 120, an annular or disc-shaped actuation configuration is provided; with the filler pieces in a lowered position relative to the upper surface of the input pad, a cruciform actuation configuration is provided.
In one embodiment, the input pad 120 and the filler pieces 152 are rotatable within the recess 116 about a central axis of the directional pad that extends perpendicular to the central portion of the actuation surface. Rotation of the input pad 120 and the filler pieces 152 causes linear translation of the filler pieces parallel to the central axis. By rotating the directional pad 114 about the central axis in a clockwise or counterclockwise direction, the surface layout of the actuation surface 118 may be changed. The linear translation of the filler pieces changes the distance between the upper surface of the input arms 128 and the upper surface of the filler pieces 152, thereby changing the actuation surface layout of the directional pad.
Device 400 may be a touch-enabled processing device, such as described below with respect to fig. 4 and 19. The touch-enabled processing device may be wirelessly coupled to console 202 via an internet connection or to console 202 via cable 302 and connector 304. The device 400 may interact with the controller 20 or the controller 100 to provide an auxiliary experience in conjunction with the content 14 being consumed on the main display 16 and the console 202.
FIG. 4 is a block diagram illustrating a system suitable for implementing the present technology. FIG. 4 illustrates various use cases and various system components. Fig. 4 shows users 53, 55, 57, each interacting with their own display 16, main processing device 202, and one or more controllers. Each of these users consumes content that may be provided by, for example, entertainment service 480 or third party provider 425. The entertainment services may include a content store 470, and the content store 470 may include a library of streaming media, games, and other applications for use by the users 53, 55, 57. The entertainment service may include a user profile store 460, the user profile store 460 including information records regarding online and content consumption activities of each user of the service 480. The user profile store 460 may include information such as a user's social graph selected from online activities and third party social network feeds 420, as well as user engagement with a gaming application provided by the entertainment service 480. The content manager 462 may determine the relationship between the different types of content 470 and other users of the same or similar content provided by the entertainment service 480 and the activities in which the users 53, 55, 57 are engaged when using any of the processing devices discussed herein.
Content from third-party providers 425 may be displayed directly by the console 202 or consumed through the service 480. These providers 425 may include social network feeds 420, commercial content feeds 422, commercial audio/video feeds 424, other gaming systems 426, and private audio/visual feeds 428. Examples of commercial content services 422 include news service feeds from identified news service agents as well as RSS feeds. Commercial audio/video services 424 may include entertainment streams from broadcast networks or other commercial services that provide streaming media entertainment. The gaming services 426 may include content from gaming services other than those provided by the entertainment service 480. Private audio/video feeds 428 may include both audio/visual feeds available through a social network and those available through commercial audio/video websites such as YouTube.
Entertainment service 480 may also include a touch interface device controller 464. The touch interface device controller 464 may determine the user interface 410 that should be presented on the interface device 400. Touch interface device controller 464 may provide instructions to touch interface device 400 that allow the touch interface device to provide an auxiliary experience: to render a user interface, and to provide control instructions back to the entertainment service or a third-party service to control content presented on the respective display device 16.
As shown in FIG. 4, touch interface device 400 can be coupled to processing system 202 and entertainment services in a variety of ways. As shown with respect to user 53, device 400-1 is integrated with controller 100-1 by physically attaching touch interface device 400-1 to controller 100-1. Various examples of physical couplings are described below, but may include cabling, physically connecting devices in interface ports on each device, or a fully integrated touch interface device built into controller 100. As shown with respect to user 55, touch interface device 400-2 may communicate wirelessly with controller 100-2. Similarly, controller 100-2 may communicate wirelessly with console 202, and instructions to the touch interface device may be provided from interface device controller 464 via console 202 or may be provided directly from console 202. As shown with respect to user 57, touch interface device 400 may communicate directly with network 90, network 90 may be a combination of public and private networks such as the Internet, and touch interface device 400 may receive instructions from console 202 or from interface device controller 464. In the user 57 embodiment, the controller 100 is also in communication with a console 202. In an alternative embodiment, controller 100 may communicate with network 90 to control both console 202 and content provided from content store 470 and third party system 425.
As shown in FIG. 4, the general components of touch interface device 400 include a processor 404 that can execute instructions for providing a user interface 410, a network interface 402, volatile memory 406, and non-volatile memory 408. Various capabilities of touch interface device 400 are described herein. The methods described below may be implemented as instructions executable by the processor 404, as well as by the console 202, controller 100, and controller 20, to perform the methods described herein.
FIG. 5 illustrates a general flow diagram of a method in accordance with the present technology. At 510, the touch interface device is coupled to a controller, such as controller 100, and capabilities of the touch interface device can be determined. In some embodiments, the touch interface device is integrated in the controller and step 510 need not be performed.
In some embodiments, touch interface device 400 may constitute any of a number of different processing devices, such as smart phones and media players, that have a universal connection port or wireless connection capability to allow them to be coupled to a controller or to console 202 or to network 90 and services 480. In such cases, the capabilities of the device are determined at 520. In one embodiment, the touch interface device is an integrated device or a known device designed specifically for use with the controller 100. In such embodiments, step 520 need not be performed.
At 530, the user selects to receive or participate in content provided from the service 480, or by a third party, or in conjunction with a processing device such as the console 202. At 540, a determination is made regarding the type of auxiliary experience that may be presented on the touch interface device (if any) based on the type of content presented. Various examples of secondary experiences are described below. If content is presented from the service 480, the service 480 will know which content is being presented to the user and may determine whether auxiliary content, a user interface or controller, or some other auxiliary experience should be provided to the touch interface device 400. If content is provided from a third party service, console 202 may provide feedback to service 480, and service 480 may then determine which auxiliary experiences should be provided to the user. At 550, an auxiliary experience is presented on the interface device in conjunction with the presented content.
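The steps above (510 through 550) can be sketched as a short Python function. This is a sketch only: the device and content are modeled as plain dicts, and the dict keys and step strings are illustrative, not from the patent:

```python
# Hypothetical walkthrough of the FIG. 5 flow. Steps 510 and 520 are
# skipped for integrated or known devices, as the text describes.
def run_auxiliary_flow(device: dict, content: dict) -> list:
    steps = []
    # 510: couple the touch interface device to the controller,
    # unless the device is integrated into the controller.
    if not device.get("integrated", False):
        steps.append("510: couple device to controller")
    # 520: determine device capabilities, unless the device is
    # integrated or is a known device designed for the controller.
    if not (device.get("integrated", False) or device.get("known", False)):
        steps.append("520: determine device capabilities")
    # 530: the user selects content to receive or participate in.
    steps.append("530: user selects " + content["type"] + " content")
    # 540: decide which auxiliary experience fits the content; for
    # third-party content the console first reports it to the service.
    if content.get("third_party", False):
        steps.append("540: console reports content to service")
    steps.append("540: determine auxiliary experience")
    # 550: present the auxiliary experience alongside the content.
    steps.append("550: present auxiliary experience")
    return steps
```

With an integrated device, the flow begins directly at step 530; with an unknown general-purpose device consuming third-party content, all five steps (plus the console's report to the service) appear.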
Fig. 6A and 6B illustrate an alternative for connecting the touch interface device 400-6 to the controller 100-6. In fig. 6A and 6B, touch interface device 400-6 is any one of a number of general purpose devices that may be used in conjunction with controller 100-6. The controller 100-6 generally corresponds to the controller 100 discussed above and is equipped with a connector for the cable 602, which may be adapted for use with any of a number of different interface devices using the plug 604. The connector may be a standard connector such as a USB or mini-USB connector.
In FIGS. 6A and 6B, a hardware base 610 including arms 612 and 614 is used to connect the touch interface device 400-6 to the controller 100. In this manner, touch interface device 400-6 may be any general purpose touch interface device and may be used by a user of controller 100 to receive an auxiliary experience with respect to content presentation on a display. One end of each arm 612, 614 may be inserted into a respective coupling hole of the controller 100-6, and a second end of each arm may include a bracket that fixes the touch interface device 400-6 with respect to the controller 100-6. As shown in FIG. 6B, the base 610 may allow the touch interface device 400-6 to be positioned at a variety of angles with respect to the controller 100-6.
Touch interface device 400-6 may include a camera 630, with camera 630 placed on a face of the device associated with the touch-sensitive surface. It is well known that many touch devices include a second camera on the back of the device. Placing the device at an angle relative to the controller 100-6 allows for different fields of view of the camera and provides alternative inputs to the service 480 to provide a variety of ancillary experiences as described below.
As shown in fig. 6B, the controller 100-6 may also include a forward-facing camera 620. The forward facing camera has a field of view directed in the direction the controller is pointed. This gives the system 480 multiple fields of view and adds to the functionality of the system described below.
FIGS. 7A and 7B illustrate a second touch interface device 400-7 that is adapted to be received into a slot 704 of the controller 100-7. In this embodiment, physical connectors on the touch interface device 400-7 and on the controller 100-7 mate in a manner that allows an electrical connection between the two devices. The slot 704 provides structural rigidity to the interface device 400-7. Additionally, note that the orientation of the device 400-7 is in a "portrait" mode relative to the controller 100-7. Alternative embodiments are discussed below.
As shown in FIG. 7B, the slot (or other coupling component) may be adapted to allow the touch interface device 400-7 to have a varying angle with respect to the controller 100-7. FIG. 7A also shows camera 630 on touch interface device 400-7 and camera 620 on controller 100-7.
As shown in fig. 8A and 8B, the controller 100-8 has been adapted to receive a laterally mounted interface device 400-8. Device 400-8 may be configured to plug into one or more connections in controller 100-8, and controller 100-8 includes all of the tactile elements provided above. Touch interface device 400-8 may be a dedicated touch interface device suitable for use with controller 100-8, or controller 100-8 may be adapted to accommodate any of a number of different devices using standard connections. Again, as shown in FIG. 8B, the slot (or other coupling component) may be adapted to allow the touch interface device 400-8 to have a varying angle with respect to the controller 100-8.
FIGS. 9A and 9B illustrate another controller 100-9 with an integrated touch interface device 400-9. The integrated touch interface device 400-9 need not be considered a separate interface device, but may instead be considered a touch interface screen integrated in or on the controller 100-9. In this embodiment, the processing components of the general purpose interface device 400 shown in fig. 4 may be present. Again, controller 100-9 includes all of the haptic control elements of the other embodiments.
Fig. 9A and 9B also illustrate the use of additional cameras placed in other portions of the controller, where cameras 630 and 640 provide alternative views of the user environment that may be used in the auxiliary experiences described below. It is to be understood that these cameras may be provided in any of the various embodiments described herein.
FIGS. 10A-10C illustrate an alternative placement of the touch interface device 400-10 relative to the controller 100-10. As shown, the controller 100-10 mounts the touch interface device 400-10 below the handles 104. The angle at which the device is set may be selected through physical adjustments within the controller, the provision of alternate slots in the controller for receiving the device, or other mechanical components that allow the user to adjust the screen angle, as shown in FIGS. 10A and 10B.
FIGS. 11-17 illustrate a number of examples of secondary interfaces provided on a touch display controller. The secondary interface may be adapted to the content the user is consuming. The following description is exemplary; any number of different auxiliary interfaces may be provided based on the type of content selected. Typically, these include a user help interface, a secondary controller interface, or an alternate view interface. The secondary controller interface may provide an alternative set of control signals for game control that are not provided by the haptic control elements, or control means that serve as alternatives to the haptic control elements. As such, for a set of control signals for content provided by the controller and the touch display, one subset may be provided by the haptic controller and a second subset may be provided by the touch display interface. The subsets may be completely separate, partially overlapping, or completely overlapping. For example, as discussed below, in a gaming application an alternative user interface or help screen may be provided in connection with a game. In a streaming media environment, additional guide information related to the streaming media may be presented. In addition, alternative forms of controls or supplemental information may be provided, all within the context of the type of content or media being consumed by the user.
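The subset relationships described above (disjoint, partially overlapping, or completely overlapping control-signal sets) can be expressed with ordinary set operations. In this sketch the control-signal names are made up for the example; only the three relationship categories come from the text:

```python
# Classify how the haptic controller's control-signal subset relates to
# the touch display's subset. Signal names are illustrative.
def classify_split(haptic: set, touch: set) -> str:
    """Return which of the three relationships described in the text
    holds between the two subsets of control signals."""
    overlap = haptic & touch
    if not overlap:
        return "completely separate"
    if haptic == touch:
        return "completely overlapping"
    return "partially overlapping"
```

For example, a haptic set of `{"move", "jump"}` paired with a touch set of `{"map", "chat"}` is completely separate, while `{"move", "jump"}` paired with `{"jump", "map"}` is partially overlapping.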
FIG. 11 illustrates an exemplary view of the combined auxiliary experience that a user may see when playing a game on the display 16. The display 16 shows a tennis game 1102 in which a tennis player 1104 hits a ball 1106 over a net 1110. As will be generally understood, during a tennis game a user has multiple strokes that can be made with respect to the ball, and the thumb sticks 112a and 112b can be used to position the player, with different types of strokes performed by pressing the corresponding buttons 106. FIG. 11 shows an example of an auxiliary interface including a help screen 1130, where the user is provided with instructions on how to use the controller with respect to the game. In this context, the instructions are relatively basic with respect to the game. In another context, the content of the help screen 1130 may change, because the service 480 controls the game and knows where in the game the user is participating. For example, in a role-playing game where a user is challenged to complete several different types of challenges, if the user fails a certain number of times on a particular challenge, the secondary interface may prompt the user to indicate whether the user wishes to see how other members or participants in the game are addressing that level. This may include video walk-throughs, step-by-step instructions, basic prompts or advice, or any other alternative type of assistance, without disturbing the main experience of the game 1102 as it appears on the display 16.
FIG. 12 illustrates a scenario in which a user playing a role-playing game in a first-person view 1202 may control other members in a team environment. The role-playing game view 1202 provides a first-person view, past a weapon 1206, into the environment. As shown in FIG. 12, the environment on the display 16 includes fences 1204 and 1214, a building 1216, and other elements. Some of these elements, as well as other players, may be present in the game world but outside the first-person field of view 1202.
In this example, the auxiliary experience provided on the display 400-12 shows two other users 1250 and 1252 who may be on the user's team. One example of a secondary interface allows the operator of the controller 100-12 to position the other users 1250 and 1252 if they are members of a team-based game and the operator of the controller 100-12 is the master player. To position a player, the player may be dragged to a different location, for example by touching the player and sliding a finger across the touch interface screen 400-12 to move the player to the requested location. Various types of team scenarios may be used in conjunction with the auxiliary experience. For example, the screen may offer more than simple control over player positions on the screen; it may allow the user to communicate visually and audibly with other members. Touching user 1252 may open an audio channel with that team member so that instructions can be communicated via audio. Alternatively, touching user 1252 may generate a menu of preprogrammed instructions, from which the operator of the controller 100-12 need only select one to communicate to the team member. Alternatively, the secondary interface may simply provide an overhead map of the environment showing elements that are not visible in the first-person view. In yet another alternative, the touch interface 400-12 may provide additional information or help cues about objects in the secondary interface.
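The drag-to-reposition interaction can be sketched as a small state machine on the touch interface: a touch near a teammate icon picks it up, and lifting the finger issues a move order back to the game. All names, the coordinate scheme, and the hit radius are assumptions for illustration only.

```python
# Sketch of dragging a teammate on the companion touch screen. A touch
# within hit_radius of an icon starts a drag; touch_up commits the new
# position and returns a control signal for the game service.
class TeamMapController:
    def __init__(self, positions, hit_radius=24):
        self.positions = dict(positions)   # teammate id -> (x, y) on screen
        self.hit_radius = hit_radius
        self.dragging = None

    def touch_down(self, x, y):
        # Pick the first teammate icon within the hit radius, if any.
        for tid, (px, py) in self.positions.items():
            if abs(x - px) <= self.hit_radius and abs(y - py) <= self.hit_radius:
                self.dragging = tid
                return tid
        return None

    def touch_up(self, x, y):
        # Releasing the finger commits the position as a move order.
        if self.dragging is None:
            return None
        tid, self.dragging = self.dragging, None
        self.positions[tid] = (x, y)
        return ("move", tid, (x, y))       # control signal sent to the game
```

In use, the touch interface would forward the returned `("move", …)` tuple to the service as one of the second set of control signals.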
FIG. 13 illustrates a second scenario using a role-playing game similar to that shown in FIG. 12. In this case, the user is provided on interface 400-13 with an alternate first-person view of the gaming environment, which may include a rear view of what is happening behind the user. In this example, the user may see that a potential danger, such as another character 1310, is behind the operator of the controller 100-13 in the virtual environment of the game view 1202. Character 1310 appears only on the auxiliary interface 400-13, unless the user turns around and looks behind in the virtual environment. Alternatively, the interface 400-13 may use the cameras discussed above to provide alternative views of the environment in which the user is located, or to display data interpreted from real-world people within the user's vicinity and introduce these environmental variables into the gaming experience.
FIG. 14 illustrates an embodiment using an alternative control device that may be more advantageous for certain types of games than the tactile controls on the controller 100-14. The touch interface 400-14 may be used in games where user control is aided by analog input, such as sliders or dials. In this embodiment, the touch interface 400-14 is used to play a targeting game 1400 that appears on the display 16. In this game, the user must pull a slingshot 1402 backward to obtain sufficient projectile velocity to hit a target 1404. FIG. 14 shows a force slider interface on the device 400-14, in which the user slides a finger from an initial contact point 1406 to a second contact point 1408 and then lifts the finger from the screen of the device 400-14 to release the projectile in the game 1400. Such analog controls may be more easily presented on the device 400-14 and allow additional user control options.
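The force-slider mapping can be sketched as a function from the two contact points to a launch velocity. The constants and the linear pull-to-speed mapping are illustrative tuning assumptions, not values from the patent.

```python
import math

# Sketch of the slingshot force slider: the distance dragged from the
# initial contact point 1406 to the release point 1408 sets the launch
# speed (clamped at full pull), and the projectile flies opposite the
# pull direction. MAX_PULL and MAX_SPEED are assumed tuning values.
MAX_PULL = 200.0    # drag distance (pixels) for full power
MAX_SPEED = 60.0    # launch speed at full pull

def launch_velocity(start, end):
    dx, dy = end[0] - start[0], end[1] - start[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return (0.0, 0.0)          # no pull, no launch
    speed = MAX_SPEED * min(dist, MAX_PULL) / MAX_PULL
    # Unit vector opposite the pull, scaled by the computed speed.
    return (-dx / dist * speed, -dy / dist * speed)
```

On release, the touch interface would report this velocity to the game as the projectile's initial condition.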
In the example of FIG. 14, the game may provide the same targeting and control mechanisms through tactile control as through the interface 400-14. Thus, in such an embodiment, the set of control signals from the haptic device may overlap with that from the interface 400-14.
FIG. 15 illustrates yet another embodiment of an auxiliary experience that may be implemented by haptic control on the controller 100-15 or on the interface 400-15. In a poker game 1500, a user typically does not want the other users in the game to know his or her cards. The operator of the controller 100-15 may have their cards presented to them on the touch interface 400-15. A user may participate in the card game 1500 on the display 16, using touch input 1504 on a card 1506 on the user's own device (not visible to the other players in the game), even though the display is shared by all players in the game. Such an embodiment is useful in a scenario such as that shown in FIG. 2, where two users playing the same game need access to their own secret information that is not intended to be shared with the other players. The interface 400-15 may partially or completely replace the use of haptic controls on the controller 100-15.
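The public/private split in the card game reduces to routing different slices of the game state to different displays: the shared screen renders only public state, while each controller's touch screen receives that player's hand. The sketch below illustrates this; all field names are assumptions.

```python
# Sketch of splitting game state between the shared display and a
# player's private touch screen. The shared view omits all hands; the
# private view contains only the requesting player's hand.
def split_views(game_state, player_id):
    shared_view = {
        "pot": game_state["pot"],
        "community_cards": game_state["community_cards"],
    }
    private_view = {"hand": game_state["hands"][player_id]}
    return shared_view, private_view
```

Because the hands never appear in the shared view, two players using the same display 16 each see their secret cards only on their own interface 400-15.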
FIG. 16 illustrates another auxiliary experience, which includes a notification system. In FIG. 16, the auxiliary experience on the display 400-16 includes notifications that other users are waiting for the operator of the controller 100-16 to play a different game. In this scenario, the operator of the controller 100-16 is playing a role-playing game using the view 1202 discussed above. Meanwhile, other users may send messages 1604 and 1608, or any other type of notification, asking whether the operator of the controller 100-16 would like to participate in other types of games. Depending on the type of notification, software response control buttons 1610, 1612, 1614, 1616, 1618, and 1620 may be provided to allow the operator of the controller 100-16 to easily respond to, or simply ignore, such notifications. It should be appreciated that any number of different types of notifications and notification controls may be implemented in the auxiliary experience.
FIG. 17 illustrates a flowchart of a more detailed method highlighting various embodiments of the present technology. At step 1702, a user selects content to be presented to, or engaged in by, the user. Based on the selected content, an auxiliary experience is generated and presented on the touch interface device.
If the content is a game at 1704, the service 480 selects, at 1706, the components of the auxiliary experience that should be displayed to the user. At 1708, the service sends the components to the touch interface device. Once the control elements are received at 1710, the user can utilize them to control the game at 1712. The control elements of the secondary experience on the touch interface device generate control signals that are returned to the service 480 to control the game according to the particular requirements of the game.
At 1714, if the content calls for a help screen, a prompt to display help may be provided. At 1716, when the help screen is invoked, the service 480 may determine where the user is in the game, application, or other content, along with the user's history for that game, application, or content. This may help the service 480 provide the correct type of help, or offer options for the user to request different types of help. At 1718, an appropriate help type is selected; it may be selected automatically by the gaming service 480, or the user may be prompted to select a particular help type, which is then displayed at 1719. The assistance can take many forms, including those discussed above. For example, a video of how to perform the game task may be played for the user, or the user may be shown how other users solved the problem with the application.
After the user selects content at 1702, a notification may be received at 1720. At 1722, the service 480 may determine whether the notification is of a type the user may wish to view. Any number of filters may be used to make this determination. For example, all notification messages received from a particular level of the user's social graph may be allowed to pass, while the user may have specified that certain categories of notifications, such as invitations to play games, are not to be received. Once the system determines that the notification should be provided, the system may display the notification in an appropriate manner at 1722.
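The filtering decision at 1722 can be sketched as a predicate combining the two example filters above: a social-graph distance cutoff and a set of blocked categories. The field names and preference keys are illustrative assumptions.

```python
# Sketch of the notification filter: a notification passes only if its
# category is not blocked and its sender is within the allowed level of
# the user's social graph. Lower sender_level means closer to the user.
def should_display(notification, prefs):
    if notification["category"] in prefs.get("blocked_categories", set()):
        return False
    return notification["sender_level"] <= prefs.get("max_social_level", 1)
```

A notification that passes the filter would then be rendered on the auxiliary display with the appropriate response buttons.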
As one of ordinary skill will appreciate, many types of content may be provided by the service 480 or by a third-party provider. For any type of content at 1728, once the service 480 determines the content type at 1730, an auxiliary experience may be provided. At 1732, the system determines the controls, information, or applications suitable for use in the secondary experience, and at 1734 provides the secondary UI experience to the touch screen controller. As described, the service 480 may determine the user's viewing history and other online activity, along with the currently streaming content, through feedback from the console 202 or directly from the user, and this feedback may be used to provide a secondary interface in different contexts.
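The overall selection step of FIG. 17 can be sketched as a dispatcher: the service maps the determined content type to the set of secondary-experience components it then sends to the touch interface device. The content-type strings and component names below are hypothetical.

```python
# Sketch of steps 1730-1734: determine the content type, then choose the
# secondary-experience components to send to the touch screen controller.
def build_secondary_experience(content_type):
    if content_type == "game":
        return ["control-elements", "help-screen", "notifications"]
    if content_type == "stream":
        return ["guide-info", "transport-controls", "notifications"]
    # Fallback for any other content type.
    return ["info-panel", "notifications"]
```

In a fuller implementation the chosen components could also be conditioned on the user's viewing history and online activity, as the description notes.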
FIG. 18 illustrates an example of a suitable computing system environment that may be used in the foregoing techniques as any of the processing devices described herein. Multiple computing systems may be used as servers to implement location services.
With reference to FIG. 18, an exemplary system for implementing the invention includes a general purpose computing device in the form of a computer 710. Components of computer 710 may include, but are not limited to, a processing unit 720, a system memory 730, and a system bus 721 that couples various system components including the system memory to the processing unit 720. The system bus 721 may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. By way of example, and not limitation, such architectures include Industry Standard Architecture (ISA) bus, Micro Channel Architecture (MCA) bus, Enhanced ISA (EISA) bus, Video Electronics Standards Association (VESA) local bus, and Peripheral Component Interconnect (PCI) bus also known as Mezzanine bus.
Computer 710 typically includes a variety of computer readable media. Computer readable media can be any available media that can be accessed by computer 710 and includes both volatile and nonvolatile media, removable and non-removable media. By way of example, and not limitation, computer readable media may comprise computer storage media and communication media. Computer storage media includes volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by computer 710.
The system memory 730 includes computer storage media in the form of volatile and/or nonvolatile memory such as Read Only Memory (ROM) 731 and Random Access Memory (RAM) 732. A basic input/output system 733 (BIOS), containing the basic routines that help to transfer information between elements within computer 710, such as during start-up, is typically stored in ROM 731. RAM 732 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 720. By way of example, and not limitation, FIG. 18 illustrates operating system 734, application programs 735, other program modules 736, and program data 737.
The computer 710 may also include other removable/non-removable, volatile/nonvolatile computer storage media. By way of example only, FIG. 18 illustrates a hard disk drive 741 that reads from or writes to non-removable, nonvolatile magnetic media, a magnetic disk drive 751 that reads from or writes to a removable, nonvolatile magnetic disk 752, and an optical disk drive 755 that reads from or writes to a removable, nonvolatile optical disk 756 such as a CD-ROM or other optical media. Other removable/non-removable, volatile/nonvolatile computer storage media that can be used in the exemplary operating environment include, but are not limited to, magnetic tape cassettes, flash memory cards, digital versatile disks, digital video tape, solid state RAM, solid state ROM, and the like. The hard disk drive 741 is typically connected to the system bus 721 through a non-removable memory interface such as interface 740, and magnetic disk drive 751 and optical disk drive 755 are typically connected to the system bus 721 by a removable memory interface, such as interface 750.
The drives and their associated computer storage media discussed above and illustrated in FIG. 18 provide storage of computer readable instructions, data structures, program modules and other data for the computer 710. In FIG. 18, for example, hard disk drive 741 is illustrated as storing operating system 744, application programs 745, other program modules 746, and program data 747. Note that these components can either be the same as or different from operating system 734, application programs 735, other program modules 736, and program data 737. Operating system 744, application programs 745, other program modules 746, and program data 747 are given different numbers here to illustrate that, at a minimum, they are different copies. A user may enter commands and information into the computer 710 through input devices such as a keyboard 762 and pointing device 761, commonly referred to as a mouse, trackball or touch pad. Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, or the like. These and other input devices are often connected to the processing unit 720 through a user input interface 760 that is coupled to the system bus, but may be connected by other interface and bus structures, such as a parallel port, game port or a Universal Serial Bus (USB). A monitor 791 or other type of display device is also connected to the system bus 721 via an interface, such as a video interface 790. In addition to the monitor, computers may also include other peripheral output devices such as speakers 797 and printer 796, which may be connected through an output peripheral interface 790.
The computer 710 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 780. The remote computer 780 may be a personal computer, a server, a router, a network PC, a peer device or other common network node, and typically includes many or all of the elements described above relative to the computer 710, although only a memory storage device 781 has been illustrated in FIG. 18. The logical connections depicted in FIG. 18 include a Local Area Network (LAN) 771 and a Wide Area Network (WAN) 773, but may also include other networks. Such networking environments are commonplace in offices, enterprise-wide computer networks, intranets and the Internet.
When used in a LAN networking environment, the computer 710 is connected to the LAN 771 through a network interface or adapter 770. When used in a WAN networking environment, the computer 710 typically includes a modem 772 or other means for establishing communications over the WAN 773, such as the Internet. The modem 772, which may be internal or external, may be connected to the system bus 721 via the user input interface 760, or other appropriate mechanism. In a networked environment, program modules depicted relative to the computer 710, or portions thereof, may be stored in the remote memory storage device. By way of example, and not limitation, FIG. 18 illustrates remote application programs 785 as residing on memory device 781. It will be appreciated that the network connections shown are exemplary and other means of establishing a communications link between the computers may be used.
FIG. 19 is a block diagram of an exemplary mobile device that may operate as a touch interface device in embodiments of the present technology. Exemplary electronic circuitry of a typical mobile device is depicted. The mobile device 900 includes one or more microprocessors 912, and memory 1010 (e.g., non-volatile memory such as ROM and volatile memory such as RAM) that stores processor-readable code which is executed by the one or more processors 912 to implement the functionality described herein.
The mobile device 900 may include, for example, the processor 912 and the memory 1010, including applications and non-volatile storage. The applications may include the secondary interface provided on the user interface 918. The processor 912 may implement communications, as well as any number of applications, including the interactive applications described herein. The memory 1010 can be any of a variety of memory storage media types, including non-volatile and volatile memory. The device operating system handles the different operations of the mobile device 900 and may contain user interfaces for operations such as placing and receiving phone calls, text messaging, checking voicemail, and the like. The applications 1030 may be any kind of program, such as a camera application for photos and/or videos, an address book, a calendar application, a media player, an internet browser, games, other multimedia applications, an alarm application, other third-party applications, the interactive applications discussed herein, and so forth. The non-volatile storage component 1040 in memory 1010 contains data such as web caches, music, photos, contact data, scheduling data, and other files.
The processor 912 also communicates with RF transmit/receive circuitry 906, which in turn is coupled to an antenna 902; with an infrared transmitter/receiver 908; with any additional communication channels 1060, such as Wi-Fi or Bluetooth; and with a movement/orientation sensor 914, such as an accelerometer. Accelerometers have been incorporated into mobile devices to enable applications such as intelligent user interfaces that let users input commands through gestures, indoor GPS functionality that calculates the movement and direction of the device after contact with a GPS satellite is lost, and detection of the device's orientation so that the display automatically changes from portrait to landscape when the device is rotated. An accelerometer may be provided, for example, by a micro-electromechanical system (MEMS), which is a tiny mechanical device (of micrometer dimensions) built on a semiconductor chip, through which acceleration direction, as well as orientation, vibration, and shock can be sensed. The processor 912 further communicates with a ringer/vibrator 916, a user interface/keypad/screen 918, one or more speakers 1020, a microphone 922, a camera 924, a light sensor 926, and a temperature sensor 928. The user interface, keypad, and screen may comprise a capacitive touch screen, in accordance with well-known principles and techniques.
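The portrait/landscape decision mentioned above can be sketched from the accelerometer's gravity components: whichever axis gravity dominates determines the display mode. The function name, axis convention, and dead-zone refinement are illustrative assumptions.

```python
# Sketch of orientation detection from accelerometer readings (ax, ay),
# the gravity components along the screen's horizontal and vertical axes.
# If the device is lying roughly flat, neither axis dominates and the
# current mode is kept (returned as None here).
def display_orientation(ax, ay, dead_zone=0.1):
    if abs(ax) < dead_zone and abs(ay) < dead_zone:
        return None                 # device roughly flat; keep current mode
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A device following this sketch would re-evaluate the mode on each sensor update and re-render the display when the returned value changes.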
The processor 912 controls the transmission and reception of wireless signals. During a transmit mode, processor 912 provides a voice signal or other data signal from microphone 922 to RF transmit/receive circuitry 906. Transmit/receive circuitry 906 transmits the signal to a remote station (e.g., a fixed station, carrier, other cellular telephone, etc.) for communication via antenna 902. The ringer/vibrator 916 is used to signal an incoming call, text message, calendar reminder, alarm clock reminder, or other notification to the user. During a receive mode, the transmit/receive circuitry 906 receives voice or other data signals from a remote station via the antenna 902. The received voice signals are provided to the speaker 1020, while other received data signals are also processed appropriately.
In addition, a physical connector 988 may be used to connect the mobile device 900 to an external power source, such as an AC adapter or powered docking station. The physical connector 988 may also serve as a data connection to the computing device and/or various embodiments of the controller 100 described herein. The data connection allows operations such as synchronizing mobile device data with computing data on another device.
A GPS transceiver 965, which uses satellite-based radio navigation to relay the position of user applications, is enabled for such services.
The example computer systems illustrated in the figures include examples of computer-readable storage media. Computer readable storage media are also processor readable storage media. Such media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, program modules, or other data. Computer storage media includes, but is not limited to, RAM, ROM, EEPROM, cache, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical disk storage, memory sticks or cards, magnetic cassettes, magnetic tape, a media drive, a hard disk, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
FIG. 20 is a block diagram of another embodiment of a computing system that may be used to implement the console 202. In this embodiment, the computing system is a multimedia console 800, such as a gaming console. As shown in FIG. 20, the multimedia console 800 has a Central Processing Unit (CPU) 801 and a memory controller 802 that facilitates processor access to various types of memory, including a flash Read Only Memory (ROM) 803, a Random Access Memory (RAM) 806, a hard disk drive 808, and a portable media drive 805. In one implementation, the CPU 801 includes a level 1 cache 810 and a level 2 cache 812 to temporarily store data and thus reduce the number of memory access cycles made to the hard disk drive 808, thereby improving processing speed and throughput.
The CPU 801, the memory controller 802, and various memory devices are interconnected via one or more buses (not shown). The details of the bus used in this implementation are not particularly relevant to understanding the subject matter discussed herein. It should be understood, however, that such a bus may include one or more of serial and parallel buses, a memory bus, a peripheral bus, and a processor or local bus using any of a variety of bus architectures. By way of example, such architectures can include an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, and a Peripheral Component Interconnect (PCI) bus, also known as a mezzanine bus.
In one implementation, the CPU 801, memory controller 802, ROM 803, and RAM 806 are integrated onto a common module 814. In this implementation, ROM 803 is configured as a flash ROM that is connected to the memory controller 802 via a PCI bus and a ROM bus (neither of which are shown). RAM 806 is configured as multiple Double Data Rate Synchronous Dynamic RAM (DDR SDRAM) modules that are independently controlled by the memory controller 802 via separate buses (not shown). The hard disk drive 808 and the portable media drive 805 are shown connected to the memory controller 802 via a PCI bus and an AT Attachment (ATA) bus 816. However, in other implementations, dedicated data bus structures of different types may alternatively be applied.
The graphics processing unit 820 and the video encoder 822 form a video processing pipeline for high speed and high resolution (e.g., high definition) graphics processing. Data is carried from the graphics processing unit (GPU) 820 to the video encoder 822 via a digital video bus (not shown). Lightweight messages generated by system applications (e.g., pop-ups) are displayed by using a GPU 820 interrupt to schedule code to render the pop-up into an overlay. The amount of memory used for an overlay depends on the overlay area size, and the overlay preferably scales with the screen resolution. Where a full user interface is used by a concurrent system application, it is preferable to use a resolution that is independent of the application resolution. A scaler may be used to set this resolution, eliminating the need to change frequency and cause a TV resync.
An audio processing unit 824 and an audio codec (coder/decoder) 826 form a corresponding audio processing pipeline for multi-channel audio processing of various digital audio formats. Audio data is carried between the audio processing unit 824 and the audio codec 826 via a communication link (not shown). The video and audio processing pipelines output data to an A/V (audio/video) port 828 for transmission to a television or other display. In the illustrated implementation, the video and audio processing components 820-828 are mounted on the module 214.
FIG. 20 shows the module 814 including a USB host controller 830 and a network interface 832. The USB host controller 830 is shown in communication with the CPU 801 and the memory controller 802 via a bus (e.g., a PCI bus), and serves as host for peripheral controllers 804(1)-804(4). The network interface 832 provides access to a network (e.g., the Internet, a home network, etc.) and may be any of a wide variety of wired or wireless interface components, including an Ethernet card, a modem, a wireless access card, a Bluetooth module, a cable modem, and the like.
In the implementation depicted in FIG. 20, the console 800 includes a controller support subassembly 840 for supporting four controllers 804(1)-804(4). The controller support subassembly 840 includes any hardware and software components needed to support wired and wireless operation with external control devices, such as, for example, media and game controllers. A front panel I/O subassembly 842 supports the multiple functionalities of the power button 812, the eject button 813, as well as any LEDs (light emitting diodes) or other indicators exposed on the outer surface of the console 802. The subassemblies 840 and 842 are in communication with the module 814 via one or more cable assemblies 844. In other implementations, the console 800 may include additional controller subassemblies. The illustrated implementation also shows an optical I/O interface 835 configured to send and receive signals that may be communicated to the module 814.
MUs 840(1) and 840(2) are shown as being connectable to MU ports "A" 830(1) and "B" 830(2), respectively. Additional MUs (e.g., MUs 840(3)-840(6)) are shown as connectable to the controllers 804(1) and 804(3), i.e., two MUs per controller. The controllers 804(2) and 804(4) may also be configured to receive MUs (not shown). Each MU 840 provides additional storage on which games, game parameters, and other data may be stored. In some implementations, the other data can include any of a digital game component, an executable gaming application, an instruction set for expanding a gaming application, and a media file. When inserted into the console 800 or a controller, the MU 840 may be accessed by the memory controller 802. A system power supply module 850 supplies power to the components of the gaming system 800. A fan 852 cools the circuitry within the console 800. A microcontroller unit 854 is also provided.
An application 860 comprising machine instructions is stored on the hard disk drive 808. When the console 800 is powered on, various portions of the application 860 are loaded into RAM 806 and/or caches 810 and 812 for execution on the CPU 801; the application 860 is one such example. Various other applications may also be stored on the hard disk drive 808 for execution on the CPU 801.
Gaming and media system 800 may be operated as a standalone system by simply connecting the system to display 16, a television, a video projector, or other display device. In this standalone mode, gaming and media system 800 allows one or more players to play games or enjoy digital media, such as watching movies or listening to music. However, with the integration of broadband connectivity made possible through network interface 832, gaming and media system 800 may also be operated as a participant in a larger network gaming community.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.

Claims (9)

1. A controller for a content presentation and interaction system including a primary content presentation device, comprising:
a tactile control input [100] responsive to input by a first user and communicatively coupled to a content presentation device [202,250] comprising a plurality of tactile input mechanisms and providing a first set of control inputs to manipulate content;
a touch screen control input [400] responsive to input by said first user and communicatively coupled to said content presentation device, the screen being in proximity to said tactile control input and providing a second set of control inputs comprising alternative inputs to at least some of said first set of control inputs and additional inputs not available using said tactile input mechanisms, said touch screen control input having a direct connection to said tactile control input for direct communication with said tactile control input.
2. The controller of claim 1, wherein the controller is in communication with the content presentation device and the content presentation device is in communication with an entertainment service via a network, the service providing one or more elements in an auxiliary interface, the auxiliary interface comprising one or more of:
an application help interface;
an application control interface;
an alternate game view interface;
an information interface that provides additional information about the content.
3. A controller as claimed in claim 2, wherein the touch screen control input comprises a processor and a connector, and the touch screen control input is connected to the haptic control input through the connector.
4. A controller as claimed in claim 3, wherein the haptic control input or touch screen control input comprises at least one imaging camera in communication with the processor to provide input to the auxiliary interface.
5. The controller of claim 1, wherein the content presentation device is in communication with a content service via a network, the service providing one or more elements of an auxiliary interface.
6. A controller as claimed in claim 1, wherein the touch screen control input comprises a processor and a connector, and a second controller is connected to the haptic control input.
7. The controller of claim 1, wherein a second controller comprises a processor and a wireless communication system, and the second controller is coupled to the content presentation device via the wireless communication system.
8. The controller of claim 1, wherein an output device communicates with a content service via a network, the service providing one or more elements of an auxiliary interface on the touchscreen input that includes the second set of control inputs.
9. The controller of claim 1, wherein the touch screen control input is communicatively coupled to the haptic control input.
HK13108181.7A 2011-12-20 2013-07-12 Content system with secondary touch controller HK1181135B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/331,726 US20130154958A1 (en) 2011-12-20 2011-12-20 Content system with secondary touch controller
US13/331,726 2011-12-20

Publications (2)

Publication Number Publication Date
HK1181135A1 HK1181135A1 (en) 2013-11-01
HK1181135B true HK1181135B (en) 2016-12-02


Similar Documents

Publication Publication Date Title
US20130154958A1 (en) Content system with secondary touch controller
US10610778B2 (en) Gaming controller
US8858333B2 (en) Method and system for media control
US9545572B2 (en) Systems and methods for determining functionality of a display device based on position, orientation or motion
US8870654B2 (en) Gaming controller
EP2794039B1 (en) Directional input for a video game
US8951120B2 (en) Systems and methods for calibration and biasing for game controller
US20150205106A1 (en) Using a Second Screen as a Private Tracking Heads-up Display
US20130337916A1 (en) Companion gaming experience supporting near-real-time gameplay data
AU2017203102A1 (en) Systems and methods for interactive experiences and controllers therefor
CN110665220B (en) Game Controller
US20140121010A1 (en) Method and system for video gaming using game-specific input adaptation
CN105723302A (en) Boolean/float controller and gesture recognition system
US11771981B2 (en) Sharing buffered gameplay in response to an input request
TWI817208B (en) Method and apparatus for determining selected target, computer device, non-transitory computer-readable storage medium, and computer program product
KR102369256B1 (en) Method for providing user interface and terminal for executing the same
HK1181135B (en) Content system with secondary touch controller
KR102369251B1 (en) Method for providing user interface and terminal for executing the same