HK1178832B - Game controller on mobile touch-enabled devices
Publication number: HK1178832B (application HK13106014.4A)
Authority: HK (Hong Kong)
Description
Technical Field
The invention relates to a game controller on a touch-enabled mobile device.
Background
Video games typically involve user interaction that generates visual feedback on a display. For example, controllers are often used to manipulate games. According to an example, a controller may be used to manipulate a game executing on a video game console, which may cause visual content to be presented on a display (e.g., television, monitor, etc.). An example of an early controller for manipulating a game was a joystick that included one button. However, modern controllers with multiple analog and discrete inputs have recently become more popular. By way of illustration, controllers used with video game consoles typically include an analog thumb joystick (thumbstick), an analog trigger device (trigger), discrete buttons, and/or directional pads. Thus, games that can be executed on consoles have become more complex to take advantage of the variety of inputs provided by modern controllers.
As touch-enabled mobile phones have become commonplace, versions of console-based games that can be executed on touch-enabled mobile devices have become more readily available. However, the user experience when interacting with a version of a game on a touch-enabled mobile device is degraded compared to the user experience when interacting with a version of the game on a video game console. The user experience may be adversely affected, for example, because touch-enabled mobile devices conventionally cannot provide tactile feedback similar to the controls of a game console.
Furthermore, the user experience may be degraded because touch-enabled mobile devices are typically relegated to less complex control models that have fewer inputs than the control models employed by console controllers. For example, touch-enabled mobile devices typically support two control points (e.g., two thumbs), while conventional console controllers often have more than two control points. When a touch-enabled mobile device is held by a user, the user can touch the display of the touch-enabled mobile device with her two thumbs, while her other fingers are typically used to hold the touch-enabled mobile device. Additionally, if a user attempts to place more fingers (e.g., a thumb and at least one additional finger) on the display while holding the touch-enabled mobile device, the natural range of motion of her thumbs may be limited, especially for larger touch-enabled mobile devices. In contrast, conventional controllers often include triggers and/or buttons that are positioned to be operated by fingers other than the thumbs when the controller is held in the hands. If a less complex control model is used with a touch-enabled mobile device, some operations that can be manipulated by the user in a version of a game for the console may not be available in the version of the game for the touch-enabled mobile device, which may result in a reduced user experience.
Disclosure of Invention
Various technologies are described herein relating to controlling a game with a touch-enabled mobile device. A thumb joystick may be presented on a re-allocatable area of a display of the touch-enabled mobile device. In addition, a mode selection button may be presented on a mode selection area of the display. A first operation in the game may be controlled in response to detecting a first touch within the re-allocatable area of the display. For example, the thumb joystick may be represented as being at a default height when the first operation is controlled. Further, a second touch may be detected, where the second touch may be a drag from the mode selection area to the re-allocatable area. A second operation in the game may then be controlled in response to a third touch detected within the re-allocatable area of the display, where the second touch and the third touch are detected without a break in contact. The thumb joystick may be represented as being at a depressed height when the second operation is controlled. Additionally, a break in the third touch may be detected, and the first operation in the game may thereafter be controlled in response to a fourth touch detected within the re-allocatable area of the display. The detected break in the third touch may switch the thumb joystick from being represented as being at the depressed height to being represented as being at the default height.
According to various embodiments described herein, a toggle button may be presented on a toggle area of a display of a touch-enabled mobile device. If a touch is detected within the toggle area of the display while the thumb joystick is presented on the re-allocatable area of the display, the display may switch from presenting the thumb joystick on the re-allocatable area to presenting a directional pad on the re-allocatable area. Further, a touch may be detected within the re-allocatable area while the directional pad is presented thereon, and an operation in the game may be controlled in response thereto. This operation may differ, for example, from the operation controlled in response to a touch detected within the re-allocatable area while the thumb joystick is presented on the re-allocatable area (e.g., with the thumb joystick represented at the default height or the depressed height).
According to other embodiments described herein, the output of the sensor may be used to control one or more operations in a game executed on the touch-enabled mobile device. The sensors may be gyroscopes, accelerometers, cameras, and so on. In some embodiments, the output of the sensor may be used to detect the tilt of the touch-enabled mobile device and/or whether the touch-enabled mobile device is rotating, which may be used to control operations in the game. Additionally or alternatively, in various embodiments, the output of the sensor may be used to detect gestures, and operations in the game may be controlled in accordance with the detected gestures.
The above summary presents a simplified summary in order to provide a basic understanding of some aspects of the systems and/or methods discussed herein. This summary is not an extensive overview of the systems and/or methods discussed herein. It is not intended to identify key/critical elements or to delineate the scope of such systems and/or methods. Its sole purpose is to present some concepts in a simplified form as a prelude to the more detailed description that is presented later.
Drawings
FIG. 1 illustrates an exemplary graphical user interface that may be presented on a display of a touch-enabled mobile device.
FIGS. 2-4 illustrate exemplary user interactions with a touch-enabled mobile device to switch the mode of input represented by a thumb joystick included in an exemplary graphical user interface.
FIG. 5 illustrates another exemplary graphical user interface that may be presented on a display of a touch-enabled mobile device.
FIGS. 6-9 illustrate various exemplary orientations of a touch-enabled mobile device that may be used to manipulate the operation of a game presented on a display of the touch-enabled mobile device.
FIGS. 10-12 illustrate additional exemplary orientations of a touch-enabled mobile device that may be used to manipulate the operation of a game presented on the display of the touch-enabled mobile device.
FIG. 13 illustrates a functional block diagram of an exemplary system 1300 that controls a game executed by a touch-enabled mobile device.
FIG. 14 is a flow chart illustrating an exemplary method for controlling a game using a touch-enabled mobile device.
FIG. 15 illustrates a method for changing an input for a game, where the input may be received through a re-allocatable region of a display of a touch-enabled mobile device.
FIG. 16 illustrates an exemplary computing device.
Detailed Description
Various technologies pertaining to controlling a game with a touch-enabled mobile device are now described with reference to the drawings, wherein like reference numerals are used to refer to like elements throughout. In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of one or more aspects. It may be evident, however, that such aspect(s) may be practiced without these specific details. In other instances, well-known structures and devices are shown in block diagram form in order to facilitate describing one or more aspects. In addition, it is to be understood that functionality that is described as being performed by certain system components may be performed by multiple components. Similarly, for instance, a single component may be configured to perform functionality that is described as being performed by multiple components.
Furthermore, the term "or" is intended to mean an inclusive "or" rather than an exclusive "or". That is, unless specified otherwise, or clear from context, the phrase "X employs A or B" is intended to mean any of the natural inclusive permutations. That is, the phrase "X employs A or B" is satisfied by any of the following instances: X employs A; X employs B; or X employs both A and B. In addition, the articles "a" and "an" as used in this application and the appended claims should generally be construed to mean "one or more" unless specified otherwise or clear from context to be directed to a singular form.
As described herein, a game may be controlled using a touch-enabled mobile device. More specifically, the interaction models described herein, designed for touch-enabled mobile devices, may provide an increased number of analog and/or discrete inputs available for manipulating a game as compared to conventional interaction models used with touch-enabled mobile devices. Thus, the interaction models described herein may enable a user to manipulate a game through a touch-enabled mobile device in a manner more similar to manipulating a game through a controller used in conjunction with a video game console, as compared to conventional approaches.
Referring now to the drawings, FIG. 1 illustrates an exemplary graphical user interface 100 that may be presented on a display. The graphical user interface 100 may be presented, for example, on a display of a touch-enabled mobile device. Examples of touch-enabled mobile devices include telephones (e.g., smart phones, etc.), handheld computers, tablet computers, and the like.
Additionally, a graphical user interface 100 may be presented to enable a user to interact with the game. For example, although not shown, it is understood that the graphical user interface 100 may include game data (visual information related to the game); accordingly, the game data may be presented on at least a portion of a display of the touch-enabled mobile device. By way of illustration, the visual indicators included in exemplary graphical user interface 100 may be presented on a layer above game data on a touch-enabled mobile device, although claimed subject matter is not so limited. Further, the input represented by the visual indicators included in the exemplary graphical user interface 100 may be used by a user to manipulate game data presented on the display.
The graphical user interface 100 includes a variety of visual indicators that visually represent different analog and discrete inputs. The visual indicators are positioned in the graphical user interface 100 so as to be located on different respective areas of a display on which the graphical user interface 100 is presented. Further, according to one or more exemplary embodiments, it is contemplated that an area of the display may be re-allocatable; accordingly, the graphical user interface 100 may include a first visual indicator located on the re-allocatable area of the display during a first time period and a second visual indicator located on the re-allocatable area of the display during a second time period. For example, a touch detected in the re-allocatable area of the display may control a different operation during the first time period than during the second time period. According to another example, a different type of input (e.g., analog input versus discrete input) may be mapped to a touch detected within the re-allocatable area during the first time period than during the second time period. It is contemplated according to this example that the different types of inputs may be utilized to control different operations.
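By way of a non-limiting illustration (not part of the original disclosure), the following Kotlin sketch models the re-allocatable-area concept: a fixed screen rectangle dispatches touches to whichever handler is currently assigned, so the same area can drive different operations in different time periods. The class and handler names are assumptions made for this sketch.

```kotlin
// Hypothetical sketch of a re-allocatable display area: the same rectangle
// dispatches touches to whichever handler is currently assigned, so its
// meaning can change from one time period to the next.
data class Rect(val left: Float, val top: Float, val right: Float, val bottom: Float) {
    fun contains(x: Float, y: Float) = x in left..right && y in top..bottom
}

typealias TouchHandler = (Float, Float) -> Unit

class ReallocatableRegion(private val bounds: Rect, private var handler: TouchHandler) {
    // Re-assign the area to a different input (e.g., thumb joystick -> d-pad).
    fun reassign(newHandler: TouchHandler) { handler = newHandler }

    // Returns true if the touch fell inside this area and was consumed.
    fun onTouch(x: Float, y: Float): Boolean {
        if (!bounds.contains(x, y)) return false
        handler(x, y)
        return true
    }
}

fun main() {
    val region = ReallocatableRegion(Rect(0f, 300f, 200f, 500f)) { x, y ->
        println("analog thumb joystick touch at ($x, $y)")
    }
    region.onTouch(100f, 400f)                              // first time period: analog input
    region.reassign { _, _ -> println("discrete d-pad press") }
    region.onTouch(100f, 400f)                              // second time period: discrete input
}
```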
According to an example, one or more of the inputs represented by the visual indicators in the graphical user interface 100 can be mapped to touches detected within respective areas of a display of the touch-enabled mobile device; thus, a touch detected within a corresponding area of the display may be used to control various operations of the game. Additionally or alternatively, at least one of the inputs represented by the visual indicators in the graphical user interface 100 may be mapped to the output of one or more sensors included in the touch-enabled mobile device, and the output from such sensors may be used to control various operations of the game. Cameras, accelerometers, gyroscopes, and the like are examples of sensors that may be included in a touch-enabled mobile device; it can be appreciated, however, that the claimed subject matter is not so limited. By using such sensors, more control points (e.g., in addition to the two thumbs) can be supported.
The visual indicators included in the exemplary graphical user interface 100 are described further below. However, it is to be understood that graphical user interfaces lacking one or more of the visual indicators shown herein and/or including visual indicators other than those shown herein are intended to fall within the scope of the hereto appended claims. As shown, the exemplary graphical user interface 100 includes two thumb joysticks: namely, a left thumb joystick 102 and a right thumb joystick 104. In the graphical user interface 100, the left thumb joystick 102 is located within a left thumb joystick region 106 and the right thumb joystick 104 is located within a right thumb joystick region 108. The graphical user interface 100 also includes a left mode selection button 110 and a right mode selection button 112. A left channel 114 is positioned between the left mode selection button 110 and the left thumb joystick region 106, and a right channel 116 is positioned between the right mode selection button 112 and the right thumb joystick region 108. In addition, the graphical user interface 100 includes a toggle button 118 and four tap buttons (an A button 120, a B button 122, an X button 124, and a Y button 126). The graphical user interface 100 additionally includes a left trigger button 128, a right trigger button 130, a left bumper 132, and a right bumper 134.
The visual indicators shown in FIG. 1 are positioned in the graphical user interface 100 to be presented on respective areas of the display of the touch-enabled mobile device. Thus, the left thumb joystick 102 (and left thumb joystick region 106) may be presented on a left re-allocatable area of the display, and the right thumb joystick 104 (and right thumb joystick region 108) may be presented on a right re-allocatable area of the display. Additionally, the left mode selection button 110 may be presented on a left mode selection area of the display and the right mode selection button 112 may be presented on a right mode selection area of the display. Additionally, the left channel 114 may be presented on a left channel area of the display and the right channel 116 may be presented on a right channel area of the display. Further, the toggle button 118 may be presented on a toggle area of the display, the four tap buttons may be presented on respective tap areas of the display, the left trigger button 128 may be presented on a left trigger area of the display, the right trigger button 130 may be presented on a right trigger area of the display, the left bumper 132 may be presented on a left bumper area of the display, and the right bumper 134 may be presented on a right bumper area of the display.
Further, inputs for controlling operations in the game represented by the left thumb joystick 102, the right thumb joystick 104, the left mode selection button 110, the right mode selection button 112, the toggle button 118, the A button 120, the B button 122, the X button 124, and the Y button 126 may be mapped to touches detected within respective areas of a display of the touch-enabled mobile device. Additionally, the inputs for controlling operations in the game represented by the left trigger button 128 and the right trigger button 130 may be mapped to a measured amount of tilt (e.g., a measured tilt angle) of the touch-enabled mobile device detected from the output of a sensor. Further, inputs for controlling operations in the game represented by the left bumper 132 and the right bumper 134 may be mapped to rotations of the touch-enabled mobile device detected from the output of a sensor. According to an example, the sensor for detecting the amount of tilt and the sensor for detecting the rotation may be the same sensor. According to another example, different sensors can be employed to detect the amount of tilt and the rotation of the touch-enabled mobile device. According to yet another embodiment, it is contemplated that more than one sensor may be utilized to detect the amount of tilt and/or the rotation of the touch-enabled mobile device.
In addition, a subset of the inputs for controlling operations in the game are analog inputs, and the remaining inputs for controlling operations in the game are discrete inputs. More specifically, the inputs represented by the left thumb joystick 102, the right thumb joystick 104, the left trigger button 128, and the right trigger button 130 are analog inputs, while the remaining inputs are discrete inputs. An analog input may receive analog input information, which may take values from a continuous range. A discrete input may receive discrete input information; the discrete input information may be an on or off value, an up, down, left, or right value, or any other discrete value. For example, a discrete input may be responsive to a tap by the user.
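As a hypothetical illustration of the distinction just drawn, the Kotlin sketch below models analog input information as a value pair from a continuous range and discrete input information as one of a small set of states; the type names are invented for this sketch, not terms from the disclosure.

```kotlin
// Analog inputs carry values from a continuous range; discrete inputs carry
// one of a small set of states (on/off, or a d-pad direction).
sealed class InputInfo
data class AnalogInput(val x: Float, val y: Float) : InputInfo()       // e.g., [-1, 1] per axis
data class DiscreteButton(val pressed: Boolean) : InputInfo()          // on or off
enum class DpadDirection { UP, DOWN, LEFT, RIGHT }
data class DiscreteDpad(val direction: DpadDirection) : InputInfo()    // up, down, left, or right

fun describe(input: InputInfo): String = when (input) {
    is AnalogInput -> "analog: (${input.x}, ${input.y})"
    is DiscreteButton -> "button: ${if (input.pressed) "on" else "off"}"
    is DiscreteDpad -> "d-pad: ${input.direction}"
}

fun main() {
    println(describe(AnalogInput(0.4f, -0.7f)))
    println(describe(DiscreteButton(pressed = true)))
    println(describe(DiscreteDpad(DpadDirection.LEFT)))
}
```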
As described above, the input represented by the right thumb joystick 104 is an analog input that may receive analog input information. According to an example, the user may touch, with his right thumb, the portion of the display on which the right thumb joystick 104 is presented. While continuing to touch the display, the user may move his right thumb within the right re-allocatable area (e.g., within the right thumb joystick region 108). When the user moves his right thumb within the right re-allocatable area of the display, the right thumb joystick 104 may be moved within the graphical user interface 100 presented on the display to, for example, provide visual feedback to the user, although the claimed subject matter is not so limited. Further, according to this example, the user's touch may be detected and mapped to analog input information. The analog input information may vary over a continuous range of values depending on where the touch is detected on the display (e.g., the analog input information may vary over the range of values based on movement of the right thumb across the display, movement of the right thumb away from the display, etc.). Further, it is contemplated that the input represented by the left thumb joystick 102 may be substantially similar to the input represented by the right thumb joystick 104.
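A minimal sketch of how such a touch might be normalized to analog input information follows; the center/radius geometry and the clamping rule are assumptions for illustration, not the patented method.

```kotlin
import kotlin.math.sqrt

// Hypothetical normalization of a touch inside the thumb joystick region to
// a continuous analog vector: deflection grows with distance from the
// region's center and is clamped to the unit circle.
fun thumbstickValue(
    touchX: Float, touchY: Float,
    centerX: Float, centerY: Float,
    radius: Float
): Pair<Float, Float> {
    val dx = (touchX - centerX) / radius
    val dy = (touchY - centerY) / radius
    val magnitude = sqrt(dx * dx + dy * dy)
    if (magnitude <= 1f) return dx to dy
    return (dx / magnitude) to (dy / magnitude)  // clamp to full deflection
}

fun main() {
    println(thumbstickValue(160f, 400f, 100f, 400f, 80f)) // (0.75, 0.0): partial deflection
    println(thumbstickValue(300f, 400f, 100f, 400f, 80f)) // (1.0, 0.0): clamped
}
```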
Further, as described above, the input represented by the A button 120 is an example of a discrete input that can receive discrete input information. According to an example, it may be detected whether the user is touching or not touching the display within the tap area on which the A button 120 is presented. According to this example, whether a touch is detected in the tap area of the display on which the A button 120 is presented may be mapped to discrete input information. For example, the input represented by the A button 120 may act as a single-press button or as a press-and-hold button. According to an illustration, if the input represented by the A button 120 acts as a single-press button, the discrete input information may be switched from a first state to a second state (e.g., from on to off, off to on, etc.) in response to detecting a transition from the user not touching to the user touching the tap area of the display on which the A button 120 is presented (or alternatively, a transition from the user touching to the user not touching such tap area). By way of further illustration, if the input represented by the A button 120 acts as a press-and-hold button, the discrete input information may be switched from the first state to the second state (e.g., from off to on, on to off, etc.) in response to detecting a transition from the user not touching to the user touching the tap area of the display on which the A button 120 is presented, and may be switched back from the second state to the first state (e.g., from on to off, off to on, etc.) in response to detecting a transition from the user touching to the user not touching the tap area of the display on which the A button 120 is presented. Additionally, it can be appreciated that the inputs represented by the B button 122, the X button 124, and the Y button 126 can be substantially similar to the input represented by the A button 120.
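The two button behaviors described above can be summarized in a short sketch; the class names are invented, and the single-press variant is modeled here as toggling its state on each touch-down edge, one of the alternatives the passage permits.

```kotlin
// A "single press" button flips its discrete state on each
// not-touching -> touching transition; a "press and hold" button is on
// exactly while its tap area is being touched.
class SinglePressButton {
    var on: Boolean = false
        private set
    fun onTouchDown() { on = !on }  // state switches on the touch-down edge
    fun onTouchUp() { }             // release leaves the state unchanged
}

class PressAndHoldButton {
    var on: Boolean = false
        private set
    fun onTouchDown() { on = true }   // first state -> second state
    fun onTouchUp() { on = false }    // second state -> first state
}

fun main() {
    val a = PressAndHoldButton()
    a.onTouchDown(); println(a.on)  // true while the tap area is touched
    a.onTouchUp(); println(a.on)    // false once the touch ends
}
```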
Referring to FIGS. 2-4, exemplary user interaction with a touch-enabled mobile device 202 for switching the mode of input represented by the right thumb joystick 104 is shown. In this exemplary user interaction, the touch-enabled mobile device 202 is held in the user's right hand 204. Although not shown, it is contemplated that the touch-enabled mobile device 202 may additionally or alternatively be held in the user's left hand. For purposes of illustration, FIGS. 2-4 show a portion of the graphical user interface 100 of FIG. 1 being presented on a portion of a display of the touch-enabled mobile device 202; it is contemplated that the remainder of the graphical user interface 100 of FIG. 1 may be presented on the remainder of the display of the touch-enabled mobile device 202, which is not shown. The exemplary user interaction shows the right thumb 206 of the user's right hand 204 touching various areas of the display of the touch-enabled mobile device 202 to switch the mode of input represented by the right thumb joystick 104. For example, the exemplary user interaction with the touch-enabled mobile device 202 shown in FIGS. 2-4 may simulate interaction with a depressible analog thumb joystick on a conventional controller of a video game console. In addition, different operations in the game may be controlled by the input represented by the right thumb joystick 104 depending on its mode, such as whether the right thumb joystick 104 is represented at a default height in a thumb-joystick-up mode or at a depressed height in a thumb-joystick-down mode. Further, it is contemplated that the user may similarly interact with the input represented by the left thumb joystick 102 of FIG. 1 (e.g., using the left thumb of the left hand).
In the exemplary user interaction, FIGS. 2-4 show the right thumb joystick 104 and the right thumb joystick region 108 being presented on the right re-allocatable area of the display of the touch-enabled mobile device 202. In addition, the right mode selection button 112 is presented on the right mode selection area of the display of the touch-enabled mobile device 202. The right re-allocatable area of the display and the right mode selection area of the display are non-overlapping. Additionally, the right channel 116 is presented on the right channel area of the display of the touch-enabled mobile device 202. According to the illustrated example, the right channel area is between the right re-allocatable area and the right mode selection area. It can be appreciated, however, that the exemplary user interaction is shown for illustrative purposes and that the claimed subject matter is not so limited.
Turning to FIG. 2, the right thumb joystick 104 is represented as being at a default height on the display of the touch-enabled mobile device 202 (e.g., the input represented by the right thumb joystick 104 is in a thumb-joystick-up mode). Additionally, the user's right thumb 206 is touching the right re-allocatable area of the display on which the right thumb joystick 104 and the right thumb joystick region 108 are presented. The touch-enabled mobile device 202 can detect the touch of the right thumb 206 within the right re-allocatable area of the display (e.g., a first touch can be detected). Further, a first operation in the game may be controlled in response to the detected touch of the right thumb 206 within the right re-allocatable area of the display while the right thumb joystick 104 is represented at the default height shown in FIG. 2. For example, the detected touch may be mapped to analog input information; thus, analog input information may be received through the right re-allocatable area of the display on which the right thumb joystick 104 is presented.
FIG. 3 again shows the right thumb joystick 104 at the default height on the display of the touch-enabled mobile device 202. In addition, the user's right thumb 206 is touching the right mode selection area of the display on which the right mode selection button 112 is presented. According to an example, the user's right thumb 206 can be dragged from the right re-allocatable area of the display shown in FIG. 2 to the right mode selection area of the display shown in FIG. 3 (e.g., without a break in contact between the right thumb 206 and the display of the touch-enabled mobile device 202, without a break following the first touch, etc.). According to another example, the user's right thumb 206 may be removed from contact with the display after touching the right re-allocatable area of the display shown in FIG. 2, and then placed in contact with the right mode selection area of the display shown in FIG. 3 (e.g., with a break in contact after the first touch).
Further, the user's right thumb 206 is dragged from the right mode selection area of the display shown in FIG. 3 to the right re-allocatable area (e.g., without a break in contact between the right thumb 206 and the display of the touch-enabled mobile device 202). The touch-enabled mobile device 202 can detect such a touch (e.g., a second touch). Accordingly, the detected second touch is a drag from the right mode selection area of the display to the right re-allocatable area of the display. According to an example, the drag of the second touch may be detected as passing through at least a portion of the right channel area; however, the claimed subject matter is not so limited. Additionally, the touch-enabled mobile device 202 can switch the mode of input represented by the right thumb joystick 104 after detecting the second touch (e.g., switch the input represented by the right thumb joystick 104 to a thumb-joystick-down mode).
As shown in FIG. 4, the right thumb joystick 104 is represented at a depressed height. According to the illustrated example, at least a portion of the visual indicator presented on the right re-allocatable area of the display is changed to indicate that the right thumb joystick 104 is switched to the depressed height in response to detecting the second touch (e.g., the aforementioned drag). For example, a first visual indicator may be presented on the right re-allocatable area of the display when the right thumb joystick 104 is represented at the default height (e.g., as shown in FIGS. 2 and 3), and a second visual indicator may be presented on the right re-allocatable area of the display when the right thumb joystick 104 is represented at the depressed height (e.g., as shown in FIG. 4). According to this example, at least portions of the first visual indicator and the second visual indicator differ.
FIG. 4 shows the user's right thumb 206 touching the right re-allocatable area of the display of the touch-enabled mobile device 202. The touch-enabled mobile device 202 can detect the touch (e.g., a third touch) of the right thumb 206 within the right re-allocatable area of the display after the right thumb 206 is dragged from the right mode selection area to the right re-allocatable area of the display without a break in contact. Thus, the second touch and the third touch are detected by the touch-enabled mobile device without a break in contact being detected. Additionally, a second operation in the game may be controlled in response to the third touch detected within the right re-allocatable area of the display while the right thumb joystick 104 is represented at the depressed height shown in FIG. 4.
Further, a break in the third touch may be detected (e.g., the user's right thumb 206 touching the right re-allocatable area of the display as shown in FIG. 4 may be detected as released from the display). Upon detecting the break in the third touch, the touch-enabled mobile device 202 may switch the mode of input represented by the right thumb joystick 104 back to the mode shown in FIG. 2 (e.g., switch the input represented by the right thumb joystick 104 to the thumb-joystick-up mode). Thus, the right thumb joystick 104 may, for example, be modeled as spring-loaded (e.g., to simulate automatic pop-up after release).
Then, if the touch-enabled mobile device 202 detects that the user's right thumb 206 touches the right re-allocatable area of the display (e.g., a fourth touch) after the break in the third touch, the first operation in the game may again be controlled in response to the fourth touch. The fourth touch may be detected within the right re-allocatable area of the display while the right thumb joystick 104 is represented as being at the default height shown in FIG. 2.
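The interaction of FIGS. 2-4 can be summarized as a small state machine. The following Kotlin sketch is a hypothetical reconstruction, not the disclosed implementation: a drag that starts in the mode selection area and enters the re-allocatable area without a break in contact depresses the thumb joystick, and any break in contact springs it back up.

```kotlin
// Hypothetical state machine for the depressible-thumb-joystick interaction.
enum class StickMode { UP, DOWN }

class DepressibleThumbstick {
    var mode: StickMode = StickMode.UP
        private set
    private var touchStartedInModeArea = false

    fun onTouchDown(inModeSelectionArea: Boolean) {
        touchStartedInModeArea = inModeSelectionArea
    }

    // Called while the same contact moves; entering the re-allocatable area
    // after starting in the mode selection area depresses the stick.
    fun onTouchMove(inReallocatableArea: Boolean) {
        if (touchStartedInModeArea && inReallocatableArea) mode = StickMode.DOWN
    }

    // Any break in contact pops the stick back up (spring-loaded model).
    fun onTouchUp() {
        mode = StickMode.UP
        touchStartedInModeArea = false
    }

    // Which game operation a touch controls depends on the current mode
    // (e.g., walk/run when up, crouch when down).
    fun operationForTouch(): String =
        if (mode == StickMode.DOWN) "second operation (depressed height)"
        else "first operation (default height)"
}

fun main() {
    val stick = DepressibleThumbstick()
    stick.onTouchDown(inModeSelectionArea = false)  // first touch in the stick area
    println(stick.operationForTouch())              // first operation
    stick.onTouchUp()
    stick.onTouchDown(inModeSelectionArea = true)   // second touch: drag begins
    stick.onTouchMove(inReallocatableArea = true)   // drag ends in the stick area
    println(stick.operationForTouch())              // second operation (third touch)
    stick.onTouchUp()                               // break: springs back up
    println(stick.operationForTouch())              // first operation again (fourth touch)
}
```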
Referring again to FIG. 1, the input represented by the toggle button 118 may toggle the type of input mapped to touches detected within the left re-allocatable area of the display (e.g., the left thumb joystick 102 and the left thumb joystick region 106 may be presented in the graphical user interface 100 on the left re-allocatable area). According to an example, a touch within the toggle area of the display (e.g., on which the toggle button 118 is presented) can be detected while the left thumb joystick 102 is presented on the left re-allocatable area of the display (e.g., as shown in FIG. 1). A touch within the toggle area of the display may cause the type of input mapped to touches detected within the left re-allocatable area of the display to switch from analog input to discrete input.
Referring to FIG. 5, another exemplary graphical user interface 500 that may be presented on a display of a touch-enabled mobile device is shown. The touch-enabled mobile device may switch from presenting the graphical user interface 100 of FIG. 1 to presenting the graphical user interface 500 on the display upon detecting the aforementioned touch within the toggle area of the display. Accordingly, the touch-enabled mobile device may switch from presenting the left thumb joystick 102 of FIG. 1 (e.g., included in the graphical user interface 100 of FIG. 1) on the left re-allocatable area of the display to presenting a directional pad 502 on the left re-allocatable area of the display upon detecting the touch within the toggle area of the display. Additionally, when the directional pad 502 is presented on the left re-allocatable area of the display, the left re-allocatable area of the display may be configured as a discrete input.
According to an example, if a different touch within the toggle area of the display (e.g., on which the toggle button 118 is presented) is detected while the directional pad 502 is presented on the left re-allocatable area of the display (e.g., as shown in FIG. 5), the touch-enabled mobile device can switch from presenting the directional pad 502 to presenting the left thumb joystick 102 of FIG. 1 on the left re-allocatable area of the display. Accordingly, the graphical user interface 100 of FIG. 1 may again be presented on the display of the touch-enabled mobile device. Thus, according to this example, the left re-allocatable area of the display may be configured as an analog input when the left thumb joystick 102 of FIG. 1 is presented thereon. It is therefore contemplated that the input represented by the toggle button 118 can cause the left re-allocatable area of the display of the touch-enabled mobile device to switch between being an analog input (e.g., receiving analog input information) and a discrete input (e.g., receiving discrete input information). Additionally or alternatively, although not shown, it can be appreciated that the right re-allocatable area of the display can similarly be associated with an input similar to the input represented by the toggle button 118.
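A minimal sketch of the toggle behavior follows (hypothetical names; it assumes the simple two-state flip that the passage above describes):

```kotlin
// Each touch on the toggle area flips the left re-allocatable area between
// an analog thumb joystick and a discrete directional pad.
enum class RegionInput { ANALOG_THUMBSTICK, DISCRETE_DPAD }

class ToggleableRegion {
    var input: RegionInput = RegionInput.ANALOG_THUMBSTICK
        private set

    fun onToggleTouched() {
        input = if (input == RegionInput.ANALOG_THUMBSTICK)
            RegionInput.DISCRETE_DPAD
        else
            RegionInput.ANALOG_THUMBSTICK
    }
}

fun main() {
    val leftRegion = ToggleableRegion()
    println(leftRegion.input)   // ANALOG_THUMBSTICK: thumb joystick presented
    leftRegion.onToggleTouched()
    println(leftRegion.input)   // DISCRETE_DPAD: directional pad presented
}
```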
According to an example, it is contemplated that the input represented by the left mode selection button 110 may be disabled when the directional pad 502 is presented on the display. According to an alternative example, the input represented by the left mode selection button 110 may be enabled when the directional pad 502 is presented on the display. According to this example, a touch detected within the left mode selection area of the display while the directional pad 502 is presented on the display may cause the touch-enabled mobile device to switch from presenting the directional pad 502 on the left re-allocatable area of the display to presenting the left thumb joystick 102 of FIG. 1 thereon (e.g., similar to detecting a touch within the toggle area of the display as described above).
According to another example, a touch that is a drag from the left mode selection area of the display to the left re-allocatable area of the display may be detected (e.g., similar to the drag described above with reference to FIGS. 3 and 4). Upon detecting such a touch, the touch-enabled mobile device can switch from presenting the directional pad 502 on the left re-allocatable area of the display to presenting the left thumb joystick 102 of FIG. 1 on the left re-allocatable area of the display (e.g., the graphical user interface 100 of FIG. 1 can be presented on the display). Further, in accordance with this example, it can be appreciated that the left thumb joystick 102 may be directly represented at the depressed height (e.g., similar to the example shown in FIG. 4). Thus, similar to the example of FIG. 4, an operation in the game may be controlled in response to a subsequent touch detected within the left re-allocatable area of the display while the touch is continuously detected (e.g., without a break in the contact initiated at the start of the drag), with the left thumb joystick 102 of FIG. 1 represented at the depressed height. Additionally, after a break in the touch is detected, a different operation in the game may thereafter be controlled in response to a subsequent further touch within the left re-allocatable area of the display, with the left thumb joystick 102 of FIG. 1 represented as being at the default height.
Turning to FIGS. 6-9, various exemplary orientations of the touch-enabled mobile device 202 are illustrated that may be used to manipulate the operation of a game presented on the display of the touch-enabled mobile device 202. According to an example, various inputs for controlling operations in the game can be mapped to a measured amount of tilt of the touch-enabled mobile device 202. For example, the amount of tilt of the touch-enabled mobile device 202 can be measured by a sensor (not shown) included in the touch-enabled mobile device 202.
FIG. 6 shows the touch-enabled mobile device 202 in a stationary pose. Additionally, the surface of the display of the touch-enabled mobile device 202 in the stationary pose may define a predetermined plane referenced in FIGS. 7-9. It can be appreciated that the stationary pose of the touch-enabled mobile device 202 can be identified in substantially any manner. According to various illustrations, the stationary pose can be identified as a preset orientation of the touch-enabled mobile device 202, an orientation when the touch-enabled mobile device 202 is powered on, an orientation when the game is launched, an orientation of the touch-enabled mobile device 202 after a predetermined amount of time has elapsed without the touch-enabled mobile device 202 being tilted, and so forth. It can be appreciated that the claimed subject matter contemplates identifying the stationary pose in substantially any other manner.
The amount of tilt of the touch-enabled mobile device 202 relative to the predetermined plane can be measured. Further, an operation in the game may be controlled according to the amount of tilt of the touch-enabled mobile device 202. It is contemplated that the amount of tilt of the touch-enabled mobile device 202 can be mapped to analog input information for the game. Additionally, the level of the analog input information for the game may remain constant while the amount of tilt of the touch-enabled mobile device 202 remains constant relative to the predetermined plane.
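A hypothetical mapping from measured tilt to analog trigger information is sketched below; the 30-degree full-scale constant is an assumption chosen for illustration only.

```kotlin
import kotlin.math.abs

// Assumed constant: 30 degrees of corner tilt corresponds to a fully
// squeezed trigger. Constant tilt yields a constant analog level.
const val FULL_TRIGGER_TILT_DEGREES = 30f

fun triggerValueFromTilt(tiltDegrees: Float): Float =
    (abs(tiltDegrees) / FULL_TRIGGER_TILT_DEGREES).coerceIn(0f, 1f)

fun main() {
    println(triggerValueFromTilt(0f))   // 0.0: device in the stationary pose
    println(triggerValueFromTilt(15f))  // 0.5: trigger half squeezed
    println(triggerValueFromTilt(45f))  // 1.0: saturated at full squeeze
}
```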
FIG. 7 illustrates the upper right corner of the touch-enabled mobile device 202 tilted with respect to the predetermined plane defined by the stationary pose. According to the illustrated example, the input represented by the right trigger button 130 included in the exemplary graphical user interface 100 of FIG. 1 can be mapped to the measured amount of tilt of the touch-enabled mobile device 202 relative to the predetermined plane. Accordingly, the upper right corner of the touch-enabled mobile device 202 can be tilted downward by the user to varying orientations to actuate the input represented by the right trigger button 130 of FIG. 1. Thus, the amount of tilt of the upper right corner of the touch-enabled mobile device 202 may mimic squeezing the trigger of a conventional controller used with a video game console to varying degrees of depression. As another example, it is contemplated that depression of the right trigger button 130 of FIG. 1 presented on the display of the touch-enabled mobile device 202 can be depicted as a function of the measured amount of tilt (e.g., visual feedback can be provided to the user).
Similarly, FIG. 8 depicts the upper left corner of the touch-enabled mobile device 202 tilted relative to the predetermined plane defined by the stationary pose. It is to be appreciated that the input represented by the left trigger button 128 of FIG. 1 can be manipulated according to the amount of tilt of the upper left corner in a manner substantially similar to the example of FIG. 7 described above. Further, FIG. 9 depicts both the upper left and upper right corners of the touch-enabled mobile device 202 tilted together relative to the predetermined plane defined by the stationary pose. According to this example, both the input represented by the left trigger button 128 of FIG. 1 and the input represented by the right trigger button 130 of FIG. 1 can be manipulated simultaneously in accordance with the example described above with reference to FIG. 7.
Turning to FIGS. 10-12, various additional exemplary orientations of the touch-enabled mobile device 202 are shown that can be used to manipulate the operation of a game presented on the display of the touch-enabled mobile device 202. For example, the touch-enabled mobile device 202 can be moved from the stationary pose shown in FIG. 6 to an orientation shown in FIGS. 10-12, and such movement can be mapped to one or more inputs for controlling operations in the game. As shown, FIGS. 10 and 11 depict the touch-enabled mobile device 202 rotated, relative to the stationary pose, within a plane defined by the surface of the display of the touch-enabled mobile device 202. In addition, FIG. 12 shows the touch-enabled mobile device 202 moved downward, relative to the stationary pose, within the plane defined by the surface of the display of the touch-enabled mobile device 202.
According to an example, it can be detected whether the touch-enabled mobile device 202 is rotating within the plane defined by the surface of the display. Further, an operation in the game may be controlled according to whether the touch-enabled mobile device 202 is detected as rotating. Additionally, whether the touch-enabled mobile device 202 is rotating may be mapped to discrete input information for the game.
For example, a clockwise rotation of the touch-enabled mobile device 202 as shown in FIG. 10 (e.g., as compared to the orientation of FIG. 6) or a counterclockwise rotation of the touch-enabled mobile device 202 as shown in FIG. 11 (e.g., as compared to the orientation of FIG. 6) may be detected. By way of illustration, whether a clockwise rotation is detected may map to the input represented by the right bumper 134 of FIG. 1 (e.g., a discrete input), and whether a counterclockwise rotation is detected may map to the input represented by the left bumper 132 of FIG. 1 (e.g., a discrete input). According to an illustration, the clockwise rotation or counterclockwise rotation can be due to a user tapping the upper right or upper left corner of the touch-enabled mobile device 202; however, it can be appreciated that such tapping need not be employed to rotate the touch-enabled mobile device 202.
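The rotation-to-bumper mapping might be sketched as follows; the gyroscope threshold and the sign convention are assumptions for illustration only.

```kotlin
// Assumed angular-velocity threshold (radians/second) about the axis normal
// to the display; negative is taken here as clockwise, positive as
// counterclockwise. Real devices would need calibration.
const val ROTATION_THRESHOLD = 2.0f

enum class BumperEvent { LEFT_BUMPER, RIGHT_BUMPER }

fun bumperFromRotation(angularVelocityZ: Float): BumperEvent? = when {
    angularVelocityZ <= -ROTATION_THRESHOLD -> BumperEvent.RIGHT_BUMPER  // clockwise
    angularVelocityZ >= ROTATION_THRESHOLD -> BumperEvent.LEFT_BUMPER    // counterclockwise
    else -> null                                                         // below threshold: no press
}

fun main() {
    println(bumperFromRotation(-3.5f))  // RIGHT_BUMPER (clockwise flick)
    println(bumperFromRotation(2.4f))   // LEFT_BUMPER (counterclockwise flick)
    println(bumperFromRotation(0.3f))   // null (ordinary handling noise)
}
```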
According to another example, the touch-enabled mobile device 202 may be detected as moving downward within the plane defined by the surface of the display as shown in FIG. 12. According to this example, if the touch-enabled mobile device 202 is detected as having moved downward, both the input represented by the left bumper 132 of FIG. 1 and the input represented by the right bumper 134 of FIG. 1 may be actuated. Furthermore, although not shown, it is also contemplated that upward movement of the touch-enabled mobile device 202 may additionally or alternatively be detected and utilized in a manner similar to that discussed above in relation to downward movement.
Referring now to FIG. 13, an exemplary system 1300 for controlling a game executed by a touch-enabled mobile device (e.g., the touch-enabled mobile device 202 of FIG. 2) is illustrated. The system 1300 includes an output component 1302 that presents a graphical user interface 1304 (e.g., the graphical user interface 100 of FIG. 1, the graphical user interface 500 of FIG. 5, etc.) on a display 1306 of the touch-enabled mobile device. For example, the graphical user interface 1304 presented by the output component 1302 may include game data. According to an example, the game data can be generated by the output component 1302; however, it is also contemplated that the game data may be generated by a different component (not shown).
Additionally, the system 1300 includes an interaction analysis component 1308 that can receive touch-related data from the display 1306. The interaction analysis component 1308 can be configured to detect the various touches described herein based on the touch-related data received from the display 1306. Further, the interaction analysis component 1308 can map detected touches to the various inputs represented by the visual indicators included in the graphical user interface 1304 described herein.
The system 1300 also includes a manipulation component 1310, which manipulation component 1310 can control the graphical user interface 1304 presented by the output component 1302. More specifically, the manipulation component 1310 can employ input information (e.g., analog input information, discrete input information, etc.) received from the interaction analysis component 1308 to control respective operations in the game that can cause game data presented as part of the graphical user interface 1304 (or output by the output component 1302 in substantially any other manner) to change.
According to an example, the output component 1302 can present at least a thumb joystick (e.g., the left thumb joystick 102 of FIG. 1, the right thumb joystick 104 of FIG. 1, etc.) on a re-allocatable area of the display 1306, as well as a mode selection button (e.g., the left mode selection button 110 of FIG. 1, the right mode selection button 112 of FIG. 1, etc.) on a mode selection area of the display 1306. In addition, the re-allocatable area and the mode selection area of the display 1306 may be non-overlapping. According to this example, the interaction analysis component 1308 can be configured to detect a drag from the mode selection area of the display 1306 to the re-allocatable area of the display 1306. Further, the interaction analysis component 1308 can be configured to detect a touch within the re-allocatable area of the display 1306.
Additionally, the manipulation component 1310 can control a first operation in the game in response to a touch detected within the re-allocatable area of the display 1306 when the drag (e.g., from the mode selection area of the display 1306 to the re-allocatable area of the display 1306) and the touch (e.g., within the re-allocatable area of the display 1306) are detected by the interaction analysis component 1308 without a break in contact. The first operation in the game may be controlled based on a touch detected within the re-allocatable area of the display 1306 while the thumb joystick presented on the display by the output component 1302 is represented at a depressed height. Thus, the first operation in the game may be an operation that is typically performed by moving an analog thumb joystick on a conventional controller used with a video game console while the analog thumb joystick is pressed. According to an illustration, the first operation in the game that may be controlled as described above may be a crouch operation in a first-person shooter or role-playing game (RPG) genre; however, the claimed subject matter is not so limited.
Further, the manipulation component 1310 can otherwise control a second operation when the thumb joystick is presented on the display 1306 by the output component 1302 (e.g., when the drag and the touch within the re-allocatable area are detected with a break in contact). In such a scenario, the thumb joystick rendered on the display 1306 by the output component 1302 is represented as being at a default height. The second operation in the game may be controlled in response to a touch being detected by the interaction analysis component 1308 within the re-allocatable area of the display 1306. For example, the second operation in the game may be an operation that is typically performed by moving an analog thumb joystick (e.g., without being depressed) on a conventional controller used with a video game console. By way of illustration, the second operation in the game that may be controlled as described above may be a walking or running operation in a first-person shooter or RPG genre; however, the claimed subject matter is not so limited.
In various exemplary embodiments, the interaction analysis component 1308 may optionally include an input change component 1312. According to such exemplary embodiments, the output component 1302 may also present a toggle button (e.g., the toggle button 118 of FIG. 1) on a toggle area of the display 1306. Additionally, the interaction analysis component 1308 may also be configured to detect touches within the toggle area of the display 1306. Further, the input change component 1312 may switch touches detected within the re-allocatable area of the display 1306 between being mapped to analog input information for the game and being mapped to discrete input information for the game, in response to a touch detected within the toggle area of the display 1306. Accordingly, the output component 1302 may present the thumb joystick on the re-allocatable area of the display 1306 when such touches are mapped to analog input information, and may present a directional pad (e.g., the directional pad 502 of FIG. 5) on the re-allocatable area of the display when such touches are mapped to discrete input information. Additionally, the manipulation component 1310 can control a third operation in the game in response to detection of a touch within the re-allocatable area of the display 1306 by the interaction analysis component 1308 when such touches are mapped to discrete input information.
According to other exemplary embodiments, the system 1300 may optionally include a sensor 1314 that can provide data to the interaction analysis component 1308. The sensor 1314 may be, for example, a gyroscope, an accelerometer, a camera, and so on. Further, it is contemplated that the system 1300 may optionally include a plurality of sensors. It is to be understood that the sensor 1314 may be included in the touch-enabled mobile device.
By way of illustration, the interaction analysis component 1308 can employ data received from the sensor 1314 to measure a tilt of the touch-enabled mobile device relative to a predetermined plane (e.g., as shown in FIGS. 6-9). The predetermined plane may be set based on a stationary pose of the touch-enabled mobile device. Further, the interaction analysis component 1308 can use data received from the sensor 1314 to detect whether the touch-enabled mobile device is rotating within a plane defined by the surface of the display 1306 (e.g., as shown in FIGS. 10-12). The manipulation component 1310 can control an operation in the game in response to the measured tilt of the touch-enabled mobile device and can control a different operation in the game in response to whether the touch-enabled mobile device is rotating.
According to another example, the sensor 1314 can be a camera coupled with (e.g., incorporated into, etc.) the touch-enabled mobile device. According to this example, the interaction analysis component 1308 can receive a sequence of images from the camera. Additionally, the interaction analysis component 1308 can detect a user gesture from the sequence of images received from the camera (e.g., by performing signal processing on the sequence of images, etc.). Further, the manipulation component 1310 can control an operation in the game in response to the user gesture detected by the interaction analysis component 1308. By way of illustration, the gesture may be a movement of the user's head (e.g., up, down, left, or right, etc.), a facial gesture (e.g., gritting the teeth, blinking, opening or closing the mouth, etc.), a hand gesture, and so forth. By way of another example, it is contemplated that different gestures detected by the interaction analysis component 1308 can be used to control different operations, respectively.
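A sketch of the gesture-to-operation dispatch is shown below; gesture recognition itself is out of scope here, and the gesture and operation names are invented, since the passage only states that different detected gestures may control different operations.

```kotlin
// Hypothetical mapping from detected camera gestures to game operations.
enum class Gesture { HEAD_UP, HEAD_DOWN, BLINK, MOUTH_OPEN }

fun operationFor(gesture: Gesture): String = when (gesture) {
    Gesture.HEAD_UP -> "look up"
    Gesture.HEAD_DOWN -> "look down"
    Gesture.BLINK -> "toggle map"
    Gesture.MOUTH_OPEN -> "use item"
}

fun main() {
    // A recognizer (not shown) would emit gestures from the camera's image
    // sequence; here a couple of sample events are dispatched directly.
    listOf(Gesture.HEAD_UP, Gesture.BLINK).forEach { g ->
        println("$g -> ${operationFor(g)}")
    }
}
```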
According to yet another example, the output component 1302 can vibrate the touch-enabled mobile device to provide force feedback related to control of the game. According to an illustration, with reference to the exemplary user interaction described in FIGS. 2-4, upon detecting the break in the third touch, the output component 1302 may vibrate the touch-enabled mobile device to provide haptic feedback representing that the thumb joystick has been switched back to the default height; however, it is to be appreciated that the claimed subject matter is not limited to the aforementioned example of haptic feedback, and it is contemplated that haptic feedback responsive to substantially any other user interaction may additionally or alternatively be provided by the output component 1302.
FIGS. 14-15 illustrate exemplary methods relating to controlling a game with a touch-enabled mobile device. While the methods are described as a series of acts performed in a sequence, it is to be appreciated that the methods are not limited by the order of the sequence. For example, some acts may occur in a different order than described herein. Additionally, an act may occur concurrently with another act. Moreover, in some instances, not all acts may be required to implement a methodology described herein.
Further, the acts described herein may be computer-executable instructions that may be implemented by one or more processors and/or stored on one or more computer-readable media. Computer-executable instructions may include routines, subroutines, programs, threads of execution, and the like. Additionally, the results of the acts of the methods may be stored in a computer readable medium, displayed on a display device, and the like.
FIG. 14 illustrates a method for controlling a game with a touch-enabled mobile device. At 1402, a thumb joystick can be presented on a re-allocatable area of a display of the touch-enabled mobile device. At 1404, a mode selection button can be presented on a mode selection area of the display of the touch-enabled mobile device. In addition, the re-allocatable area and the mode selection area are non-overlapping. At 1406, a first operation in the game may be controlled, with the thumb joystick represented as being at a default height, in response to a first touch detected within the re-allocatable area of the display.
At 1408, a second touch can be detected. The second touch may be a drag from the mode selection area of the display to the re-allocatable area of the display. At 1410, a second operation in the game may be controlled, with the thumb joystick represented at a depressed height, in response to a third touch detected within the re-allocatable area of the display. Further, the second touch and the third touch are detected without a break in contact (e.g., continuous contact from the start of the second touch until the end of the third touch). For example, the second touch and the third touch may be a single touch on the display that does not release from the display. At 1412, a break in the third touch may be detected. At 1414, the first operation in the game may be controlled, with the thumb joystick again represented as being at the default height, in response to a fourth touch detected within the re-allocatable area of the display after the break in the third touch is detected.
Turning to FIG. 15, a method 1500 for changing an input for a game is shown, where the input can be received through a re-allocatable area of a display of a touch-enabled mobile device. At 1502, a toggle button, a thumb joystick, and a mode selection button can be presented on the display of the touch-enabled mobile device. The toggle button can be presented on a toggle area of the display of the touch-enabled mobile device. Additionally, the thumb joystick can be presented on a re-allocatable area of the display of the touch-enabled mobile device. Further, the mode selection button can be presented on a mode selection area of the display of the touch-enabled mobile device, where the mode selection area of the display and the re-allocatable area of the display are non-overlapping.
At 1504, a first touch may be detected. The first touch may be a drag from the mode selection area of the display to the re-allocatable area of the display. At 1506, a second touch may be detected within the re-allocatable area of the display. In addition, the first touch and the second touch are detected without a break in contact. At 1508, a first operation in the game may be controlled, with the thumb joystick represented at a depressed height, in response to the second touch detected within the re-allocatable area of the display.
At 1510, a break in the second touch can be detected (e.g., the user's thumb can be detected as being released from the display). At 1512, a third touch may be detected within the re-allocatable area of the display after the break in the second touch is detected. At 1514, a second operation in the game may be controlled, with the thumb joystick represented as being at a default height, in response to the third touch detected within the re-allocatable area of the display.
At 1516, a fourth touch may be detected within the toggle area of the display while the thumb joystick is presented on the re-allocatable area of the display. At 1518, switching from presenting the thumb joystick on the re-allocatable area of the display to presenting a directional pad on the re-allocatable area of the display may be performed in response to the fourth touch detected within the toggle area of the display. At 1520, a fifth touch may be detected within the re-allocatable area of the display while the directional pad is presented on the re-allocatable area of the display. At 1522, a third operation in the game may be controlled in response to the fifth touch detected within the re-allocatable area of the display.
Referring now to FIG. 16, a high-level illustration of an exemplary computing device 1600 that may be used in accordance with the systems and methods disclosed herein is provided. For example, the computing device 1600 may be used in a system that controls a game executed by a touch-enabled mobile device. The computing device 1600 includes at least one processor 1602 that executes instructions stored in a memory 1604. The instructions may be, for example, instructions for implementing functionality described as being performed by one or more of the components described above or instructions for implementing one or more of the methods described above. The processor 1602 may access the memory 1604 by way of a system bus 1606. In addition to storing executable instructions, the memory 1604 may also store game data, visual indicators, and so forth.
The computing device 1600 additionally includes a data store 1608 that is accessible by the processor 1602 by way of the system bus 1606. The data store 1608 may include executable instructions, game data, visual indicators, and the like. The computing device 1600 also includes an input interface 1610 that allows external devices to communicate with the computing device 1600. For instance, the input interface 1610 may be used to receive instructions from an external computer device, from a user, and the like. The computing device 1600 can also include an output interface 1612 that interfaces the computing device 1600 with one or more external devices. For example, the computing device 1600 may display text, images, and the like by way of the output interface 1612.
Additionally, while shown as a single system, it is to be understood that the computing device 1600 may be a distributed system. Thus, for example, several devices may be in communication by way of a network connection and may collectively perform tasks described as being performed by the computing device 1600.
As used herein, the terms "component" and "system" are intended to encompass a computer-readable data store configured with computer-executable instructions that when executed by a processor perform a particular function. The computer-executable instructions may include routines, functions, and the like. It is also to be understood that a component or system may be located on a single device or distributed among several devices.
Additionally, as used herein, the term "exemplary" is intended to mean "serving as an illustration or example of something."
The various functions described herein can be implemented in hardware, software, or any combination thereof. If implemented in software, the functions can be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include computer-readable storage media. A computer-readable storage medium can be any available storage medium that can be accessed by a computer. By way of example, and not limitation, such computer-readable storage media can comprise RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc (BD), where disks usually reproduce data magnetically and discs usually reproduce data optically with lasers. Further, a propagated signal is not included within the scope of computer-readable storage media. Computer-readable media also include communication media, including any medium that facilitates transfer of a computer program from one place to another. A connection, for instance, can be a communication medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, digital subscriber line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of communication medium. Combinations of the above should also be included within the scope of computer-readable media.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned aspects, but one of ordinary skill in the art may recognize that many further combinations and permutations of various aspects are possible. Accordingly, the described aspects are intended to embrace all such alterations, modifications and variations that fall within the spirit and scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim.
Claims (10)
1. A method for controlling a game with a touch-enabled mobile device, comprising:
presenting a thumb joystick on a re-assignable area of a display of the touch-enabled mobile device;
presenting a mode selection button on a mode selection area of the display of the touch-enabled mobile device, wherein the re-assignable area and the mode selection area are non-overlapping;
controlling a first operation in the game, with the thumb joystick represented as being at a default height, in response to a first touch detected within the re-assignable area of the display;
detecting a second touch, wherein the second touch is a drag from the mode selection area of the display to the re-assignable area of the display;
controlling a second operation in the game, with the thumb joystick represented as being at a depressed height, in response to a third touch detected within the re-assignable area of the display, wherein the second touch and the third touch are detected without a discontinuity of contact;
detecting a discontinuity of the third touch; and
controlling the first operation in the game, with the thumb joystick represented as being at the default height, in response to a fourth touch detected within the re-assignable area of the display after the discontinuity of the third touch is detected.
2. The method of claim 1, further comprising:
presenting a first visual indicator on the re-assignable area of the display when the thumb joystick is represented as being at the default height; and
presenting a second visual indicator on the re-assignable area of the display when the thumb joystick is represented as being at the depressed height, wherein the first visual indicator and the second visual indicator differ at least in part.
3. The method of claim 1, further comprising:
presenting a transition button on a transition area of a display of the touch-enabled mobile device;
detecting a fifth touch within the transition area of the display while the thumb joystick is presented on the re-assignable area of the display; and
switching from presenting the thumb joystick on the re-assignable area of the display to presenting a directional pad after the fifth touch is detected, wherein the re-assignable area of the display is configured as a discrete input when the directional pad is presented on the re-assignable area.
4. The method of claim 1, further comprising:
measuring an amount of tilt of the touch-enabled mobile device relative to a predetermined plane;
controlling a third operation in the game according to the amount of tilt of the touch-enabled mobile device, wherein the amount of tilt of the touch-enabled mobile device maps to analog input information for the game, and a level of the analog input information for the game remains constant while the amount of tilt of the touch-enabled mobile device remains constant relative to the predetermined plane;
presenting a trigger button on the display of the touch-enabled mobile device; and
depicting a press of the trigger button as a function of the amount of tilt of the touch-enabled mobile device.
5. The method of claim 1, further comprising:
detecting whether the touch-enabled mobile device is rotating within a plane defined by a surface of the display; and
controlling a third operation in the game according to whether the touch-enabled mobile device is detected as rotating, wherein rotation of the touch-enabled mobile device maps to discrete input information for the game.
6. The method of claim 1, further comprising: displaying a gutter on a gutter area of the display, wherein the gutter area is between the re-assignable area and the mode selection area, and wherein the drag of the second touch is detected as passing through at least a portion of the gutter area.
7. The method of claim 1, further comprising: vibrating the touch-enabled mobile device after detecting the discontinuity of the third touch to provide haptic feedback indicating that the thumb joystick has switched back to the default height.
8. A system (1300) for controlling a game executed by a touch-enabled mobile device, comprising:
an output component (1302) that presents a thumb joystick on at least a re-assignable area of a display (1306) of the touch-enabled mobile device and a mode selection button on a mode selection area of the display of the touch-enabled mobile device, wherein the re-assignable area and the mode selection area are non-overlapping;
an interaction analysis component (1308) configured to detect at least a drag from the mode selection area of the display to the re-assignable area of the display and a touch within the re-assignable area of the display; and
a manipulation component (1310) that controls a first operation in the game, with the thumb joystick represented as being at a depressed height, when the drag from the mode selection area of the display to the re-assignable area of the display and the touch within the re-assignable area of the display are detected without a discontinuity of contact, and that otherwise controls a second operation in the game, with the thumb joystick represented as being at a default height, in response to the touch detected within the re-assignable area of the display.
9. The system of claim 8, wherein the interaction analysis component detects a gesture of a user from a sequence of images received from a camera coupled with the touch-enabled mobile device, and wherein the manipulation component controls a third operation in the game in response to the gesture of the user.
10. A method for controlling a game with a touch-enabled mobile device, comprising:
presenting a transition button on a transition area of a display of the touch-enabled mobile device;
presenting a thumb joystick on a re-assignable area of a display of the touch-enabled mobile device;
presenting a mode selection button on a mode selection area of the display of the touch-enabled mobile device, wherein the re-assignable area and the mode selection area are non-overlapping;
detecting a first touch, wherein the first touch is a drag from the mode selection area of the display to the re-assignable area of the display;
detecting a second touch within the re-assignable area of the display, wherein the first touch and the second touch are detected without a discontinuity of contact;
controlling a first operation in the game, with the thumb joystick represented as being at a depressed height, in response to the second touch detected within the re-assignable area of the display;
detecting a discontinuity of the second touch;
detecting a third touch within the re-assignable area of the display after the discontinuity of the second touch is detected;
controlling a second operation in the game, with the thumb joystick represented as being at a default height, in response to the third touch detected within the re-assignable area of the display;
detecting a fourth touch within the transition area of the display while the thumb joystick is presented on the re-assignable area of the display;
switching from presenting the thumb joystick on the re-assignable area of the display to presenting a directional pad in response to the fourth touch detected within the transition area of the display;
detecting a fifth touch within the re-assignable area of the display while the directional pad is presented on the re-assignable area of the display; and
controlling a third operation in the game in response to detecting the fifth touch within the re-assignable area of the display.
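To make the tilt mapping recited in claim 4 concrete, a minimal sketch follows in which a measured tilt angle relative to the predetermined plane is mapped to an analog trigger level; the maximum-tilt constant and the clamping are assumptions for illustration only:

```typescript
// Tilt of the device, in degrees relative to a predetermined plane,
// mapped to an analog input level in [0, 1]. Because the level depends
// only on the current angle, a constant tilt yields a constant level,
// as claim 4 recites.
const MAX_TILT_DEGREES = 45; // assumed full-scale tilt

function tiltToTriggerLevel(tiltDegrees: number): number {
  const clamped = Math.min(Math.max(tiltDegrees, 0), MAX_TILT_DEGREES);
  return clamped / MAX_TILT_DEGREES;
}

// The on-screen trigger button can then be depicted as pressed in
// proportion to the same tilt amount (claim 4, final step).
console.log(tiltToTriggerLevel(0));    // 0   -> released
console.log(tiltToTriggerLevel(22.5)); // 0.5 -> half-pressed
console.log(tiltToTriggerLevel(60));   // 1   -> fully pressed (clamped)
```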
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US13/252,207 (published as US8678927B2) | 2011-10-04 | 2011-10-04 | Game controller on mobile touch-enabled devices |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| HK1178832A1 | 2013-09-19 |
| HK1178832B | 2016-01-22 |