HK1086213B - Pose detection method and video game apparatus - Google Patents
- Publication number: HK1086213B (application HK06106296.2A)
- Authority: HK (Hong Kong)
Description
Technical Field
The present invention relates to a gesture detection method for a video game that uses an input image provided by a camera, and to a corresponding video game apparatus, gesture detection program, and computer-readable medium containing such a program.
Background
In recent years, video games using input images provided by video cameras have become increasingly popular. In such a video game, for example, a camera captures an image of the player's posture, and the captured image is combined with an image of, for example, a button item. The combined image may be displayed on a monitor, and when the player's hand (arm) moves over a button item, the button item may respond to the movement and trigger an action. In this case, command input is performed solely on the basis of the image captured by the camera, so a game controller (a control pad operated by hand) may not be required to control the game.
In the prior art, such video games rely on a technique that acquires what is known as a frame difference and, when a change greater than a predetermined level is detected in a portion of the image, identifies motion in that portion. The frame difference is the difference between the previous frame and the current frame in terms of pixel information, for example the color signals in an RGB format, or, in a YUV format, the luminance signal (Y), the signal representing the difference between the luminance signal and the blue signal (U), and the signal representing the difference between the luminance signal and the red signal (V). Further, in the related art, one or more images such as button items are arranged on the monitor screen to control the progress of the game, and game operations are realized by moving the image of a hand or some other movable portion over the button items.
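As a rough illustration of the frame-difference technique described above, the following sketch computes a per-pixel difference between two grayscale frames stored as nested lists; the change `threshold` and the `min_changed` motion criterion are assumed, tunable parameters, not values from the patent.

```python
def frame_difference(prev_frame, curr_frame, threshold=30):
    """Per-pixel absolute difference between two grayscale frames.

    Frames are 2-D lists of luminance values (0-255); a pixel counts as
    "changed" when its difference exceeds `threshold` (an assumed,
    tunable parameter). Returns a binary mask of changed pixels.
    """
    return [
        [1 if abs(p - c) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev_frame, curr_frame)
    ]

def has_motion(mask, min_changed=1):
    """A region is treated as 'in motion' when enough pixels changed."""
    return sum(map(sum, mask)) >= min_changed

# A single bright pixel appearing between two otherwise identical frames
prev = [[10, 10, 10], [10, 10, 10]]
curr = [[10, 200, 10], [10, 10, 10]]
mask = frame_difference(prev, curr)
```

In practice the same mask computation would run on the camera's Y channel per frame; the example uses tiny frames only to keep the idea visible.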
As described above, in a video game using an input image provided by a camera, operations are realized by detecting the movement of the image of a hand over one or more button items, and therefore an image of the player must be displayed on the monitor screen. As a result, a display screen representing the game itself, such as a CG (Computer Graphics) screen, cannot occupy the entire monitor screen, and the game presentation is degraded.
Further, since an action is caused only by activating a button item, the range of operations is restricted; more detailed operations cannot be realized, such as the continuous control of the movement and position of a game character provided by an analog stick-type game controller (i.e., a game controller capable of outputting operation information including intermediate values when a stick-type operation unit is tilted up/down/left/right with a finger).
Disclosure of Invention
The present invention has been made in view of one or more problems of the related art, and an object thereof is to provide a gesture detection method, a game device, a gesture detection program, and a computer-readable medium containing a computer program, which can be operated by detecting a gesture of a player in a video game using an input image supplied from a camera.
According to an aspect of the present invention, there is provided a gesture detection method implemented in a video game using input images provided by a camera to detect a gesture of a player, the method comprising the steps of: calculating a left frame difference or background difference of a left image corresponding to a left area of an image captured by the camera, and a right frame difference or background difference of a right image corresponding to a right area of the captured image, wherein the captured image includes a player image representing a posture of the player; calculating a left center position of the left frame difference or background difference, and a right center position of the right frame difference or background difference; calculating a position difference in the up-down direction between the left center position and the right center position; calculating an average position value in the up-down direction of the left center position and the right center position; generating operation information corresponding to the posture of the player based on the calculated position difference and the calculated average position value; converting a value based on the calculated position difference and the calculated average position value into an analog stick signal; calculating an additional area background difference of an additional area other than the left area and the right area in the captured image; and, when it is determined that the left background difference and the right background difference are not substantial differences, generating operation information corresponding to the posture of the player based on whether or not the additional area background difference is a substantial difference.
In another preferred embodiment of the present invention, the captured image is displayed in a part of the monitor screen as an operation confirmation screen.
In another preferred embodiment, the gesture detection method of the present invention further comprises the steps of:
displaying at least one marker indicating the position of the player's hand identified at the left center position and/or the right center position.
According to another aspect of the present invention, there is provided a video game apparatus having a CPU which executes a video game using an input image provided by a camera, the apparatus comprising: a left frame difference and background difference calculating section for calculating a left frame difference or a background difference of a left image corresponding to a left area of an image captured by the camera; and a right frame difference and background difference calculating section for calculating a right frame difference or a background difference of a right image corresponding to a right area of the captured image, wherein the captured image includes a player image representing a posture of the player; the video game apparatus further comprising: a center position calculating section for calculating a left center position of the left frame difference or background difference and a right center position of the right frame difference or background difference; a position difference calculating section for calculating a position difference in the up-down direction between the left center position and the right center position; an average position value calculating section for calculating an average position value in the up-down direction of the left center position and the right center position; an operation information generating section for generating operation information corresponding to the posture of the player based on the calculated position difference and the calculated average position value; an analog stick signal converting section for converting a value based on the position difference and the average position value into an analog stick signal; a first background difference calculating section for calculating a left background difference of the left area and a right background difference of the right area; and a second background difference calculating section for calculating an additional area background difference of an additional area other than the left area and the right area in the captured image; wherein the operation information generating section is configured to generate the operation information corresponding to the posture of the player based on whether or not the additional area background difference is a substantial difference, when it is determined that the left background difference and the right background difference are not substantial differences.
Drawings
Fig. 1 is a block diagram showing the structure of a video game apparatus according to an embodiment of the present invention;
Fig. 2 is a block diagram showing the functional structure used to implement gesture detection;
Fig. 3 is a diagram illustrating example image processing operations to implement gesture detection;
Fig. 4 is a diagram showing an example screen displayed on a monitor screen when gesture detection is performed;
Figs. 5A to 5E are diagrams showing gestures detected in gesture detection according to an example;
Fig. 6 is a diagram showing another example screen displayed on a monitor screen when gesture detection is performed;
Figs. 7A to 7I are diagrams showing gestures detected in gesture detection according to another example;
Fig. 8 is a flowchart showing steps of a background image acquisition process according to an embodiment of the present invention;
Fig. 9 is a diagram showing an example screen displayed when the left background image acquisition process is performed;
Fig. 10 is a diagram showing an example error screen displayed when the left background image acquisition process fails;
Fig. 11 is a diagram showing an example screen displayed when the right background image acquisition process is performed;
Fig. 12 is a diagram showing an example error screen displayed when the right background image acquisition process fails;
Fig. 13 is a diagram showing an example screen displayed in a case where the camera is moved between the acquisition of the left background image and the acquisition of the right background image; and
Fig. 14 is a diagram showing an example screen displayed when only the images of the left and right edge areas are acquired in the background image acquisition process.
Detailed Description
Preferred embodiments of the present invention will be described below with reference to the accompanying drawings.
Fig. 1 is a block diagram showing the structure of a video game apparatus according to an embodiment of the present invention.
The video game apparatus shown in Fig. 1 includes a video game apparatus body 1, together with a video camera 2, a display monitor 3 such as a television monitor, and a speaker 4, each connected to the video game apparatus body 1 through, for example, a USB (Universal Serial Bus) cable. Note that the video camera 2 does not necessarily have to be connected directly to the video game apparatus body 1, and may instead be connected to it through, for example, a network.
The video game apparatus main body 1 includes: a program storage device 11, such as a CD-ROM or DVD drive, in which the game software (program) is stored; a CPU 12 that executes all control processing of the main body 1; a main memory 13 that temporarily stores the programs and data used in the control processing; a graphics memory 14 that stores image data; an image processing unit 15 that generates and controls images according to the game content; an audio processing unit 16 that generates and controls audio; an image output unit 17 that outputs an image signal to the display monitor 3; and an audio output unit 18 that outputs audio to the speaker 4.
Fig. 2 is a block diagram showing a functional structure for realizing gesture detection. In the illustrated example, gesture detection relies on functional parts related to the background difference (the difference in pixel information between pixels of the background image and the current image), including: a background image storage section 101 for storing a background image acquired by the camera 2; a background difference calculation section 102 for calculating the difference (background difference) between the current image captured by the camera 2 and the background image stored in the background image storage section 101; and a background difference determination section 103 for determining whether or not a background difference exists in a predetermined area.
Further, the gesture detection according to this example relies on functional parts related to the frame difference (the difference in pixel information between pixels of the previous frame and the current frame), including: a previous frame image storage section 104 for storing the previous frame image captured by the camera 2; a frame difference calculation section 105 for calculating the difference (frame difference) between the current image captured by the camera 2 and the previous frame image stored in the previous frame image storage section 104; a frame difference center calculation section 106 for calculating the center position of the frame difference within a predetermined area; and an analog stick input conversion section 107 for converting the input into a signal corresponding to the analog stick signal of a game controller, based on the center positions calculated by the frame difference center calculation section 106.
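The centroid computation performed by a frame difference center calculation section of this kind can be sketched as follows; this is a minimal illustration in which the binary-mask format and the vertical-strip parameters `x0`/`x1` are assumptions, not details taken from the patent.

```python
def difference_center(mask, x0, x1):
    """Centroid (x, y) of changed pixels inside the vertical strip
    [x0, x1) of a binary difference mask, as produced by a frame- or
    background-difference step. Returns None when the strip contains
    no changed pixels (e.g. the player's hand is not in that area)."""
    xs, ys = [], []
    for y, row in enumerate(mask):
        for x in range(x0, x1):
            if row[x]:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Calling this once for the left strip and once for the right strip yields the two hand-center candidates; a `None` result corresponds to the "no difference detected in this area" case discussed later.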
Further, the gesture detection according to this example relies on a functional section that combines the background difference detection result and the frame difference detection result: an operation information conversion section 108 generates the operation information of the game character based on the determination result of the background difference determination section 103 and the analog stick signal from the analog stick input conversion section 107.
Fig. 3 is a diagram showing an example image processing operation for realizing gesture detection using the functional parts shown in Fig. 2.
In the example shown in Fig. 3, the frame difference calculation section 105 calculates the frame difference of the image 201 captured by the camera 2, and the frame difference center calculation section 106 calculates the frame difference centers 204 and 205, corresponding to the positions of the hands (arms) of the player 501, detected within the areas 202 and 203 of predetermined width at the left and right sides, respectively.
Specifically, center positions 206 and 207 indicating height positions (i.e., positions with respect to the up/down direction) of frame difference centers 204 and 205 of the areas 202 and 203, respectively, are calculated, and a height difference (position difference) 208 between the left and right center positions 206 and 207, and an average value (average position value) 209 of the calculated left and right center positions 206 and 207 with respect to the up/down direction are calculated.
The analog stick input conversion section 107 outputs a value obtained by multiplying the difference 208 between the centers 206 and 207 by a predetermined coefficient as a left/right direction signal of the analog stick signal, and outputs a value obtained by multiplying the average 209 of the centers 206 and 207 by a predetermined coefficient as an up/down direction signal of the analog stick signal.
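The conversion just described can be sketched as follows. The coefficients `kx` and `ky` and the neutral line `neutral_y` are assumed constants introduced for illustration (the patent only says "a predetermined coefficient"), and the sign conventions, chosen here so that raised hands push the stick "up" in image coordinates where y grows downward, are likewise assumptions.

```python
def to_analog_stick(left_center_y, right_center_y,
                    kx=0.02, ky=0.02, neutral_y=120):
    """Convert the left/right hand-center heights into an analog-stick
    pair clamped to [-1, 1].

    The left/right axis is the height difference between the two hand
    centers (position difference 208) times a coefficient; the up/down
    axis is their average height (average position value 209), measured
    relative to an assumed neutral line, times a coefficient.
    """
    def clamp(v):
        return max(-1.0, min(1.0, v))

    diff = left_center_y - right_center_y          # position difference 208
    avg = (left_center_y + right_center_y) / 2.0   # average position 209
    x = clamp(kx * diff)
    y = clamp(ky * (neutral_y - avg))              # higher hands -> stick up
    return x, y
```

Because the inputs vary continuously with the hand positions, intermediate stick values are produced naturally, which is what allows the later examples to generate operation information "in an analog manner".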
Further, the background difference calculation section 102 calculates the background difference of the captured image 201, and the background difference determination section 103 determines whether a background difference exists in the left and right areas 202 and 203 and in the area 210.

Note that when gesture detection is limited to gestures based on the positions of the centers 204 and 205 of the left and right hands of the player 501 (for example, extending the arms (hands) in the left-right direction while raising/lowering them, or raising/lowering the left and right arms alternately), it is not necessary to determine whether a background difference exists in the area 210. However, when gesture detection is also configured to detect a gesture in which, for example, both hands (arms) are down to signal a "stop" command, or both hands are up to signal a "sudden acceleration" command, it relies on the presence or absence of a background difference in the areas 202 and 203 and in the area 210. (A detection area for detecting a hand in a lower region is not provided, because there the image of a hand is not easily distinguished from the image of the body, making such a hand position of the player 501 difficult to detect.)

Specifically, when the left and right hands of the player 501 are directed upward or downward, the centers 204 and 205 cannot be detected, since no background difference is found in the areas 202 and 203. In this case, when no background difference is detected in the area 210 either, it can be determined that the hands of the player 501 are down; and when no background difference is detected in the areas 202 and 203 but a background difference is detected in the area 210, it can be determined that the hands of the player 501 are up.
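The three-region decision described above can be sketched as a small classifier. The boolean inputs stand for "a substantial background difference was found" in the areas 202, 203, and 210 of Fig. 3; the returned string labels are illustrative names, not terms used in the patent.

```python
def hands_state(left_diff, right_diff, upper_diff):
    """Classify the coarse hand posture from three region tests:
    substantial background difference in the left area (202), the
    right area (203), and the upper area (210)."""
    if left_diff or right_diff:
        # The hand centers can be tracked normally in areas 202/203.
        return "hands-extended"
    if upper_diff:
        # Nothing in the side areas but something above: hands are up
        # (e.g. a "sudden acceleration" command).
        return "hands-up"
    # No difference anywhere that is checked: hands are down
    # (e.g. a "stop" command).
    return "hands-down"
```

A lower detection area is deliberately absent, mirroring the patent's remark that a hand in front of the body cannot be reliably separated from the body image.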
In the above example, the centers 204 and 205 of the hands of the player 501 are detected based on the frame difference; in another example, however, the detection may be implemented based on the background difference. With the frame difference, when the hand of the player 501 is not moving, no difference can be detected, and in that case the previous detection state is maintained. Detection based on the background difference therefore has the advantage of detecting a gesture even when the hand of the player 501 is not moving. Further, note that the background difference in the detection area 210 may also be replaced with a frame difference; however, in order to achieve more accurate image recognition, it is preferable to use the background difference.
Fig. 4 is a diagram showing an example screen displayed on a monitor screen when gesture detection is performed. In Fig. 4, a game screen is displayed on a monitor screen 331 of the display monitor 3 (see Fig. 1), and an operation confirmation screen 335 for realizing gesture detection is displayed on a part of the monitor screen 331. In the illustrated game screen, a boat 333 driven by a game character (e.g., a monkey) is manipulated along a curved route 332 to capture items (e.g., a banana 334) provided along the route 332.
As shown in the enlarged operation confirmation screen 335 of Fig. 4, marks 338 and 339 indicating the detected positions of the player's hands are arranged in the areas 336 and 337 used for detecting the position of the hand (arm).
Figs. 5A to 5E are diagrams illustrating example gestures detected in the game shown in Fig. 4. Fig. 5A shows a gesture in which both hands are extended and raised, representing a "slow down" command; Fig. 5B shows a gesture in which both hands are extended horizontally, representing a "slight acceleration" command; Fig. 5C shows a gesture in which both hands are extended and lowered, representing an "accelerate" command; Fig. 5D shows a gesture with the left hand up and the right hand down, representing a "turn right" command; and Fig. 5E shows a gesture with the right hand up and the left hand down, representing a "turn left" command. Note that in this example, intermediate (transitional) states of the gestures are also detected, and the operation information is generated in an analog manner.
Fig. 6 is a diagram showing another example screen displayed on a monitor screen when gesture detection is performed. In Fig. 6, a game screen is displayed on a monitor screen 341 of the display monitor 3, and an operation confirmation screen 343 for realizing gesture detection is displayed on a part of the monitor screen 341. The illustrated game screen shows a game character 342 flying in the displayed area.
As shown in the enlarged operation confirmation screen of Fig. 6, the detected positions of the hands are represented by marks 346 and 347 in the areas 344 and 345 used for detecting the positions of the hands (arms) of the player 501. Further, in the illustrated operation confirmation screen, a kick button 348 for abruptly increasing the flying speed is displayed in an upper area. Note that the button 348 is an example of the area 210 shown in Fig. 3.
Figs. 7A to 7I are diagrams illustrating example gestures detected in the game shown in Fig. 6. Fig. 7A shows a gesture in which both hands are extended and raised, representing a "move forward while ascending" command; Fig. 7B shows a gesture in which both hands are extended horizontally, representing a "smooth" (level flight) command; Fig. 7C shows a gesture in which both hands are extended and lowered, representing a "move forward while descending" command; Fig. 7D shows a gesture with the left hand up and the right hand down, representing a "turn right while moving forward" command; Fig. 7E shows a gesture with the right hand up and the left hand down, representing a "turn left while moving forward" command; Fig. 7F shows a gesture with both hands raised, representing a "sudden acceleration" command; Fig. 7G shows a gesture with the left hand down and the right hand horizontal, representing a "turn right in stopped state" command; Fig. 7H shows a gesture with the right hand down and the left hand horizontal, representing a "turn left in stopped state" command; and Fig. 7I shows a gesture representing a "stop" command. Note that in this example, intermediate (transitional) states of the gestures are also detected, and the operation information is generated in an analog manner.
According to an aspect of the present invention, in detecting a posture of a player based on an image captured by a camera in a camera video game, a frame difference and/or a background difference is calculated for an image of a predetermined area of the captured image including a posture image of the player captured by the camera, and operation information corresponding to the posture of the player is generated based on the calculated frame difference and/or background difference. In this way, the same detailed operation as that achieved by the analog stick of the game controller can be achieved, as opposed to merely activating the button item to achieve the operation in the related art. Further, the captured image is not necessarily displayed on a large part of the monitor screen together with the image for realizing the operation, such as the button item, and the operation confirmation screen may be displayed on a small part of the monitor screen, so that most of the monitor screen can be used to display a screen such as a CG screen representing the actual game content.
The background image acquisition required to calculate the background difference is explained below.
Fig. 8 is a flowchart showing the procedure of the background image acquisition process for acquiring the background images necessary for calculating background differences in the video game apparatus shown in Fig. 1. Note that the illustrated background image acquisition process may be executed by the CPU 12 of the video game apparatus main body 1 by running a relevant program.
In the example shown in Fig. 8, when the game is started, a game selection menu is displayed on the display monitor 3 shown in Fig. 1 (step S1), and the game to be played is determined according to a selection made by the player (step S2). Note that this example assumes that a plurality of mini-games is included in one set of game software; in the case where the game software includes only one game, steps S1 and S2 may be omitted.
Then, guidance information for acquiring the left background image is displayed on the display monitor 3 (step S3). Fig. 9 is a diagram showing an example screen displayed when the left background image acquisition process is performed. In Fig. 9, a frame 303 representing a posture (standing position) outline of the player 501 and a button item 304 operated by the player 501 are displayed in a right-side operation area 302 on the right side of the monitor screen 301, and information 306 is displayed reading: "Preparing to start the game. Please stand in the indicated frame, and wave your hand over the OK button after confirming that no moving object appears in the area." Note that the left background acquisition area 305 is set slightly larger than half of the monitor screen 301 so as to provide an overlapping portion with the right background acquisition area 311 (see Fig. 11), as described later. In this way, the influence of noise generated at the boundary between the left and right background acquisition areas 305 and 311 is reduced, and whether the camera was moved during acquisition of the background images can be determined from the image difference between the overlapping portions of the two areas.
Referring back to Fig. 8, it is determined whether the OK button is activated (step S4). If the OK button is not activated, the guidance information for left background image acquisition continues to be displayed (step S3). If the OK button is activated, it is determined from the frame difference detected in the left background acquisition area 305 (i.e., the difference in pixel information between pixels of successive frames) whether any moving object is included in the area (step S5). Note that if a moving object is included in a background acquisition area, the corresponding image cannot be used as a background image for calculating a background difference; therefore, to improve efficiency, the presence of a moving object is checked before the background image is acquired.
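The moving-object check in step S5 amounts to verifying that a strip of the image shows no frame difference before capturing it as background. A minimal sketch, in which the `threshold` and `max_changed` tolerances are assumed tuning parameters rather than values from the patent:

```python
def region_is_static(prev_frame, curr_frame, x0, x1,
                     threshold=30, max_changed=0):
    """Return True when the vertical strip [x0, x1) shows no frame
    difference between two grayscale frames, i.e. no moving object,
    so the strip can safely be captured as a background image."""
    changed = 0
    for prow, crow in zip(prev_frame, curr_frame):
        for x in range(x0, x1):
            if abs(prow[x] - crow[x]) > threshold:
                changed += 1
    return changed <= max_changed
```

If the check fails, the process shows the error screen of Fig. 10 and retries rather than acquiring a contaminated background image.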
When it is determined in step S5 that a moving object is present, an error screen is displayed on the monitor screen 301 for a predetermined time (step S6). Fig. 10 is a diagram showing an example error screen displayed when the left background image acquisition process fails. In Fig. 10, information is displayed on the monitor screen 301 reading: "A moving object was detected in the left area of the screen. The moving object is indicated by a blue dot. Please remove any moving objects and retry."
Referring back to Fig. 8, when it is determined in step S5 that no moving object is included in the left background acquisition area 305, the image of the area 305 is acquired as a background image (step S7).
Then, guidance information for acquiring the right background image is displayed on the display monitor 3 (step S8). Fig. 11 is a diagram showing an example screen displayed when the right background image acquisition process is performed. In Fig. 11, a frame 309 indicating the stance outline of the player 501 and a button item 310 operated by the player 501 are displayed in the left-side operation area on the left side of the monitor screen 301, and in the right background acquisition area 311 on the right side, information 312 is displayed reading: "Please stand in the indicated frame, and wave your hand over the OK button after confirming that no moving object appears in the area."
Then, referring back to Fig. 8, it is determined whether or not the OK button is activated (step S9). If the OK button is not activated, the guidance information for right background image acquisition continues to be displayed (step S8). If the OK button is activated, it is determined from the frame difference detected in the right background acquisition area 311 whether a moving object is included in the area (step S10).
When it is determined in step S10 that a moving object is included in the right background acquisition area 311, an error screen is displayed for a predetermined time (step S11). Fig. 12 is a diagram showing an example error screen displayed when the right background image acquisition process fails. In Fig. 12, information 313 is displayed on the monitor screen 301 reading: "A moving object was detected in the right area of the screen. The moving object is indicated by a blue dot. Please remove any moving objects and retry."
Referring back to Fig. 8, when it is determined in step S10 that no moving object is included in the right background acquisition area 311, the image of the area 311 is acquired as a background image (step S12).
Then, it is determined whether a color difference above a predetermined level is detected between the overlapping portions of the left and right background images (step S13). If such a color difference is detected, an error screen is displayed (step S14), and the operation returns to the step of displaying the guidance information for left background image acquisition (step S3). Fig. 13 is a diagram showing an example screen displayed when the camera was moved between the acquisition of the left background image and the acquisition of the right background image. In Fig. 13, information 314 is displayed on the monitor screen 301 reading: "The camera has been moved. Please restart the preparation for starting the game."
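The camera-movement test in step S13 compares the overlapping strips of the two background images. A minimal sketch, treating the strips as grayscale pixel arrays of equal size; the `max_avg_diff` limit is an assumed parameter standing in for the patent's "predetermined level":

```python
def camera_moved(left_bg_overlap, right_bg_overlap, max_avg_diff=10):
    """Compare the overlapping strips of the left and right background
    images; a mean per-pixel difference above `max_avg_diff` suggests
    the camera moved between the two acquisitions."""
    total, count = 0, 0
    for lrow, rrow in zip(left_bg_overlap, right_bg_overlap):
        for lp, rp in zip(lrow, rrow):
            total += abs(lp - rp)
            count += 1
    return (total / count) > max_avg_diff
```

A real implementation would compare color channels rather than a single luminance value, but the structure of the decision is the same.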
Referring back to Fig. 8, when no color difference exceeding the predetermined level is detected between the overlapping portions of the left and right background images, the background image acquisition process ends and the game is started (step S15).
Fig. 14 is a diagram showing an example screen displayed when only the images of the left and right edge areas are acquired in the background image acquisition process. In the process described with reference to Figs. 8 to 13, a background image of the entire area captured by the camera is acquired; however, some games may require a background difference only for a particular portion of the captured area. In that case, a background image of only that portion may be acquired.
In the example shown in Fig. 14, background images of the background acquisition areas 315 and 316, which have a predetermined width and correspond to the left and right edge areas, are acquired. In this case, a frame 318 representing the posture outline of the player 501 and the button item 310 operated by the player 501 are displayed in the center operation area 317, and information 320 is displayed reading: "Preparing to start the game. Please wave your hand over the OK button after confirming that no moving object appears in the left and right areas."
According to an aspect of the present invention, operation information corresponding to a gesture is generated based on a frame difference and/or a background difference of an image (i.e., a pixel information difference between pixels of a background image and a current image) in a predetermined region of an image captured by a camera, wherein the captured image includes an image of a gesture of a player. Therefore, the same detailed operation as that realized by the analog stick of the game controller can be realized, and it is not necessary to display the image captured by the camera on the entire monitor screen.
Note that embodiments within the scope of the present invention include a gesture detection method, a video game apparatus, a gesture detection program, and a computer-readable medium containing a computer program. The gesture detection program may be embodied in any computer-readable medium carrying or having computer-executable instructions or data structures thereon. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer. By way of example, and not limitation, such computer-readable media may comprise physical storage media such as RAM, ROM, EEPROM, CD-ROM, other optical disk storage, other magnetic storage devices, or any other medium which can be used to carry or store desired program code in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Such a medium may also include, for example, a wireless carrier signal. When information is transferred or provided over a network or another communications connection (hardwired, wireless, or a combination thereof) to a computer, the computer properly views the connection as a computer-readable medium; thus, any such connection is properly termed a computer-readable medium. Combinations of the above should also be included within the scope of computer-readable media. Computer-executable instructions include, for example, instructions and data which cause a general purpose computer, special purpose computer, or processing device to perform a certain function or group of functions.
Although the invention has been shown and described with respect to certain preferred embodiments, equivalent embodiments and modifications will occur to others skilled in the art upon reading and understanding this specification. The present invention includes all such equivalent embodiments and modifications, and is limited only by the scope of the claims.
Claims (4)
1. A gesture detection method implemented in a video game using input images provided by a camera to detect a gesture of a player, the method comprising the steps of:
calculating a left frame difference or a background difference of a left image corresponding to a left area of an image captured by a camera, and a right frame difference or a background difference of a right image corresponding to a right area of the captured image, wherein the captured image includes a player image representing a posture of a player;
calculating a left center position of the left frame difference or the background difference and a right center position of the right frame difference or the background difference;
calculating a position difference in the up-down direction between the left center position and the right center position;
calculating an average position value in the up-down direction of the left center position and the right center position;
generating operation information corresponding to the posture of the player from the calculated position difference and the calculated average position value;
converting a value based on the calculated position difference and the calculated average position value into an analog stick signal;
calculating an additional region background difference of an additional region other than the left region and the right region in the captured image; and
when it is determined that the left background difference and the right background difference are not substantial, generating operation information corresponding to the posture of the player based on whether the additional region background difference is substantial.
2. The gesture detection method according to claim 1, characterized in that:
the captured image is displayed in a part of the monitor screen as an operation confirmation screen.
3. The gesture detection method according to claim 1, further comprising the steps of:
displaying at least one marker indicating the position of the player's hand identified by at least one of the left center position and the right center position.
4. A video game apparatus having a CPU that executes a video game using an input image provided by a camera, the apparatus comprising:
a left frame difference and background difference calculation section for calculating a left frame difference or a background difference of a left image corresponding to a left area of an image captured by the camera; and a right frame difference and background difference calculation section for calculating a right frame difference or a background difference of a right image corresponding to a right area of the captured image, wherein the captured image includes a player image representing a posture of the player; the video game apparatus further comprising:
a center position calculating section for calculating a left center position of the left frame difference or the background difference and a right center position of the right frame difference or the background difference;
a position difference calculating section for calculating a position difference in an up-down direction between the left center position and the right center position;
an average position value calculating section for calculating an average position value in the up-down direction of the left center position and the right center position;
an operation information generation section for generating operation information corresponding to a posture of the player based on the calculated position difference and the calculated average position value;
an analog stick signal converting section for converting a value based on the position difference and the average position value into an analog stick signal;
a first background difference calculation section for calculating a left background difference of the left area and a right background difference of the right area;
a second background difference calculation section for calculating an additional region background difference of an additional region other than the left region and the right region in the captured image;
wherein, when it is determined that the left background difference and the right background difference are not substantial, the operation information generation section is configured to generate the operation information corresponding to the posture of the player based on whether the additional region background difference is substantial.
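The pipeline of claims 1 and 4 — taking the centroids of the left and right differences, computing their vertical gap and average, and converting the result to an analog-stick signal — can be sketched roughly as follows. All names, the gain factor, and the normalisation are our illustrative assumptions, not the patent's:

```python
def centroid_y(changed_pixels):
    """Vertical centre of a list of (x, y) changed-pixel coordinates."""
    if not changed_pixels:
        return None
    return sum(y for _, y in changed_pixels) / len(changed_pixels)

def pose_to_stick(left_pixels, right_pixels, screen_height, gain=2.0):
    """Map left/right difference centroids to an analog-stick (x, y) pair.

    The vertical gap between the two centroids drives one stick axis
    (e.g. body lean), and their average height drives the other
    (e.g. crouch/stand); both outputs are clamped to [-1.0, 1.0].
    """
    left_y = centroid_y(left_pixels)
    right_y = centroid_y(right_pixels)
    if left_y is None or right_y is None:
        return (0.0, 0.0)          # no motion detected in one region
    diff = (left_y - right_y) / screen_height        # lean left/right
    avg = (left_y + right_y) / 2 / screen_height     # height in frame
    stick_x = max(-1.0, min(1.0, gain * diff))
    stick_y = max(-1.0, min(1.0, gain * (avg - 0.5)))
    return (stick_x, stick_y)
```

With this mapping, raising one arm while lowering the other tilts the virtual stick sideways, and raising or lowering both arms together pushes it up or down, which matches the claimed use of the position difference and average position value.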
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2004-255948 | 2004-09-02 | ||
JP2004255948A JP2006068315A (en) | 2004-09-02 | 2004-09-02 | POSE DETECTION PROGRAM, VIDEO GAME DEVICE, POSE DETECTION METHOD, AND COMPUTER-READABLE RECORDING MEDIUM CONTAINING THE PROGRAM |
Publications (2)
Publication Number | Publication Date |
---|---|
HK1086213A1 (en) | 2006-09-15 |
HK1086213B (en) | 2011-03-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7559841B2 (en) | Pose detection method, video game apparatus, pose detection program, and computer-readable medium containing computer program | |
US7785201B2 (en) | Background image acquisition method, video game apparatus, background image acquisition program, and computer-readable medium containing computer program | |
US8780052B2 (en) | Input data processing program and information processing apparatus | |
US10421013B2 (en) | Gesture-based user interface | |
EP2394710B1 (en) | Image generation system, image generation method, and information storage medium | |
JP5256269B2 (en) | Data generation apparatus, data generation apparatus control method, and program | |
JP5509227B2 (en) | Movement control device, movement control device control method, and program | |
US6923722B2 (en) | Game system and game program for providing multi-player gameplay on individual displays and a common display | |
JP2011258158A (en) | Program, information storage medium and image generation system | |
US20120172127A1 (en) | Information processing program, information processing system, information processing apparatus, and information processing method | |
US6692357B2 (en) | Video game apparatus and method with enhanced player object action control | |
US20140035813A1 (en) | Input device, input method and recording medium | |
JP2010137097A (en) | Game machine and information storage medium | |
US6793576B2 (en) | Methods and apparatus for causing a character object to overcome an obstacle object | |
WO2011158599A1 (en) | Video game device, video game control program, and video game control method | |
US8926427B2 (en) | Video game with screen flip and dual sets of collision data | |
HK1086213B (en) | Pose detection method and video game apparatus | |
US11654355B2 (en) | Operation input program and operation inputting method | |
HK1086214B (en) | Background image acquisition method and video game apparatus | |
JP5213913B2 (en) | Program and image generation system | |
JP2001224731A (en) | Game device and information storage medium |