
WO2002030535A1 - Method for displaying and evaluating motion data usable in a motion game apparatus - Google Patents


Info

Publication number
WO2002030535A1
WO2002030535A1 · PCT/KR2001/001710 · KR0101710W
Authority
WO
WIPO (PCT)
Prior art keywords
motion
original actor
displaying
position points
game player
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/KR2001/001710
Other languages
English (en)
Inventor
Gerard Jounghyun Kim
Ungyeon Yang
Euijae Ahn
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dotacecom Co Ltd
Original Assignee
Dotacecom Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dotacecom Co Ltd filed Critical Dotacecom Co Ltd
Priority to AU2001294329A priority Critical patent/AU2001294329A1/en
Publication of WO2002030535A1 publication Critical patent/WO2002030535A1/fr
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • A63F13/10
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/25Output arrangements for video game devices
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45Controlling the progress of the video game
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/428Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1012Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals involving biosensors worn by the player, e.g. for measuring heart beat, limb activity
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8005Athletics

Definitions

  • the present invention relates to a method of playing and evaluating motion data in a motion game apparatus, and more particularly to methods of playing the motion data so that a game player can easily follow the motion of an original actor, and of evaluating the motion of the game player following the motion data.
  • recently, DDR game apparatuses have been in fashion. Such an apparatus has a music play device and a floor body sensing the foot actions of a game player.
  • the foot action that the game player is to follow is indicated to the game player either on a monitor or by foot-action directions on the floor body, together with music played by the music play device, and the game player follows the foot action in time with the music.
  • an apparatus in which a sensor for sensing hand-stretching actions as well as the foot actions of the game player is installed at certain positions has also been introduced. It can check both foot and hand motions using the sensors.
  • the one object of the present invention can be achieved by a method of displaying motion data in a motion game apparatus having information of a basic frame displaying the main motion of an original actor, and playing the motion data of the original actor, consisting of a plurality of frames, on a display device, comprising: a first step of setting play position points Xf, Yf and Zf on the display device for playing the continuous action of the original actor, and initial position points Xi, Yi and Zi on the display device for displaying beforehand the basic frame which will be played after a predetermined time Δt; and a second step of displaying the continuous action of the original actor at the play position points Xf, Yf and Zf based on the present time t, and simultaneously displaying the basic frame at the initial position points Xi, Yi and Zi after drawing out the basic frame which will be displayed at t + Δt.
  • the one object of the present invention can be achieved by a method of displaying motion data in a motion game apparatus having information of a basic frame displaying the main motion of an original actor, consisting of a plurality of frames, having play position points Xf, Yf and Zf on a display device for playing the continuous action of the original actor and initial position points Xi, Yi and Zi on the display device for displaying beforehand the basic frame which will be played after a predetermined time Δt, and displaying the motion data of the original actor on the display device, comprising: a first step of continuously displaying the continuous action of the original actor at the play position points Xf, Yf and Zf based on a present time t, and simultaneously displaying the basic frame at the initial position points Xi, Yi and Zi after drawing out the basic frame which will be displayed at t + Δt; and a second step of displaying the basic frame displayed at the initial position points Xi, Yi and Zi while gradually moving it to the play position points Xf, Yf and Zf.
  • the other object of the present invention can be achieved by a method of evaluating the motion of a game player following the motion data of an original actor displayed on a display device in a motion game apparatus having the information of a basic frame displaying the main motion of the original actor, consisting of a plurality of frames, and setting play position points Xf, Yf and Zf on the display device for playing the continuous action of the original actor and initial position points Xi, Yi and Zi on the display device for displaying beforehand the basic frame which will be played after a predetermined time Δt, comprising: a first step of storing the three-dimensional motion data of the original actor retargeted by converting the three-dimensional motion data of the original actor to reflect the body size of the game player; and a second step of continuously displaying the continuous action of the original actor at the play position points Xf, Yf and Zf based on the present time t using the retargeted three-dimensional motion data of the original actor, and simultaneously displaying the basic frame at the initial position points Xi, Yi and Zi after drawing out the basic frame which will be displayed at t + Δt.
  • FIG.1 is a configuration diagram of a motion game apparatus of the present invention.
  • FIG. 2 is a block diagram of a motion game apparatus applied to the present invention.
  • FIG. 3 is a flow chart showing retargeting processing sequence.
  • FIG. 4 is a frame structure diagram for describing one example of an image frame configuration of a three-dimensional original actor according to the present invention.
  • FIG. 5 and FIG. 6 are description drawings for describing a method of displaying preparation action of an original actor.
  • FIG. 7 is a flow chart showing overall action displaying and evaluating a motion using a motion frame of FIG. 4.
  • FIG. 8 is a flow chart describing a method of evaluating a motion of a game player.
  • FIG. 9a to FIG. 9d are flow charts showing the performance sequence of the present invention in detail.
  • FIG. 1 is a configuration diagram of a motion game apparatus of the present invention.
  • the motion game apparatus includes a plurality of cameras 100, a display device 200 displaying a motion of an original actor, an input section receiving input from a game player, and a sound device.
  • the game player attaches a plurality of optical sensors (not shown) to his body and performs motions within a certain region that can be sensed by the cameras.
  • the camera 100 monitors the motion of the game player using the optical sensors attached to the body of the game player.
  • the optical sensors are attached to all articulation parts of the game player, if possible. More preferably, the number of optical sensors is appropriately selected in view of the processing rate, because the number of image processing operations increases as the number of optical sensors increases. Furthermore, while it is possible to obtain more precise data as the number of cameras increases, it is preferable to appropriately select the number of cameras in view of the same problem as described above for the number of sensors.
  • FIG. 2 is a block diagram of a motion game apparatus applied to the present invention.
  • the motion game apparatus of the present invention includes an input section receiving an input from a game player, a control section, a sensor/camera, an image output device, a music DB, a character/stage DB, a sound output device, and an image memory.
  • the control section has a motion selection section outputting a motion selection signal in accordance with an output of the input section, a motion data storage section storing a plurality of motion data of an original actor, a retargeting processing section retargeting the motion of the original actor, a retargeted original actor motion data storage section storing a retargeted motion data, a game player motion data storage section storing the motion data of the game player, a motion evaluation section evaluating after comparing the retargeted motion data of the original actor with the motion data of the game player, a motion capture section capturing the motion of the game player, and a resultant image output section displaying the output of the motion evaluation section on the image output device.
  • the operation flow of the motion game apparatus in FIG. 2 will now be explained. First, the game player selects one of the plurality of motion data using the input section.
  • the motion select section selects the one of the plurality of motion data stored to the motion data storage section and then inputs the selected data to the retargeting processing section.
  • the retargeting processing section receives a body size from the game player or senses the body size by image processing using the camera, extracts the body size difference between the game player and the original actor, carries out the retargeting process on the motion data of the original actor, and stores the retargeted motion data of the original actor in the retargeted original actor motion data storage section. A specific description of the retargeting will be given with reference to FIG. 3 below.
  • the retargeted motion of the original actor is provided to the game player by the image processing section, the image memory, and the image output device.
  • the basic frame for main motion which will be provided is provided to the game player by a frame storage section and a sliding control section so that the game player can easily follow the motion of the original actor.
  • the motion of the game player is stored in the game player motion data storage section in real time.
  • the motion evaluation section evaluates the motion of the game player after comparing the motion of the game player with the retargeted motion data of the original actor. Finally, the result is provided to the game player by the image output device.
  • the retargeting must be performed; that is to say, retargeting is a process that removes the body size difference.
  • two data with respect to the body sizes of the original actor and the game player are received, either input by a user or obtained by image capture of the camera using image processing.
  • the two data are analyzed, and the motion data of the original actor is converted so as to be appropriate to the body size of the game player.
  • the three-dimensional motion data of the original actor is magnified or reduced based on a specific part (mostly, the waist) of the three-dimensional motion data of the game player, and then the center of the magnified or reduced three-dimensional motion data of the original actor is moved.
  • the retargeted motion data of the original actor may be obtained by performing a conversion that satisfies a restriction condition.
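As a rough illustration of the scale-and-recenter retargeting described above, the sketch below uniformly scales an actor's joint positions by the ratio of the two body heights and then translates the result so the scaled reference part (the waist, taken here by convention as the first point) coincides with the player's waist. The function name and the flat point-list representation are assumptions for illustration, not the patent's actual data format.

```python
def retarget(actor_points, actor_height, player_height, player_waist):
    """Scale the actor's 3-D joint positions to the player's body size,
    then translate so the scaled waist (first point, by convention here)
    coincides with the player's waist position."""
    scale = player_height / actor_height
    scaled = [(x * scale, y * scale, z * scale) for (x, y, z) in actor_points]
    # Move the whole pose so its reference point matches the player's waist.
    wx, wy, wz = scaled[0]
    dx, dy, dz = (player_waist[0] - wx, player_waist[1] - wy, player_waist[2] - wz)
    return [(x + dx, y + dy, z + dz) for (x, y, z) in scaled]
```

For example, a pose captured from a 100 cm-tall actor, retargeted to a 110 cm-tall player, keeps the relative proportions of the pose while shifting it onto the player's waist position.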
  • FIG. 4 is a frame structure diagram for describing one example of an image frame configuration of a three-dimensional original actor according to the present invention.
  • each of the boxes designates a motion frame stored at unit time intervals, and each of the numbers on the boxes shows a frame number.
  • the image frame of an original actor consists of a start frame, an end frame and motion frames.
  • the motion frames are divided into basic motion frames and common motion frames.
  • a special motion (namely, a main motion of the dance actions) among these motion frames is defined as a basic motion frame.
  • the information with respect to the basic motion frame may be marked by an expert (content maker), using a management program marking "a specific pose", at the time of creating or storing the motion data.
  • the information with respect to the basic motion frame may be included in a portion of the motion frame displaying the motion of the original actor, or may be used as an additional file.
  • the motion frame is generally stored in the form of a file for displaying a motion image, such as, for example, a BVH file.
  • The format of the BVH file first describes a tree structure consisting of the several sensors designating each part of the body, and then lists information on the angle and coordinate value versus time of the sensors corresponding to each part of the body. The coordinates versus time can be understood from this BVH file.
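As a hedged sketch of reading such coordinate-versus-time data, the snippet below extracts only the MOTION section of BVH-formatted text (frame count, frame time, and the per-frame channel values). Hierarchy parsing is deliberately omitted; a real reader would also parse the HIERARCHY section to map each channel value to its joint.

```python
def read_bvh_motion(text):
    """Minimal sketch: parse the MOTION section of BVH-formatted text.
    Returns (frame_time, frames), where frames is a list of per-frame
    channel value lists. Hierarchy parsing is deliberately omitted."""
    lines = [ln.strip() for ln in text.splitlines() if ln.strip()]
    start = lines.index("MOTION")
    n_frames = int(lines[start + 1].split()[-1])      # "Frames: N"
    frame_time = float(lines[start + 2].split()[-1])  # "Frame Time: 0.033"
    frames = [[float(v) for v in ln.split()]
              for ln in lines[start + 3 : start + 3 + n_frames]]
    return frame_time, frames
```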
  • FIG. 5 and FIG. 6 are description drawings for describing a method of displaying preparation action of an original actor.
  • In FIG. 5, (a) is an original actor motion frame displaying the motion of the original actor,
  • (b) is a basic motion frame displaying an information on the basic motion of the motion frame of the original actor
  • (c) shows a game player motion frame storing the motion of a game player following the motion of the original actor, respectively.
  • 5th, 10th, 17th, 19th, 21st, 25th, 27th, 28th, 31st, 37th, 38th, 40th, 42nd and 48th frames are designated as the basic frames.
  • ⁇ t is to display time interval for detecting the basic motion frame.
  • ⁇ t shows the state that 6th frame is ruled of the time displayed.
  • the 16th motion frame of the original actor is provided to the game player at the present time t, and the basic motion frames (the 17th, 19th and 21st frames) between t and t + Δt are simultaneously displayed, so that they are shown to the game player beforehand.
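The frame-selection rule described above can be sketched as follows: at each present time t, the current motion frame is shown at the play position, and every basic frame whose frame number falls within the look-ahead window (t, t + Δt] is previewed at the initial position. Representing times directly as frame numbers is an assumption for illustration.

```python
def frames_to_show(t, dt, basic_frames):
    """Return (current_frame, upcoming_basic_frames): the frame played at
    time t plus the basic frames in the look-ahead window (t, t + dt],
    which are previewed at the initial position points."""
    upcoming = [f for f in basic_frames if t < f <= t + dt]
    return t, upcoming
```

With the basic frames of FIG. 5 and a window of six frames, time t = 16 previews frames 17, 19 and 21, matching the example in the text.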
  • FIG. 6 is a description drawing for describing a method of displaying such points on a display device. As shown in FIG. 6 (a), continuous motions (the 16th frame in FIG. 5) are displayed at the play position points.
  • a coordinate value Yi on the Y axis at which the guide action starts and a coordinate value Yf on the Y axis of the screen displaying the continuous actions in real time must satisfy the following equation (1).
  • the coordinate value Yi on the Y axis at which the guide action starts and the coordinate value Yf on the Y axis of the screen displaying the continuous actions in real time must satisfy the following equation (2).
  • the direction of a vector formed by the start points Xi, Yi and Zi and the end points Xf, Yf and Zf can be easily determined by analyzing motion data of the original actor, such as BVH data, which includes the position values of the sensors attached to the articulation parts of the original actor. Therefore, when the guide action is displayed in view of the direction of this vector, it is possible to display the guide action more realistically. Furthermore, it is preferable to locate the position points Xf, Yf and Zf, at which the continuous actions of the original actor are displayed, at the center of the screen, for the convenience of the game player. Moreover, it is preferable to locate the start points Xi, Yi and Zi appropriately in view of the number of preparation action frames to be displayed and the horizontal size of the screen. The eye height He of the game player can be easily calculated using a statistical value relating the height of the game player to his eye position. Here, the height of the game player is either input by the game player or computed using a camera.
  • FIG. 7 is a flow chart showing overall action displaying and evaluating the motion using the motion frame of FIG. 4.
  • a motion data, a music DB and a character DB are selected from an input section by the game player (step S71).
  • the body size of an original actor is compared with that of the game player using the body size inputted by the game player or obtained by analyzing the image of the camera (step S72).
  • a three-dimensional image coordinate value data of the original actor is converted using this comparison value (step S73).
  • since the number and positions of the sensors attached to the game player differ from the number and positions of the sensors attached to the original actor for capturing the motions of the original actor, it is first necessary to make the targets to be compared agree, for the purpose of evaluating them appropriately.
  • the position data of the reference sensor is calculated from the motion data of the original actor based on the number and positions of the sensors attached to the game player (step S74). Thereafter, the three-dimensional motion data of the original actor is displayed in real time, and the guide action is simultaneously displayed (step S75). While the three-dimensional motion of the original actor may be displayed prior to retargeting the motion of the original actor, it is preferable to display the three-dimensional motion of the original actor after retargeting. In the case of displaying the three-dimensional motion of the original actor prior to retargeting, while it is possible to substantially display the motion of the original actor, it is difficult to compare it accurately with the motion of the game player because of the body size difference.
  • the motion of the game player following the motion of the original actor displayed on the screen is stored in real time, frame by frame (step S76).
  • whether or not the three-dimensional motion frame of the original actor displayed at time t is the basic motion frame is determined (step S77).
  • the motion frame of the game player, identified in real time, is compared with the position data of the reference sensor calculated at step S74, and is evaluated (step S78).
  • the steps S76 and S77 may be performed regardless of sequence.
  • the step S74 may be also performed after step S77.
  • the calculated resultant value can be compared with the sensor position obtained from the motion of the game player and evaluated.
  • when step S76 or step S74 is performed after step S77, only the motion frame of the game player following the basic motion frame of the original actor is stored, so that only the position data of the reference sensor need be computed.
  • Thereafter, whether or not the frame is the last frame is determined (step S79). If it is not the last frame, step S75 is started again; if it is the last frame, the final result is displayed using the resultant values of the comparison and evaluation (step S80).
  • since the comparison and evaluation of whether or not the motion of the game player agrees with the basic motion frame of the original actor must be performed in real time within a short time, the procedure proposed in FIG. 8 is used for the purpose of reducing the time required for the image evaluation procedure.
  • The step of computing the position value of the reference sensor (step S81)
  • the three-dimensional position value of the reference sensor is computed from the reference motion data of the original actor based on the number and positions of the sensors attached to the game player. This is defined as the position value of the reference sensor. It is assumed, to facilitate description, that the position value of the reference sensor for a sensor 1 attached to the game player is (X1, Y1, Z1).
  • The step of calculating two-dimensional coordinate values corresponding to each of the cameras (step S82)
  • a plurality of cameras are used for identifying the three-dimensional positions of the sensors attached to the game player.
  • the three-dimensional image is obtained by synthesizing the two-dimensional images obtained from the plurality of cameras.
  • conversely, the two-dimensional coordinate value corresponding to the sensor position seen by each of the cameras can be calculated from the three-dimensional image of the original actor.
  • a two-dimensional coordinate value (X11, Y11) displayed on a first camera, a two-dimensional coordinate value (X21, Y21) displayed on a second camera, ..., and a two-dimensional coordinate value (Xn1, Yn1) displayed on an nth camera can be respectively calculated from the reference position value (X1, Y1, Z1) of step S81 by this image processing method.
  • The step of calculating a target region for each of the cameras (step S83)
  • a certain region is set at the exact position where a sensor is expected to be discovered by each of the cameras. For example, when a game player exactly follows the motion of an original actor, the exact position value of the sensor 1 discovered by the camera 1 must be (X11, Y11). However, it is impossible for the game player to follow the motion exactly, without some error. Accordingly, when a certain region is set with, for example, a margin of 5 based on (X11, Y11), a square region consisting of (X11 - 5, Y11 - 5), (X11 + 5, Y11 - 5), (X11 + 5, Y11 + 5), and (X11 - 5, Y11 + 5) is selected. The selected region is referred to as a target region for convenience.
  • The step of checking whether or not the sensor is detected in the target region (step S84)
  • whether or not the image coordinate of the sensor attached to the game player falls within the target region set at step S83 is checked in the image data of each of the cameras.
  • with a conventional image processing method, in which the overall region of the image data output from each of the cameras is checked, the position value of the sensor attached to the game player is identified and compared with the sensor position value of the original actor, a long time is required for the image processing. Accordingly, the target region where the sensor is expected to be discovered is set beforehand. By checking not the overall region but only this target region, the image processing time can be considerably reduced.
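Steps S83 and S84 can be sketched as a simple bounds check: build a square target region of a given margin around the expected two-dimensional sensor position for one camera, then test whether any detected sensor coordinate falls inside it, instead of scanning the whole image. The function names and the margin value are illustrative assumptions.

```python
def target_region(expected, margin=5):
    """Square region around the expected 2-D sensor position (step S83)."""
    x, y = expected
    return (x - margin, y - margin, x + margin, y + margin)

def sensor_in_region(detections, region):
    """Check whether any detected 2-D sensor coordinate falls inside the
    target region (step S84), avoiding a scan of the full image."""
    x0, y0, x1, y1 = region
    return any(x0 <= x <= x1 and y0 <= y <= y1 for (x, y) in detections)
```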
  • the three-dimensional position information can be obtained by combining the two-dimensional data detected in this manner. Namely, a method can be used in which the three-dimensional information to be finally used is calculated from the two-dimensional detection information obtained from at least two cameras for each sensor, and the three-dimensional data of the original actor is then compared with the calculated information. Since an algorithm that obtains three-dimensional information from a plurality of two-dimensional observations follows conventional theory in the computer vision field, the description of this algorithm is omitted here.
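As a deliberately simplified illustration of combining two-dimensional detections into a three-dimensional position, the sketch below assumes two orthographic cameras at right angles: a front camera observing (X, Y) and a side camera observing (Z, Y). Real systems use calibrated perspective cameras and triangulation, as the text notes; this setup only shows the principle of merging two 2-D views.

```python
def combine_views(front_xy, side_zy):
    """Combine two orthogonal orthographic views into a 3-D point.
    front_xy: (x, y) seen by a camera looking along the Z axis.
    side_zy:  (z, y) seen by a camera looking along the X axis.
    The two Y observations are averaged to damp measurement noise."""
    x, y1 = front_xy
    z, y2 = side_zy
    return (x, (y1 + y2) / 2.0, z)
```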
  • FIG. 9 is a flow chart showing the performance sequence of the present invention in detail. All the body actions of a user can be tracked, using music selected by the user and a high-performance motion capture system (steps S101 and S102). A character and a stage, which can be applied when viewing this apart from the image of the original actor, are then selected (steps S103 and S104). Next, the motion data of the original actor is read from the motion capture DB (step S105), and a character image is overlaid on the motion data of the original actor (step S106). Since the size of the image file increases when the image of the original actor is used as such, the amount of data processing can be reduced by overlaying a character image stored beforehand at the motion coordinates of the original actor.
  • an operation matching the motion data with the music data is performed (step S107).
  • a display device is used for showing the motion data of the original actor, and music is output by a sound device (step S108).
  • the user performs motions according to the output data (step S109).
  • all the body motions are captured by a plurality of cameras, a mark is identified from the image, and the motion is tracked frame by frame (steps S110 and S111).
  • the two-dimensional coordinate values are extracted using the image data obtained by this method, and the three-dimensional motion coordinate values are obtained from the extracted data (steps S113 and S114).
  • the body size of the original actor is compared with that of the learner (steps S115 and S116).
  • retargeting is performed (steps S117 and S118).
  • the retargeting can be performed during the motion capture of the learner, in accordance with the flow shown in FIG. 9, or prior to carrying out the motion capture.
  • whether or not the data of the original actor agree with the data of the learner is checked by comparing the two data (steps S119 and S120).
  • when the two data agree with each other, a score is given and the total agreed time is counted, thereby displaying an action agreement message or advertisement objects obtained using the advertisement object production DB (steps S122 to S124).
  • the motion capture data is stored in a user motion storage section, and then whether or not the frame is the last frame is checked (steps S125 and S126). Thereafter, if it is not the last frame, the next motion is captured; if it is the last frame, the game is over.
  • when the motion data of the original actor does not agree with the motion data of the learner, after checking whether or not a function ending the game in accordance with the disagreement time is set (step S127), the game is ended intentionally if the two data disagree more than a certain number of times.
  • As described above, the motion of a game player following the motion of a dancer can be evaluated in real time by the motion game apparatus.
  • A method of playing motion data is proposed so that the game player can easily follow the motion of the original actor in a motion game apparatus that displays the motion of the original actor, thereby improving usability. Moreover, the motion of the game player following the motion of the original actor is evaluated rapidly and in real time, which enhances the utility of the motion game apparatus.
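The agreement check and scoring of steps S115 through S127 can be sketched roughly as follows. This is an illustrative reconstruction only, not the patented implementation: the function names, the uniform `scale` factor used for retargeting, the per-joint distance `tolerance`, and the consecutive-miss game-over rule are all assumptions of this sketch.

```python
# Illustrative per-frame loop: retarget the original actor's motion to the
# learner's body size, compare frame by frame, score agreements, store the
# learner's motion, and end the game early on sustained disagreement.

def retarget(actor_frame, scale):
    """Scale the actor's joint coordinates to the learner's body size
    (assumed stand-in for steps S115-S118)."""
    return [(x * scale, y * scale, z * scale) for (x, y, z) in actor_frame]

def frames_agree(a, b, tolerance=0.5):
    """Two frames agree if every corresponding joint coordinate is within
    `tolerance` (assumed stand-in for steps S119-S120)."""
    return all(
        max(abs(pa - pb) for pa, pb in zip(ja, jb)) <= tolerance
        for ja, jb in zip(a, b)
    )

def play_game(actor_motion, learner_motion, scale=1.0, max_misses=3):
    """Return (score, frames_played); ends early after `max_misses`
    consecutive disagreements (assumed stand-in for step S127)."""
    score = 0
    misses = 0
    stored = []  # user motion storage section (step S125)
    for frame_no, (actor_frame, learner_frame) in enumerate(
            zip(actor_motion, learner_motion), 1):
        target = retarget(actor_frame, scale)
        if frames_agree(target, learner_frame):
            score += 1   # give a score on agreement (step S122)
            misses = 0
        else:
            misses += 1  # count consecutive disagreements
            if misses >= max_misses:
                return score, frame_no  # game over by disagreement
        stored.append(learner_frame)
    return score, len(stored)
```

A frame here is simply a list of (x, y, z) joint positions; a real system would compare tracked marker coordinates per frame against the retargeted reference motion in the same way.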

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present invention relates to methods of displaying and evaluating motion data in a motion game apparatus. The method comprises several steps, and the display method helps the game player easily follow the motion of a dancer. The first step is to compute the starting sensing position to be detected when a game player follows the motion of a dancer. The second step is to compute the sensing position of each camera so as to obtain the starting sensing position. The third step is to establish the target domain of each camera, whose boundary lies around the sensing position of that camera. The fourth step is to judge whether the sensor of the game player is detected within the target domain.
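As a rough illustration of the evaluation steps summarized above (all names, and the spherical shape assumed for the target domains, are inventions of this sketch, not taken from the patent):

```python
# Each camera has a sensing position; a target domain is a bounded region
# around it. A player's sensor counts as detected when it falls inside
# every camera's target domain (simplified here to spheres in 3D).
import math

def target_domain(sensing_position, radius):
    """A spherical target domain whose boundary lies around the sensing position."""
    return (sensing_position, radius)

def sensor_detected(sensor_position, domains):
    """Judge whether the sensor lies within all target domains."""
    return all(
        math.dist(sensor_position, center) <= radius
        for center, radius in domains
    )
```

For example, with two cameras whose sensing positions are close together, a sensor reading near both positions would be judged detected, while one far outside either boundary would not.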
PCT/KR2001/001710 2000-10-11 2001-10-11 Procédé permettant d'afficher et d'évaluer des données de mouvement utilisable dans un appareil de jeu d'action Ceased WO2002030535A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2001294329A AU2001294329A1 (en) 2000-10-11 2001-10-11 Method of displaying and evaluating motion data used in motion game apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR2000/59646 2000-10-11
KR10-2000-0059646A KR100412932B1 (ko) 2000-10-11 2000-10-11 모션 게임 장치의 모션 데이터 디스플레이 방법 및 평가방법

Publications (1)

Publication Number Publication Date
WO2002030535A1 true WO2002030535A1 (fr) 2002-04-18

Family

ID=19692866

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2001/001710 Ceased WO2002030535A1 (fr) Procédé permettant d'afficher et d'évaluer des données de mouvement utilisable dans un appareil de jeu d'action

Country Status (3)

Country Link
KR (1) KR100412932B1 (fr)
AU (1) AU2001294329A1 (fr)
WO (1) WO2002030535A1 (fr)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004007034A1 (fr) * 2002-07-12 2004-01-22 Awaba Group Pty Ltd Dispositif d'apprentissage de la danse
GB2495551A (en) * 2011-10-14 2013-04-17 Sony Comp Entertainment Europe A motion comparison arrangement with variable error tolerance
US8690670B2 (en) 2007-06-14 2014-04-08 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8874243B2 (en) 2010-03-16 2014-10-28 Harmonix Music Systems, Inc. Simulating musical instruments
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
KR101271688B1 (ko) 2011-11-03 2013-06-05 삼성중공업 주식회사 플로팅 도크의 모션 측정 장치 및 방법
KR101711488B1 (ko) * 2015-01-28 2017-03-03 한국전자통신연구원 동작 기반 인터랙티브 서비스 방법 및 시스템
KR102760663B1 (ko) * 2016-11-29 2025-01-24 주식회사 넥슨코리아 게임 제공 장치 및 게임 제공 방법
CN112069075B (zh) * 2020-09-09 2023-06-30 网易(杭州)网络有限公司 游戏角色的时装测试方法、装置和游戏客户端
KR102456055B1 (ko) * 2020-09-28 2022-10-19 한국생산기술연구원 반복 운동의 동작 훈련을 위한 정량적 자세 분석 및 평가 장치 및 방법
KR102396882B1 (ko) * 2020-10-08 2022-05-11 주식회사 참핏 신체 유연성 평가 장치, 시스템 및 방법
KR102718541B1 (ko) * 2021-10-08 2024-10-17 에스와이엠헬스케어 주식회사 키오스크 기반의 자세평가 장치

Citations (5)

Publication number Priority date Publication date Assignee Title
JPH03224580A (ja) * 1990-01-31 1991-10-03 Fuji Electric Co Ltd 動画像の処理方法
US5554033A (en) * 1994-07-01 1996-09-10 Massachusetts Institute Of Technology System for human trajectory learning in virtual environments
JPH11212582A (ja) * 1998-01-27 1999-08-06 Daiichikosho Co Ltd 振り付け採点機能を有するカラオケ装置
JP2000037490A (ja) * 1998-07-24 2000-02-08 Konami Co Ltd ダンスゲーム装置
KR20000054349A (ko) * 2000-06-02 2000-09-05 김용환 자유 스텝이 가능한 3차원 댄스 시뮬레이터

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
KR20010027314A (ko) * 1999-09-13 2001-04-06 윤종용 율동 채점 기능을 구비하는 노래방 장치 및 그 방법
KR20000024237A (ko) * 2000-01-31 2000-05-06 김완호 댄스 평가 및 지도 기능을 갖는 노래 반주 시스템 및 방법


Cited By (13)

Publication number Priority date Publication date Assignee Title
WO2004007034A1 (fr) * 2002-07-12 2004-01-22 Awaba Group Pty Ltd Dispositif d'apprentissage de la danse
US8690670B2 (en) 2007-06-14 2014-04-08 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US10421013B2 (en) 2009-10-27 2019-09-24 Harmonix Music Systems, Inc. Gesture-based user interface
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US8874243B2 (en) 2010-03-16 2014-10-28 Harmonix Music Systems, Inc. Simulating musical instruments
US9278286B2 (en) 2010-03-16 2016-03-08 Harmonix Music Systems, Inc. Simulating musical instruments
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
GB2495551B (en) * 2011-10-14 2014-04-09 Sony Comp Entertainment Europe Motion scoring method and apparatus
US10086283B2 (en) 2011-10-14 2018-10-02 Sony Interactive Entertainment Europe Limited Motion scoring method and apparatus
GB2495551A (en) * 2011-10-14 2013-04-17 Sony Comp Entertainment Europe A motion comparison arrangement with variable error tolerance
EP2581121B1 (fr) * 2011-10-14 2023-03-15 Sony Interactive Entertainment Europe Limited Appareil, programme d'ordinateur et procédé d'évaluation de mouvement

Also Published As

Publication number Publication date
KR20020028578A (ko) 2002-04-17
KR100412932B1 (ko) 2003-12-31
AU2001294329A1 (en) 2002-04-22

Similar Documents

Publication Publication Date Title
EP1324269B1 (fr) Appareil et procédé de traitement d'image, support d'enregistrement, programme informatique et dispositif a semiconducteurs
WO2002030535A1 (fr) Procédé permettant d'afficher et d'évaluer des données de mouvement utilisable dans un appareil de jeu d'action
JP7127659B2 (ja) 情報処理装置、仮想・現実合成システム、学習済みモデルの生成方法、情報処理装置に実行させる方法、プログラム
JPH1186004A (ja) 移動体追跡装置
JP2009163639A (ja) オブジェクト軌道識別装置、オブジェクト軌道識別方法、及びオブジェクト軌道識別プログラム
WO2016021121A1 (fr) Procédé et dispositif de correction et de vérification
KR101124560B1 (ko) 동영상 내의 자동 객체화 방법 및 객체 서비스 저작 장치
CN113743237A (zh) 跟随动作的准确度判定方法、装置、电子设备及存储介质
JP4555690B2 (ja) 軌跡付加映像生成装置及び軌跡付加映像生成プログラム
JP2020126383A (ja) 動体検出装置、動体検出方法、動体検出プログラム
KR20020011851A (ko) 인공시각과 패턴인식을 이용한 체감형 게임 장치 및 방법.
KR20010107478A (ko) 모션 게임 장치
JP7059701B2 (ja) 推定装置、推定方法、及び推定プログラム
JP7533765B2 (ja) 骨格認識方法、骨格認識プログラムおよび体操採点支援システム
JP7500333B2 (ja) 生成装置、生成方法、およびプログラム
KR200239844Y1 (ko) 인공시각과 패턴인식을 이용한 체감형 게임 장치.
KR100607046B1 (ko) 체감형 게임용 화상처리 방법 및 이를 이용한 게임 방법
KR101892514B1 (ko) 좌표값을 이용한 당구게임의 득점 자동계산 시스템
JP7697581B2 (ja) 画像処理装置、画像処理方法、およびプログラム
CN100359437C (zh) 交互式影像游戏系统
JP7676123B2 (ja) トラッキング装置
JP2006146823A (ja) 映像オブジェクト軌跡付加装置及び映像オブジェクト軌跡付加プログラム
JP7708226B2 (ja) 画像処理装置、画像処理方法、およびプログラム
JP2006279890A (ja) 相関追尾方法および相関追尾装置
JP4615252B2 (ja) 画像処理装置、画像処理方法、記録媒体、コンピュータプログラム、半導体デバイス

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PH PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP