GB2482729A - An augmented reality musical instrument simulation system
- Publication number
- GB2482729A GB2482729A GB1013621.6A GB201013621A GB2482729A GB 2482729 A GB2482729 A GB 2482729A GB 201013621 A GB201013621 A GB 201013621A GB 2482729 A GB2482729 A GB 2482729A
- Authority
- GB
- United Kingdom
- Prior art keywords
- user
- response
- musical instrument
- track
- playing
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/814—Musical performances, e.g. by evaluating the player's ability to follow a notation
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
- A63F13/65—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor automatically by game devices or servers from real world data, e.g. measurement in live racing competition
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B15/00—Teaching music
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/32—Constructional details
- G10H1/34—Switch arrangements, e.g. keyboards or mechanical switches specially adapted for electrophonic musical instruments
- G10H1/342—Switch arrangements, e.g. keyboards or mechanical switches specially adapted for electrophonic musical instruments for guitar-like instruments with or without strings and with a neck on which switches or string-fret contacts are used to detect the notes being played
- G10H1/36—Accompaniment arrangements
- G10H1/361—Recording/reproducing of accompaniment for use with an external source, e.g. karaoke systems
- G10H1/368—Recording/reproducing of accompaniment for use with an external source, e.g. karaoke systems displaying animated or moving pictures synchronized with the music or audio part
- G10H7/00—Instruments in which the tones are synthesised from a data store, e.g. computer organs
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/10—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
- A63F2300/1087—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8047—Music games
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/021—Background music, e.g. for video sequences or elevator music
- G10H2210/026—Background music, e.g. for video sequences or elevator music for games, e.g. videogames
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/005—Non-interactive screen display of musical or status data
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/135—Musical aspects of games or videogames; Musical instrument-shaped game input interfaces
- G10H2220/155—User input interfaces for electrophonic musical instruments
- G10H2220/201—User input interfaces for electrophonic musical instruments for movement interpretation, i.e. capturing and recognizing a gesture or a specific kind of movement, e.g. to control a musical instrument
- G10H2220/321—Garment sensors, i.e. musical control means with trigger surfaces or joint angle sensors, worn as a garment by the player, e.g. bracelet, intelligent clothing
- G10H2220/401—3D sensing, i.e. three-dimensional (x, y, z) position or movement sensing
- G10H2220/441—Image sensing, i.e. capturing images or optical patterns for musical purposes or musical control purposes
- G10H2220/455—Camera input, e.g. analyzing pictures from a video camera and using the analysis results as control data
- G10H2230/00—General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
- G10H2230/045—Special instrument [spint], i.e. mimicking the ergonomy, shape, sound or other characteristic of a specific acoustic musical instrument category
- G10H2230/065—Spint piano, i.e. mimicking acoustic musical instruments with piano, cembalo or spinet features, e.g. with piano-like keyboard; Electrophonic aspects of piano-like acoustic keyboard instruments; MIDI-like control therefor
- G10H2230/075—Spint stringed, i.e. mimicking stringed instrument features, electrophonic aspects of acoustic stringed musical instruments without keyboard; MIDI-like control therefor
- G10H2230/135—Spint guitar, i.e. guitar-like instruments in which the sound is not generated by vibrating strings, e.g. guitar-shaped game interfaces
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- General Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- User Interface Of Digital Computer (AREA)
- Electrophonic Musical Instruments (AREA)
- Processing Or Creating Images (AREA)
Abstract
An augmented reality system (5) enables a user to simulate playing a musical instrument such as a guitar or keyboard. The system (5) comprises visual tracking means, such as a camera or optical sensor, in conjunction with object recognition or tracking software. The visual tracking means detects movement, manipulation or occlusion of the object 1 by the user. Preferably the object 1 is a portion of the user's body, or a graphical item carried by the user or displayed on a garment such as a t-shirt 2. The system (5) further comprises response selection means, such as augmented reality software, to determine a response based upon the movement, manipulation or occlusion detected by the tracking means; and output means, which may be visual or audible, to generate an output corresponding to the response and relating to the user's simulated playing of the musical instrument. The user's simulated performance may be recorded for subsequent storage, review or playback.
Description
INTELLECTUAL PROPERTY OFFICE
Application No. GB1013621.6 RTM Date: 8 November 2010
The following terms are registered trademarks and should be read as such wherever they occur in this document: Adobe
Intellectual Property Office is an operating name of the Patent Office www.ipo.gov.uk
An Augmented Reality System
The present invention relates generally to augmented reality systems, and in particular to augmented reality systems which enable a user to simulate playing a musical instrument.
Augmented reality (AR) systems consist of computer-related technology and input/output devices which work together to enhance a user's perception of reality. A typical AR system may comprise a display device (such as a screen), a tracking/viewing device (such as a camera), an input device (such as a controller, wand, pointing device etc) and computer comprising a processor running the AR application software.
Some music-related AR applications have been developed, typically in relation to music videos produced by bands to accompany musical releases. In one known example, a user is able to generate his own music video of a music band's performance. The user is able to print out symbols or graphical representations (known as 'markers') from the band's web site, each marker representing a pre-recorded performance of a music track by a particular band member. The user is then able to manipulate the band's performance of the song in real time via his computer web cam. As the user manipulates (e.g. moves) the real-world marker, this manipulation is detected by the web cam and the corresponding band member's performance is manipulated on the screen. The user may record the interaction for future play-back and review. However, whilst this application allows the user to manipulate pre-recorded musical performances by other people, it does not enable the user to produce (or at least simulate the production of) his own musical compositions.
An example of a system which enables the user to simulate playing a musical instrument is the Guitar Hero® game, which enables players to use a guitar-shaped controller (input device) to simulate playing the lead, bass or rhythm guitar. The controller is provided with coloured "fret buttons" and a "strum bar" corresponding to the frets and strings found on a 'real' guitar. The user's manual contact with the buttons and strum bar is used as input to the game software. As the system plays a musical track, notes scroll across the bottom of the screen. In order to score points, the player must match the scrolling notes to the fret buttons and strum the strum bar in time to the music. However, without the guitar-shaped controller, the user is not able to interact with the game. Such controllers may be expensive, and thus the cost of the controller may be prohibitive to some users. Alternatively, the controller may be lost or may malfunction, and therefore the user can no longer interact with the game. Dependence upon the controller device is, therefore, a drawback of such a system.
A solution has now been devised which enables a user to play (or at least simulate the playing of) a musical instrument without the need for manipulation of a physical controller or input device.
Thus, in accordance with a first aspect of the present invention, there is provided an augmented reality system arranged to enable a user to simulate playing a musical instrument, the system comprising visual tracking means configured to visually track an object and detect movement, manipulation or occlusion of the object by the user; response selection means configured to determine a response based upon said movement, manipulation or occlusion detected by the tracking means; and output means configured to generate an output corresponding to said response and relating to the user's simulated playing of the musical instrument.
The object may be a portion of the user's body, such as a hand or arm.
Alternatively, the object may be a graphically represented item. Preferably, the object is a high contrast marker such as a black and white grid consisting of 8x8 squares. However, the graphical item may be a representation of a musical instrument or part of a musical instrument. Alternatively, the object may be a company logo, abstract artwork, geometric symbol or any other indicia. The object may be provided on or in a garment or item of clothing worn or carried by the user. For example, the garment may be a T-shirt printed with a picture of a guitar and/or a 'pick-up' marker. The invention is not intended to be limited in regard to the shape or type of object visually tracked by the system.
Preferably, the visual tracking means may comprise a camera or other optical means configured to visually monitor or track an object. Preferably, the visual tracking means tracks the object by generating an image of the scene and detecting its presence, absence, orientation and/or location within that scene. Preferably, a digital representation (such as a bitmap) is generated corresponding to the scene.
Preferably, the camera is a web cam, digital camera or optical sensor in communication with a computer processor and/or other system components. The camera may be configured to repeatedly refresh the view or image of the object (for example, at a refresh rate of 25 frames per second).
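The shape of such a repeatedly-refreshing capture loop can be sketched as follows. This is a minimal illustrative simulation, not the patent's implementation: `grab_frame` and the list-based camera source are hypothetical stand-ins for a real camera driver, which would block on the device's actual refresh rate.

```python
def grab_frame(camera_source):
    # Hypothetical stand-in for a camera driver call that returns the
    # next bitmap frame, or None when no frame is available.
    return camera_source.pop(0) if camera_source else None

def capture_loop(camera_source, refresh_rate_hz=25, max_frames=100):
    """Poll the camera at its refresh rate, collecting bitmap frames.

    At 25 frames per second the interval between frames is 40 ms.
    """
    interval = 1.0 / refresh_rate_hz  # 0.04 s at 25 fps
    frames = []
    while len(frames) < max_frames:
        frame = grab_frame(camera_source)
        if frame is None:
            break
        frames.append(frame)
        # A real loop would wait here: time.sleep(interval)
    return frames
```

Each collected frame would then be handed to the object recognition component described below.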
Preferably, the visual tracking means also comprises object recognition means arranged and configured to detect and track the object being viewed by the camera. Preferably, the recognition means is a software component capable of tracking a physical marker in real time. The software may be proprietary software created and designed specifically for use with the invention. Alternatively, the software may be an off-the-shelf product, such as ARToolkit, comprising video tracking libraries.
Preferably, the visual tracking means is configured to notify the response selection means that an object-related event has occurred.
Detection of movement, manipulation or occlusion of the object may cause the tracking means to generate an interrupt signal when an event occurs. For example, if the object is recognised as being present within an image taken by the camera, no interrupt is generated.
However, if the object has not been recognised in one or more images, an interrupt may be generated to inform the response selection means that an event (manipulation/movement/occlusion) has occurred.
The visual tracking means is configured to track the object and detect the occurrence of any change caused by the user's manipulation, movement or occlusion of the object. For example, the user may reposition or move the object. Alternatively, the change may occur because the object has been completely or partially obscured behind another object. For example, the user may place his hand between the object and the camera such that the object is completely or partially hidden from view.
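One way to realise this per-frame change detection is sketched below. The frame representation and the `marker_visible` check are illustrative assumptions; a real system would delegate recognition to the tracking software rather than a substring test.

```python
def marker_visible(frame, marker_pattern):
    # Hypothetical recognition step: a real system would ask the
    # object-recognition software whether the marker appears in the frame.
    return marker_pattern in frame

def detect_events(frames, marker_pattern):
    """Compare marker visibility between successive frames and report an
    'occlusion' event whenever the marker disappears from view, and a
    'reappearance' event when it returns."""
    events = []
    previously_visible = True
    for i, frame in enumerate(frames):
        visible = marker_visible(frame, marker_pattern)
        if previously_visible and not visible:
            events.append(('occlusion', i))      # marker just vanished
        elif not previously_visible and visible:
            events.append(('reappearance', i))   # marker back in view
        previously_visible = visible
    return events
```

An 'occlusion' event here corresponds to the user's hand (or another object) being placed between the marker and the camera.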
Preferably, the response selection means is a software component configured to select an appropriate response based upon input received from the visual tracking means. The response selection means may be a software component, such as an event handler, containing code which specifies the course or courses of action to be taken by the system in response to a particular event. For example, if a sufficient, pre-determined amount of the object has been occluded, the appropriate response may be to play a sound.
Preferably, the response selection means is arranged and configured to select an appropriate response from a plurality of possible responses. For example, the response may be either to play a musical note or to produce a non-musical sound.
In some embodiments, the event handler may be provided with information regarding the change (e.g. whether the object has been fully occluded, partially occluded, moved in an upwards direction, rotated etc). This information may be used to select an appropriate response to the particular object-related change which has occurred.
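Such an event handler can be sketched as a simple lookup from event type to response. The event names and responses below are illustrative assumptions only; the patent does not specify a concrete mapping.

```python
# Illustrative mapping from object-related events to system responses.
RESPONSES = {
    'fully_occluded':     'play_note',          # e.g. a guitar strum
    'partially_occluded': 'play_note_quietly',
    'moved_up':           'pitch_bend_up',
    'rotated':            'ignore',
}

def handle_event(event_type):
    """Select a response for an object-related event; unknown event
    types fall through to a no-op."""
    return RESPONSES.get(event_type, 'ignore')
```

The selected response would then be communicated to the output means (e.g. a speaker).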
Preferably, the response selection is made in relation to a pre-determined piece of music.
For example, if the user obscures the object at a pre-determined point during the performance of the musical piece, the system may play the musical note corresponding to the correct note in the piece at that point. In this manner, the system may enable the user to simulate strumming a guitar or pressing a key on a piano, for example.
Preferably, the response selected by the response selection means is communicated to the output means and/or some other system component.
Preferably, the output means is a device configured to generate an audio and/or visual output. Preferably, the output device is a speaker, screen or projector. In some embodiments, more than one output device may be provided (e.g. speaker, projector and a screen, or combination thereof).
The augmented reality system may be configured to operate only when the object corresponds to a predetermined type or configuration. Thus, operation of the system may be controlled by determining the presence of a predetermined object type or configuration.
For example, the user may only be permitted to use the system if a company logo or other given symbol has been detected in the scene viewed by the camera. Alternatively, the system may be configured to recognise whether the object is a picture of a guitar, or the user's hand, or a geometric marker etc. Thus, the object can function as a key or password which enables or prohibits use of or access to the system.
The user's manipulation, occlusion or movement of the object may be assessed in relation to a pre-determined set of criteria to provide an evaluation of the user's simulated playing of the musical instrument. The pre-determined set of criteria may be a set of 'cues' placed in a track of music at pre-determined points. The user's simulated performance may then be assessed in relation to the cue points to determine whether the user has 'played' the instrument at the correct time. The cue points may be stored in digital format. The cue points may be stored in an XML database. Thus, the user's musical timing can be assessed.
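A timing assessment of this kind might be sketched as follows. The tolerance window and the representation of cue points as times in seconds are assumptions for illustration; the patent does not specify them. The same comparison also yields the count of correct and incorrect notes used for scoring.

```python
def score_performance(strum_times, cue_points, tolerance=0.1):
    """Count a strum as 'correct' when it falls within `tolerance`
    seconds of an unused cue point; otherwise it is a 'bum note'.
    All times are in seconds from the start of the track."""
    remaining = list(cue_points)
    correct, incorrect = 0, 0
    for t in strum_times:
        hit = next((c for c in remaining if abs(c - t) <= tolerance), None)
        if hit is not None:
            remaining.remove(hit)  # each cue point may be matched once
            correct += 1
        else:
            incorrect += 1         # no nearby cue: a 'bum note'
    return correct, incorrect
```

In practice the cue points would be loaded from the XML database mentioned above rather than passed in as a list.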
The system may comprise means for playing a 'base track'. The base track may be a track or piece of music which the user is to accompany. The sound of a particular instrument may have been removed from the base track such that it can be overlaid separately on the base track during use of the AR system. For example, the guitar track may have been removed from a base track. The base and/or instrument tracks may be stored in digital format, such as in MP3 files. This enables the base and instrument tracks to be played independently of each other during the user's interaction with the AR system.
Preferably, during use, the base and the instrument tracks are played simultaneously, in sync. Preferably, the volume of the instrument track is reduced (preferably to zero) until the user simulates playing of the instrument by occlusion, movement or manipulation of the object. When the user creates the object-related event, the volume of the instrument track may be increased such that it appears that the user has played the instrument. For example, if the user has put his hand in front of the marker, the system may detect this event and interpret it as a 'guitar strum', increasing the volume of the guitar track. Thus, it may appear that the user is playing the guitar accompaniment to the base track.
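The volume-gating behaviour described above can be sketched as a small mixer object. Audio playback itself is simulated; a real system would drive an audio API, and the class and method names here are hypothetical.

```python
class TrackMixer:
    """Play the base and instrument tracks in sync; the instrument
    track's volume stays at zero until a strum event raises it, so the
    instrument is heard only while the user 'plays'."""

    def __init__(self):
        self.base_volume = 1.0        # base track always audible
        self.instrument_volume = 0.0  # muted until the user strums

    def on_strum(self):
        # Object-related event detected: make it appear the user
        # played the instrument.
        self.instrument_volume = 1.0

    def on_strum_end(self):
        # Event over: mute the instrument track again.
        self.instrument_volume = 0.0
```

Because both tracks run continuously in sync, raising the volume mid-track reveals the instrument part at the musically correct position.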
This provides the advantage that the user can simulate playing a musical instrument without having to use a controller device. For example, the user can simply make a guitar strumming motion with his hand. This motion will be detected by the visual tracking means (e.g. camera plus tracking software) and a corresponding musical performance will be produced. In effect, the computer allows the user to produce music by playing 'air guitar'.
In some embodiments, if the user 'plays' incorrectly then an error indication may be output by the system. For example, if the user 'strums' at a point which does not correspond with a cue point in the base track, then a 'bum note' or other audible or visual indicator may be produced. The bum notes or error indicators may be saved as an 'error track' in digital format, such as in an MP3 file.
The system may comprise means for quantifying and/or recording the number of 'correct' and/or 'incorrect' notes or strokes that the user plays. Thus, the system may maintain a score related to the user's musical performance.
Preferably, a user interface is provided to enable the user to alter variables relating to the user's interaction with the system. For example, a 'start' icon may be provided on the screen.
The system may further comprise a prompt component for prompting or advising the user how and/or when to simulate playing the musical instrument. This prompt component may be a visual, graphical prompt displayed at the bottom of a screen. The prompt may be a scrolling prompt. The prompt component may prompt the user to 'play' the instrument in time with the cue points.
In some embodiments of the invention, the system may comprise means for recording the user's interactive session with the system. The recorded interactive session may be configured for subsequent play-back. The recording may be handled by and/or stored on a media server.
Also in accordance with the present invention, there is provided a method of simulating a user's playing of a musical instrument, the method comprising the steps: i) visually tracking an object to detect manipulation, movement or occlusion of the object by the user; ii) determining a response based upon the detected manipulation, movement or occlusion; and iii) generating an output corresponding to the response and relating to the user's simulated playing of the musical instrument.
The method may further comprise the step of generating a digital representation of the object using a camera or optical sensor. Additionally or alternatively, tracking of the object may be performed by a camera. The camera may be a video camera or web cam or digital camera.
Determination of the response may be performed by a software-based event handler running on a processor. The object may be provided on or in an item of clothing worn or carried by the user.
According to another aspect of the invention, an item of clothing is provided carrying or bearing an object to be visually tracked by an augmented reality system according to the invention as described above.
Preferred features in relation to these aspects may be in line with the preferred features of the first aspect of the invention.
These and other aspects of the present invention will be apparent from, and elucidated with reference to, the embodiment described herein.
An embodiment of the present invention will now be described by way of example only and with reference to the accompanying drawings, in which: Figures 1a and 1b show a flowchart depicting a typical user-system interaction in accordance with a preferred embodiment of the invention.
Figure 2 shows a user log-in screen of a system in accordance with an exemplary embodiment of the invention.
Figures 3 to 7 show screen shots of a system in accordance with an exemplary embodiment of the invention.
Figure 8 shows an example of an object suitable for use with the present invention.
A preferred embodiment of the present invention comprises a visual tracking means (web cam), a computer and output device (speaker(s) and a screen). The computer comprises memory and a processor configured to store and execute the software components of the invention (response selection means, user interface etc.).
The camera is configured to visually track a pre-determined object. In the preferred embodiment, the object 1 is a high contrast marker such as a black and white grid consisting of 8x8 squares. An example of such an object is shown in Figure 8. The camera views the environment (the 'scene') and takes a snapshot of what it 'sees', producing an image of that scene in digital format, such as a bitmap.
The video imagery produced by the camera is analysed by an object recognition means.
The object recognition is performed by at least one software component which assesses whether or not the object is present, and/or assesses other object-related properties such as the object's position or orientation etc. The inventors have found that the ARToolKit software library is suitable for this purpose. The ARToolKit video tracking libraries enable calculation of the camera position and orientation relative to the physical marker 1 in real time. The toolkit libraries include code to enable various functionalities, including camera position/orientation tracking, tracking of black squares or any square marker patterns, and camera calibration code.
If the vision algorithms within the recognition software determine that the marker 1 is present within an image, a flag is set within the system code to indicate this fact. The camera continues to track the marker 1, repeatedly refreshing the image at a frequency according to its refresh rate.
This process continues until an object-related event occurs. An object-related event occurs when the user manipulates, moves or occludes the marker 1. For example, the user might put his hand between the camera and the marker 1 so that the marker is at least partially shielded from the camera. When this event is detected, an interrupt is generated and sent to the response selection means. The response selection means comprises event handling code executing on the processor. The interrupt may be generated as soon as the tracking means is unable to recognise the marker within a single image, or when it has failed to recognise it in a pre-determined number of sequential images. In this way, the marker is visually tracked or monitored. The system may be configured to detect and respond to any type of change in the marker's position, presence or orientation etc., although the present embodiment responds to detection of the marker's presence or absence/occlusion.
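The interrupt-generation logic described above, where an interrupt fires only after the marker has gone unrecognised in a pre-determined number of sequential images, can be sketched as follows. This is an illustrative sketch only: the class and parameter names are assumptions, and per-frame marker recognition is abstracted to a boolean rather than implemented with a vision library.

```python
class MarkerMonitor:
    """Tracks marker visibility frame by frame and generates an
    interrupt (returns True) on the frame at which the marker has
    been missing for `missed_frames_threshold` sequential images."""

    def __init__(self, missed_frames_threshold=3):
        self.threshold = missed_frames_threshold
        self.missed = 0  # count of consecutive frames without the marker

    def update(self, marker_visible):
        """Call once per camera frame; returns True only on the frame
        that crosses the occlusion threshold, mirroring a one-shot
        interrupt rather than a level-triggered signal."""
        if marker_visible:
            self.missed = 0
            return False
        self.missed += 1
        return self.missed == self.threshold
```

Requiring several consecutive missed frames debounces momentary recognition failures (e.g. motion blur) so that only a deliberate occlusion, such as a strumming hand, triggers the event handler.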
When the event handling code is triggered by the interrupt, the event handling code determines how to respond to the event that has occurred and been detected. For example, the response may be to play a musical note when the marker 1 is hidden from view. This may be achieved by having a pre-determined music track playing at zero-volume whilst the user is using the system, the volume being increased when the user obscures the marker 1.
Alternatively or additionally, the response may be to flash a light or display some information on the screen. Thus, the output devices (speakers/screen) are used to convey the response to the user.
Thus, in the preferred embodiment, a musical output is produced when the user hides the marker from the camera, thus simulating strumming of a guitar.
The event may be assessed in relation to pre-determined criteria. For example, a pre-determined track of music may be used to assess the user's performance. Specific ('cue') points may be determined on the track of music such that the user is required to 'strum' only at those cue points. If the user strums at a correct point in the track, the correct note is played by the system for that point in the track.
The track may be split such that the guitar track is separated out, leaving a 'base track'.
Both the base track and the guitar track are played simultaneously and in-synch, the guitar track being played by default at zero volume. When the user strums at a pre-designated cue point in the base track, the volume of the guitar track is raised such that the user can hear it. If, however, the user strums at a point in the base track other than a designated cue point, a 'bum note' or other incongruous sound may be played. These sounds may be stored in an 'error track'.
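The cue-point check described above reduces to deciding, for each strum, whether it falls close enough to a designated cue point. A minimal sketch follows; the function name, the tolerance value and the returned track labels are illustrative assumptions, not taken from the patent text.

```python
def select_response(strum_time, cue_points, tolerance=0.15):
    """Decide which track to sound for a strum at strum_time (seconds
    into the base track): the guitar track if the strum falls within
    `tolerance` seconds of any cue point, otherwise the error track
    (the 'bum note')."""
    if any(abs(strum_time - cue) <= tolerance for cue in cue_points):
        return "guitar"
    return "error"
```

A tolerance window is needed in practice because the camera's refresh rate quantises when an occlusion can be detected, so an exact match to a cue timestamp would almost never occur.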
Thus, the system enables the user to simulate playing the guitar accompaniment to a track of music.
A typical interaction or session between the user and system is shown in Figure 1. The user starts the application 3 and signs in 4 to the system using a screen as shown in Figure 2.
Upon correct log in, system data is loaded and the user may be presented with information such as a tutorial on how to use the system 5, or promotional material. Figure 3 shows a screen shot of a prompt instructing the user to put on a T-shirt 2 printed with the pick-up marker 1.
The web cam is configured for use 6, and the various system components are loaded.
These include the base track, instrument (guitar) track, error track and cue points 7. The cue points (i.e. timings for the strumming) are drawn from an XML database. Figure 4 shows a screen shot of the system in use after the user has logged in, the camera has been configured ready for use and the system components have been loaded 8. The user is able to change settings via a 'settings' tab on the screen. This enables the user to alter variables pertaining to the user's interaction with the system, thus providing the user with a degree of control and flexibility over how the interaction is to be conducted.
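The patent does not specify the schema of the XML database from which the cue points are drawn. The sketch below shows how such timings could be parsed with Python's standard `xml.etree.ElementTree` module, assuming a simple hypothetical layout of `<cues>` containing `<cue time="..."/>` elements; both the element and attribute names are assumptions.

```python
import xml.etree.ElementTree as ET

def load_cue_points(xml_text):
    """Parse strum cue timings (seconds) from an XML document of the
    assumed form <cues><cue time="0.5"/>...</cues>, returning them as
    a list of floats for comparison against interrupt times."""
    root = ET.fromstring(xml_text)
    return [float(cue.get("time")) for cue in root.findall("cue")]
```

Storing cue points in XML rather than hard-coding them lets the same event-handling code drive any song: only the base/guitar track audio and the cue XML need to change.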
When the user initiates a performance session 9, the user is presented with the option of recording the interactive session 10a, 10b. Figures 5 and 6 show screen shots of the system in use, wherein the user is offered the possibility of recording the performance. The recording facility is discussed in more detail below. In either event, the user begins the session by choosing a 'play' option 11.
The system begins to simultaneously play the base track and instrument (guitar) track, and the cue points are accessed from the XML database 12. As the base and guitar tracks are started at the same time, they are in-synch. The guitar track, however, is played at 0% volume (or thereabouts).
The web cam and vision software track the presence of the marker 1, listening for interrupts 13. Graphics flow across the screen in time to the base track, prompting the user when to strum according to the cue points drawn from the XML database.
When the marker 1 is recognised, no response is required and no interrupt is generated.
However, when the user makes a guitar-strumming motion, his hand moves in front of the marker 1, thus occluding it so that it is not seen by the camera. When the software detects that a sufficiently large portion of the marker (e.g. at least 50% of the marker) has been occluded, an interrupt is generated 14. The event handler then compares the interrupt time to the cue points 15, to determine whether the user has strummed at (or within an acceptable proximity to) a cue point. If the answer is 'yes' 16a, and the user has strummed at the 'correct' moment, the volume of the guitar track is increased to play the guitar track for a period of time 16b. If the answer is 'no' 17a, then the error track (e.g. bum note) is played 17b. In either event, the user's action is responded to in an appropriate manner. The system then returns to listening mode 18, monitoring the object and listening for an event.
When the base track (and corresponding instrument track) is finished, the session terminates 19.
If the user opted to record the session 10a, then input received from the web cam is recorded during the user's session. The web cam recording begins at the same time as, or shortly before, the base/guitar tracks begin to play 20a. All interrupts (i.e. strums) generated during the user's session are recorded and stored on the application server 20b.
The recording terminates when the session terminates 20c. The recorded web cam footage is then saved on a media server store (discussed below) 20d.
A score may be maintained to record how well the user has performed (e.g. how many correct and/or incorrect strokes the user has made). The screen may also be updated to reflect whether or not the user has strummed at the correct point.
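Maintaining such a score amounts to tallying, over the session's recorded strum times, how many fell within the acceptable proximity of a cue point. A minimal sketch, with illustrative names and tolerance (the patent does not define a scoring formula):

```python
def score_session(strum_times, cue_points, tolerance=0.15):
    """Count correct and incorrect strokes for a completed session:
    a strum is 'correct' if it lies within `tolerance` seconds of
    any cue point, otherwise 'incorrect'."""
    correct = sum(
        1 for t in strum_times
        if any(abs(t - c) <= tolerance for c in cue_points)
    )
    return {"correct": correct, "incorrect": len(strum_times) - correct}
```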
A media server is used to enable the recording of the user's webcam at a predefined quality level. The webcam footage is stored on the media server and can be streamed for viewing whenever necessary. Once the AR application connects to the media server, the server provides the application with access to all recorded webcam footage as well as the ability to record webcam footage to the media server.
The recorded sessions are available for viewing via a 'video wall'. The user can choose a system option which will show the video wall 21. If the user chooses this option, all saved footage will be loaded. The user is able to click on a particular video to play and view 23.
Figure 7 shows a screen shot wherein the user is permitted to select a previously recorded performance (by himself or someone else) to watch from the video wall. Upon choosing a particular pre-recorded video, a player component is loaded, along with the video and the interrupt XML 24. The video, base track, guitar track and interrupt XML are all played 25.
Thus, the footage recorded during the chosen session is streamed from the media server and combined with the base music track and the instrument (e.g. guitar) track as well as the related XML. Both the audio tracks and the webcam footage are played in-sync, the instrument track's volume then being controlled by the AR application according to the session's XML to create an accurate representation of the user's session.
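Because only the interrupt times are stored, playback must reconstruct the instrument track's volume changes from that list. One way to sketch this is as a volume envelope sampled at fixed intervals, raised for a short window after each recorded strum; the function name, sampling step and boost duration below are all illustrative assumptions.

```python
def volume_envelope(interrupts, duration, step=0.1, boost_len=0.5):
    """Reconstruct the instrument track's volume (0.0 or 1.0) over a
    session of `duration` seconds, sampled every `step` seconds: the
    track is audible for `boost_len` seconds after each recorded
    interrupt (strum) time, and silent otherwise."""
    n = round(duration / step)  # number of samples
    envelope = []
    for i in range(n):
        t = i * step
        loud = any(0 <= t - s < boost_len for s in interrupts)
        envelope.append(1.0 if loud else 0.0)
    return envelope
```

Driving playback from the interrupt log in this way keeps the stored session small: the media server holds the webcam footage, while the audio is re-mixed live from the original base and guitar tracks.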
Adobe Flash Media Interactive Server has been found to be suitable for this purpose, although other commercial or proprietary technology may be used.
It should be noted that whilst the embodiment described above uses an instrument track to produce the simulation of playing the guitar, the invention is not intended to be limited in these regards. Instead, the system may generate music on the fly rather than increasing the volume on a pre-recorded track. The music may be generated in response to the user's movement, manipulation or occlusion of the object 1. Also, the instrument being simulated may not be a guitar. For example, the user may be wearing a T-shirt with a keyboard or trumpet printed on it, or holding a card shaped like a maraca. When the user presses a particular key on the picture or moves the card, the corresponding musical note or sound would be played by the system. Alternatively, the user may raise or reposition the picture of the instrument to increase the volume and/or tone of the musical note played by the system. Such an embodiment may be arranged and/or adapted to simulate playing a trombone, for example. Percussion instruments and stringed instruments may also be simulated by varying the object and/or tracking functionalities of the system.
Thus, the present invention uses computer vision tracking technology to permit the user to interact with the AR system.
Thus, the present invention provides a novel system to enable a user to simulate playing a musical instrument without the need to use a physical controller to direct the system.
Instead, the visual recognition and/or tracking of an object is used in place of the physical manipulation of a controller/input device. This provides the advantage that the user does not have to be in possession of a controller. This provides a more flexible and cost-effective system.
The invention can be used for entertainment purposes, such as game playing, or for teaching the user how to play a musical instrument. The invention is not intended to be limited in respect of the type of instrument being simulated.
Another advantage provided by the system is that an item of clothing (or some other article/item) can be used to trigger a sound and, in particular, a musical sound. The sound may be controlled or manipulated by the item of clothing (for example, the volume of the sound may be altered, or the pitch of the note altered, or the speed at which a tune is played may be varied). Thus, a garment may become part of a musical production or perform the function of a musical instrument. This provides a novel, interesting and entertaining use of AR technology.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be capable of designing many alternative embodiments without departing from the scope of the invention as defined by the appended claims. In the claims, any reference signs placed in parentheses shall not be construed as limiting the claims. The words "comprising" and "comprises", and the like, do not exclude the presence of elements or steps other than those listed in any claim or the specification as a whole. In the present specification, "comprises" means "includes or consists of" and "comprising" means "including or consisting of". The singular reference of an element does not exclude the plural reference of such elements and vice-versa. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a device claim enumerating several means, several of these means may be embodied by one and the same item of hardware.
The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
Claims (4)
- CLAIMS: 1. An augmented reality system arranged to enable a user to simulate playing a musical instrument, the system comprising visual tracking means configured to visually track an object and detect movement, manipulation or occlusion of the object by the user; response selection means configured to determine a response based upon the movement, manipulation or occlusion detected by the tracking means; and output means configured to generate an output corresponding to said response and relating to the user's simulated playing of the musical instrument.
- 2. A system according to claim 1 wherein the object is a portion of the user's body or a graphical item.
- 3. A system according to claim 2 wherein the object is a graphical item provided on or in a garment worn or carried by the user.
- 4. A system according to any preceding claim wherein operation of the system is permitted only when the object is recognised as corresponding to a pre-determined type or configuration.
- 6. A system according to any preceding claim wherein the tracking means comprises a camera or optical sensor.
- 7. A system according to any preceding claim wherein the tracking means comprises object recognition and/or tracking software.
- 8. A system according to any preceding claim wherein the output is an audible and/or visual output.
- 9. A system according to any preceding claim wherein the response selection means is a software-based event handler running on a computer processor.
- 10. A system according to any preceding claim wherein the user's manipulation, movement or occlusion of the object is assessed in relation to a pre-determined set of criteria to provide an evaluation of the user's simulated playing of the musical instrument.
- 11. A system according to claim 10 wherein the pre-determined set of criteria is a set of cue points in a track of music.
- 12. A system according to any preceding claim further comprising a prompt component for prompting or advising the user how and/or when to simulate playing the musical instrument.
- 13. A system according to any preceding claim further comprising means for recording the user's simulation for subsequent storage, review or playback.
- 14. An item of clothing carrying or bearing an object to be visually tracked by an augmented reality system according to any preceding claim.
- 15. An item of clothing according to any preceding claim wherein the augmented reality system is configured to operate only when the object carried or borne by the item of clothing corresponds to a predetermined type or configuration.
- 16. A method of simulating a user's playing of a musical instrument, the method comprising the steps: i) visually tracking an object to detect manipulation, movement or occlusion of the object by the user; ii) determining a response based upon the detected manipulation, movement or occlusion; and iii) generating an output corresponding to the response and relating to the user's simulated playing of the musical instrument.
- 17. The method of claim 16 further comprising the step of generating a digital representation of the object using a camera or optical sensor.
- 18. The method of claims 16 or 17 wherein visual tracking of the object is performed by a camera and/or determination of the response is performed by a software-based event handler running on a processor.
- 19. The method of claims 16 to 18 wherein the object is provided on or in an item of clothing worn or carried by the user.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1013621.6A GB2482729A (en) | 2010-08-13 | 2010-08-13 | An augmented reality musical instrument simulation system |
| PCT/GB2011/051477 WO2012020242A2 (en) | 2010-08-13 | 2011-08-04 | An augmented reality system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1013621.6A GB2482729A (en) | 2010-08-13 | 2010-08-13 | An augmented reality musical instrument simulation system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| GB201013621D0 GB201013621D0 (en) | 2010-09-29 |
| GB2482729A true GB2482729A (en) | 2012-02-15 |
Family
ID=42937953
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB1013621.6A Withdrawn GB2482729A (en) | 2010-08-13 | 2010-08-13 | An augmented reality musical instrument simulation system |
Country Status (2)
| Country | Link |
|---|---|
| GB (1) | GB2482729A (en) |
| WO (1) | WO2012020242A2 (en) |
Cited By (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2496521A (en) * | 2011-11-11 | 2013-05-15 | Fictitious Capital Ltd | Computerised musical instrument using motion capture and analysis |
| EP2972763A4 (en) * | 2013-03-15 | 2017-03-29 | Elwha LLC | Temporal element restoration in augmented reality systems |
| US20230066179A1 (en) * | 2021-09-02 | 2023-03-02 | Snap Inc. | Interactive fashion with music ar |
| US12148108B2 (en) | 2021-10-11 | 2024-11-19 | Snap Inc. | Light and rendering of garments |
| US12367616B2 (en) | 2021-09-09 | 2025-07-22 | Snap Inc. | Controlling interactive fashion based on facial expressions |
| US12380618B2 (en) | 2021-09-13 | 2025-08-05 | Snap Inc. | Controlling interactive fashion based on voice |
| US12412347B2 (en) | 2021-09-30 | 2025-09-09 | Snap Inc. | 3D upper garment tracking |
| US12462507B2 (en) | 2021-09-30 | 2025-11-04 | Snap Inc. | Body normal network light and rendering control |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20130106689A1 (en) * | 2011-10-25 | 2013-05-02 | Kenneth Edward Salsman | Methods of operating systems having optical input devices |
| US10918924B2 (en) | 2015-02-02 | 2021-02-16 | RLT IP Ltd. | Frameworks, devices and methodologies configured to enable delivery of interactive skills training content, including content with multiple selectable expert knowledge variations |
| US10942968B2 (en) | 2015-05-08 | 2021-03-09 | Rlt Ip Ltd | Frameworks, devices and methodologies configured to enable automated categorisation and/or searching of media data based on user performance attributes derived from performance sensor units |
| US10235808B2 (en) | 2015-08-20 | 2019-03-19 | Microsoft Technology Licensing, Llc | Communication system |
| US10169917B2 (en) | 2015-08-20 | 2019-01-01 | Microsoft Technology Licensing, Llc | Augmented reality |
| US9934594B2 (en) | 2015-09-09 | 2018-04-03 | Spell Disain Ltd. | Textile-based augmented reality systems and methods |
| EP3387634B1 (en) * | 2015-12-10 | 2021-02-24 | GN IP Pty Ltd | Frameworks and methodologies configured to enable real-time adaptive delivery of skills training data based on monitoring of user performance via performance monitoring hardware |
| CN111679806A (en) * | 2020-06-10 | 2020-09-18 | 浙江商汤科技开发有限公司 | Play control method and device, electronic equipment and storage medium |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP1160651A1 (en) * | 2000-05-29 | 2001-12-05 | Ein-Gal Moshe | Wireless cursor control |
| US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
| US20030100965A1 (en) * | 1996-07-10 | 2003-05-29 | Sitrick David H. | Electronic music stand performer subsystems and music communication methodologies |
| US20050211080A1 (en) * | 2004-01-20 | 2005-09-29 | Hiromu Ueshima | Image signal generating apparatus, an image signal generating program and an image signal generating method |
| WO2008152644A2 (en) * | 2007-06-12 | 2008-12-18 | Eyecue Vision Technologies Ltd. | System and method for physically interactive music games |
Family Cites Families (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US7772480B2 (en) * | 2007-08-10 | 2010-08-10 | Sonicjam, Inc. | Interactive music training and entertainment system and multimedia role playing game platform |
-
2010
- 2010-08-13 GB GB1013621.6A patent/GB2482729A/en not_active Withdrawn
-
2011
- 2011-08-04 WO PCT/GB2011/051477 patent/WO2012020242A2/en not_active Ceased
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20030100965A1 (en) * | 1996-07-10 | 2003-05-29 | Sitrick David H. | Electronic music stand performer subsystems and music communication methodologies |
| US20020036617A1 (en) * | 1998-08-21 | 2002-03-28 | Timothy R. Pryor | Novel man machine interfaces and applications |
| EP1160651A1 (en) * | 2000-05-29 | 2001-12-05 | Ein-Gal Moshe | Wireless cursor control |
| US20050211080A1 (en) * | 2004-01-20 | 2005-09-29 | Hiromu Ueshima | Image signal generating apparatus, an image signal generating program and an image signal generating method |
| WO2008152644A2 (en) * | 2007-06-12 | 2008-12-18 | Eyecue Vision Technologies Ltd. | System and method for physically interactive music games |
Cited By (11)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB2496521A (en) * | 2011-11-11 | 2013-05-15 | Fictitious Capital Ltd | Computerised musical instrument using motion capture and analysis |
| US9224377B2 (en) | 2011-11-11 | 2015-12-29 | Fictitious Capital Limited | Computerized percussion instrument |
| GB2496521B (en) * | 2011-11-11 | 2019-01-16 | Fictitious Capital Ltd | Computerised percussion instrument |
| EP2972763A4 (en) * | 2013-03-15 | 2017-03-29 | Elwha LLC | Temporal element restoration in augmented reality systems |
| US20230066179A1 (en) * | 2021-09-02 | 2023-03-02 | Snap Inc. | Interactive fashion with music ar |
| US12198664B2 (en) * | 2021-09-02 | 2025-01-14 | Snap Inc. | Interactive fashion with music AR |
| US12367616B2 (en) | 2021-09-09 | 2025-07-22 | Snap Inc. | Controlling interactive fashion based on facial expressions |
| US12380618B2 (en) | 2021-09-13 | 2025-08-05 | Snap Inc. | Controlling interactive fashion based on voice |
| US12412347B2 (en) | 2021-09-30 | 2025-09-09 | Snap Inc. | 3D upper garment tracking |
| US12462507B2 (en) | 2021-09-30 | 2025-11-04 | Snap Inc. | Body normal network light and rendering control |
| US12148108B2 (en) | 2021-10-11 | 2024-11-19 | Snap Inc. | Light and rendering of garments |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2012020242A2 (en) | 2012-02-16 |
| GB201013621D0 (en) | 2010-09-29 |
| WO2012020242A3 (en) | 2012-09-20 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| GB2482729A (en) | An augmented reality musical instrument simulation system | |
| US8445767B2 (en) | Method and system for interactive musical game | |
| US10262642B2 (en) | Augmented reality music composition | |
| KR101701073B1 (en) | Game machine, control method used in same, and recording medium | |
| US8961309B2 (en) | System and method for using a touchscreen as an interface for music-based gameplay | |
| US8858330B2 (en) | Music video game with virtual drums | |
| US7799984B2 (en) | Game for playing and reading musical notation | |
| US20110319160A1 (en) | Systems and Methods for Creating and Delivering Skill-Enhancing Computer Applications | |
| US20120247305A1 (en) | Musical score playing device and musical score playing program | |
| WO2009007512A1 (en) | A gesture-controlled music synthesis system | |
| JP2015060207A (en) | Music data display device, music data presentation method, and program | |
| Fonteles et al. | User experience in a kinect-based conducting system for visualization of musical structure | |
| JP6411412B2 (en) | Program, game apparatus, and game progress method | |
| US20130106689A1 (en) | Methods of operating systems having optical input devices | |
| JP2017211974A (en) | System, method and program for tracking fingers of user | |
| Santini | Composition as an embodied act: A framework for the gesture-based creation of augmented reality action scores | |
| Barbancho et al. | Human–computer interaction and music | |
| JP6623480B2 (en) | Game device and game program | |
| JP5773956B2 (en) | Music performance apparatus, music performance control method, and program | |
| KR101539905B1 (en) | Gaming apparatus, control method used for same and recording medium | |
| WO2014174621A1 (en) | Recording medium, gaming apparatus and game progress method | |
| JP7810053B2 (en) | Information processing system, information processing method, and program | |
| JP2011152334A (en) | Game system, control method and computer programs used for the same | |
| JP2019025346A (en) | Program, game apparatus, and game progress method | |
| GB2641406A (en) | Method and system for generating audio-visual content from video game footage |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |