EP2793222B1 - Method for implementing an automatic music jam session - Google Patents
Method for implementing an automatic music jam session
- Publication number
- EP2793222B1 (application EP13197613.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- audio
- user
- loop
- loops
- instrument
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/0008—Associated control or indicating means
- G10H1/0025—Automatic or semi-automatic music composition, e.g. producing random music, applying rules from music theory or modifying a musical piece
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/38—Chord
- G10H1/383—Chord detection and/or recognition, e.g. for correction, or automatic bass generation
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H1/00—Details of electrophonic musical instruments
- G10H1/36—Accompaniment arrangements
- G10H1/40—Rhythm
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/066—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for pitch analysis as part of wider processing for musical purposes, e.g. transcription, musical performance evaluation; Pitch recognition, e.g. in polyphonic sounds; Estimation or use of missing fundamental
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/031—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal
- G10H2210/081—Musical analysis, i.e. isolation, extraction or identification of musical elements or musical parameters from a raw acoustic signal or from an encoded audio signal for automatic key or tonality recognition, e.g. using musical rules or a knowledge base
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/125—Medley, i.e. linking parts of different musical pieces in one single piece, e.g. sound collage, DJ mix
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/101—Music Composition or musical creation; Tools or processes therefor
- G10H2210/141—Riff, i.e. improvisation, e.g. repeated motif or phrase, automatically added to a piece, e.g. in real time
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2210/00—Aspects or methods of musical processing having intrinsic musical character, i.e. involving musical theory or musical parameters or relying on musical knowledge, as applied in electrophonic musical tools or instruments
- G10H2210/325—Musical pitch modification
- G10H2210/331—Note pitch correction, i.e. modifying a note pitch or replacing it by the closest one in a given scale
- G10H2210/335—Chord correction, i.e. modifying one or several notes within a chord, e.g. to correct wrong fingering or to improve harmony
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/101—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
- G10H2220/106—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2220/00—Input/output interfacing specifically adapted for electrophonic musical tools or instruments
- G10H2220/091—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
- G10H2220/101—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
- G10H2220/106—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters
- G10H2220/111—Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters using icons, e.g. selecting, moving or linking icons, on-screen symbols, screen regions or segments representing musical elements or parameters for graphical orchestra or soundstage control, e.g. on-screen selection or positioning of instruments in a virtual orchestra, using movable or selectable musical instrument icons
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2230/00—General physical, ergonomic or hardware implementation of electrophonic musical tools or instruments, e.g. shape or architecture
- G10H2230/025—Computing or signal processing architecture features
- G10H2230/031—Use of cache memory for electrophonic musical instrument processes, e.g. for improving processing capabilities or solving interfacing problems
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/075—Musical metadata derived from musical analysis or for use in electrophonic musical instruments
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/121—Musical libraries, i.e. musical databases indexed by musical parameters, wavetables, indexing schemes using musical parameters, musical rule bases or knowledge bases, e.g. for automatic composing methods
- G10H2240/131—Library retrieval, i.e. searching a database or selecting a specific musical piece, segment, pattern, rule or parameter set
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/175—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments for jam sessions or musical collaboration through a network, e.g. for composition, ensemble playing or repeating; Compensation of network or internet delays therefor
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
- G10H2240/295—Packet switched network, e.g. token ring
- G10H2240/305—Internet or TCP/IP protocol use for any electrophonic musical instrument data or musical parameter transmission purposes
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10H—ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
- G10H2240/00—Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
- G10H2240/171—Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
- G10H2240/281—Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
- G10H2240/311—MIDI transmission
Definitions
- the present invention relates generally to the field of multimedia editing and, more generally, to automatically composing musical works with intuitive user interaction.
- a system and method that supports a user when generating music, wherein the level of support is adjustable by the user.
- the level of support should be variable and include approaches such as fully automatic, user-driven selection, and real-time automatic accompaniment generation for a user who is playing an instrument. Additionally, what is needed is a system and method that smoothly combines the traditional approach to music creation - with instruments - with a computer-based, music-clip-driven approach.
- EP1557836 defines composing/editing media clips by automatic selection of themes and of (harmony-)compatible replacement clips and by pointing on-screen on a media track. Criteria are clip type, tempo, and start and ending time. However, no audio loops appear in the description; although Figure 4 shows audio loops, only audio clips are meant, and no real-time interactive music composition is disclosed.
- XP031481665, 'AudioCycle: Browsing Musical Loop Libraries', CONTENT BASED MULTIMEDIA INDEXING, 2009 (CBMI 09), SEVENTH INTERNATIONAL WORKSHOP ON, IEEE, PISCATAWAY, NJ, USA, 3 June 2009 (2009-06-03), pages 73-80, discloses a music loop library browser displaying music according to similarity in timbre, harmony and rhythm. Spatial sound is obtained by playing several extracts at the same time, including beat-synchronous loop play and pitch alteration.
- US2004089141 provides a top-down and interactive auto-composition approach. It defines patterns used as building blocks for real-time interactive music composition, or a user modifying an auto-composed music work in real time. Improvisation is also disclosed, as well as data structures having instrument, chord, tempo, key, style and structure indexed by part. It does not disclose audio loops, nor that individual patterns can be utilized as parts; it discloses sub-blocks or riffs but no loops. It discloses the storage of MIDI loops; however, these are not utilized within the disclosed algorithm.
- US2012297959 defines particular loop replacement according to a score versus a specific harmony matrix. A default instrument is automatically selected for each given audio track. However, these tracks are key-adjusted note by note, not loop by loop.
- US2012312145 defines a music composition automation system combining 'phrases', which can be pre-recorded audio clips, synthesizer loops or other types of audio data sources.
- Each instrument in the style includes a set of phrases that can be played, associated with a specific song section type using this data structure.
- the music style library data structure includes the fields: 'phrase to loop', style, re-trigger flag, chord, start, end, genre, instrument and volume.
- the audio recordings library data structure includes type, tempo, beats per bar, key and chords. It does not disclose a real-time interactive algorithm, jam-session capability, or a recursive algorithm.
- EP1666967 discloses automatically creating an emotion-controlled soundtrack for use with an existing video work.
- An algorithm initiates loop insertions according to emotion tag and style. The selected music loops will be cut, looped, cross-faded, etc., to create a seamless audio soundtrack with additional automatic insertion of fade-ins or fade-outs before and after selected music loops to smooth the audio transitions into / out of a section.
- This reference discloses clips/loops and tags; it does not, however, disclose a part-suitability tag or field. Additionally, it discloses insertion of clips based on similar beat length, not beat synchronicity.
- US5877445 defines an audio block sequence compiler selecting segments from a segments library classified in a characteristic compatibility table by duration, suitability for the beginning or end of a sequence, and compatibility with each other block.
- the compiler generates audio sequences using the characteristic table and user criteria (duration, mood, intensity). It does not, however, define audio loops; it utilizes looping of individual segments. Additionally, there is no tempo selection, no style selection, and no disclosure of an iterative algorithm.
- At least a portion of the instant invention will be implemented in the form of software running on a user's computer 100.
- a computer will have some amount of program memory and hard disc storage (whether internal or accessible via a network) as is conventionally utilized by such units.
- an external camera 110 of some sort might be utilized with - and will preferably be connectible to - the computer so that video and/or graphic information can be transferred to and from the computer.
- the camera 110 will be a digital video camera, although that is not a requirement, as it is contemplated that the user might wish to utilize still images from a digital still camera in the creation of his or her multimedia work. Further, given the modern trend toward incorporation of cameras into other electronic components (e.g., in handheld computers, telephones, laptops, etc.), the camera might be integrated into the computer or some other electronic device and, thus, might not be a traditional single-purpose video or still camera.
- although the camera will preferably be digital in nature, any sort of camera might be used, provided that the proper interfacing between it and the computer is utilized.
- a microphone 130 might be utilized so that the user can add voice-over narration to a multimedia work or can control his or her computer via voice-recognition software.
- a CD or DVD burner 120 could be useful for storing content on writable or rewritable media.
- a mobile data storage device 140 might be connected to the computer, such as an mp3 player for example, for storage of individual music clips or other data as needed by the instant invention.
- the user would bring a smart phone 150 or other touch-based device (e.g., a tablet computer such as a Microsoft® Surface® tablet or an iPad®, or other device with a touch-sensitive display) into communication with the computer in order to, for example, control the computer or exchange data between the computer and the device.
- the user might also be able to connect instruments such as a keyboard 160 to the computer to allow for the input and recording of music data directly from the user.
- the process of the instant invention will provide a user-friendly, preferably touch-based, graphical user interface via music generation and editing software.
- the method will preferably utilize MIDI loops or audio clips organized into styles, with these MIDI loops being enhanced with a classification into song parts.
- the loops will also be tagged with data that is then used by the instant invention during the music generation process. These tags represent a classification of the loops: song part, pitch, and melody qualification.
- a well-organized and tagged database of MIDI loops or audio loops is an essential part of the instant invention. Such a database of MIDI loops, or a selection of individual styles referring to the MIDI loops, will preferably be provided by the instant inventors.
- the type, layout and interaction possibilities of the graphical user interface will be accessible, in some embodiments, by the user with mouse and keyboard. However, the instant invention will be especially useful when used in connection with a touch interface, if the device(s) on which the instant invention is executed provide such a possibility.
- the music generation mode, or jam mode, of the instant invention will react to the user interactions and user input in real time, incorporating the user's activities into the music generation process. This will give the user instantaneous dynamic feedback and a sense of achievement.
- each interaction of the user - the selection of a different tone pitch, the selection or de-selection of an instrument, or the definition of a different song part - will be incorporated almost instantly by the instant invention, although in a preferred embodiment only after the next bar is reached.
- the present invention will preferably begin with the initiation by the user of the music generation process - the live jam mode.
- the live jam mode might be a component part of a general music generation software package or it might be a standalone program.
- the user will be presented with a graphical user interface 200 containing controls for all essential settings. All of these controls will be selectable and changeable by the user at any time both before and during the music generation process, with the selections made by the user being incorporated by the instant invention into the music generation process in real time.
- the user will preferably start with the selection of a music style 210, a style which will then be used in connection with the created music work as long as it is selected.
- the user will also be able to define and change the tempo 220 of the music work.
- a volume control 230 will be provided which will allow the user to change the output volume.
- in some embodiments, the volume control 230 will affect only the voice that is currently selected in the instrument bank 270 (e.g., keyboard 275), thereby adjusting the volume of the selected voice as compared with the other currently active / selected voices 260. In other instances, it could be used to control the total audio volume as heard through an attached speaker or headphones.
- the overall volume setting will generally not be stored by the instant invention during the music generation process (i.e., preferably the setting of the output volume will not affect the volume of the musical work that is subsequently saved), although in some instances it might be tracked in order to adjust the relative volume of one instrument / part with respect to another during playback.
- in an embodiment there will be a number of controls and settings that allow the user to direct the music generation process, i.e., to influence the live jam mode on a deeper level.
- the user will be able to define and select individual song parts 240 which can be combined to create a complete song. That is, in some embodiments the user will be able to signal to the instant invention that he or she wants the music that is being created to transition into the next phase, e.g., from the introduction to the verse.
- the instant invention will then select and play MIDI or music loops that are compatible with or representative of the selected song section.
- the instant invention will provide the user with an assortment of different song parts that allow the user to control the song structure.
- by selecting the 'Intro' option, the instant invention will incorporate at least one music or MIDI loop that is tagged with the 'intro' tag and has a tempo, chord structure, etc., compatible with an introduction.
- Other possibilities include such parts as verse, chorus, bridge, outro, etc., as those terms are known in the music arts.
- the instant invention in this embodiment has provided parts labeled 'Ending', 'VAR1', 'Fill' and 'VAR2', selection of which will cause the instant program to insert different music or MIDI loops that contain music adhering to the selected style and pitch and to the selected song part.
- 'VAR1' and 'VAR2' stand for variations, and 'Fill' represents music material that is usable as fill material, as those terms are known and understood in the art.
- the selection of 'Ending' will insert a music or MIDI loop that provides an ending to the current music piece, allowing the user to end the generated song in a musically pleasing and/or correct fashion.
- after the selection of the 'VAR1' or 'VAR2' song part, the instant invention will preferably cycle through the audio loop material that is available to prevent the same audio loops from being selected for, by way of example, four consecutive bars.
- the instant invention will preferably do that for all audio loops, including both loops containing melody-based instruments as well as loops containing drum and bass audio material.
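The loop-cycling behavior might be approximated as in the following sketch, which uses invented names and assumes each loop record carries the part tags described later in connection with Figure 6:

```python
# Sketch (hypothetical names): avoid re-selecting recently used loops so the
# same material is not heard for, e.g., four consecutive bars.
import random
from collections import deque

class LoopCycler:
    def __init__(self, loops, history_bars=4):
        self.loops = loops                      # all loops of the active style
        self.recent = deque(maxlen=history_bars)

    def next_loop(self, part_tag):
        tagged = [l for l in self.loops if part_tag in l.parts]
        fresh = [l for l in tagged if l not in self.recent]
        choice = random.choice(fresh or tagged)  # fall back if all were recent
        self.recent.append(choice)
        return choice
```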
- the instant invention will preferably generate and select the song parts automatically (e.g., intro, verse, chorus, outro, ending) and the user will be provided with a complete, roughly structured song.
- the instant invention will provide an effects option 242.
- an opportunity will be provided to apply, modify, or remove various audio effects (e.g., filters, reverb, flange, etc.).
- when the user selects the PLAY control 292, the instant invention will automatically start the music generation process. In this example, this will cause the program to utilize the user's selected style, song parts, tone pitch / key and instruments in the generation process.
- the REC control 295 will be activated too, thereby initiating the recording process.
- the user will indicate to the instant invention that the generation and recording of the current music work is to be ended. In this instance, the instant invention will insert a music loop representing an ending in order to complete the music generation process for the current music piece.
- a pitch setting 250 will also be provided in some embodiments. This option will allow the user to select individual tone pitch / note values manually or to select an automatic setting wherein the instant invention will automatically generate tone pitch / note changes accordingly.
- the currently selected and active tone pitch will be displayed in the graphical user interface.
- the real key will also be displayed to the user, so that a user who is playing along with the instant invention via a connected instrument will be able to play in the correct key.
- the instant invention will preferably select tone pitch / note settings in order to adapt the generated song to the playing of the user, thereby generating a musically pleasing music work. If a specific tone pitch has been selected by the user, then the instant invention will generate the music - i.e., it will initiate the jam mode - in the selected tone pitch / key as long as it is selected.
- the user will also be able to define and select the instruments that are to be used in generating the music.
- the graphical user interface will provide an instrument selection bar 260 that displays the current voice selection (i.e., the instrument bank 270).
- the user will be able to select and de-select individual instruments dynamically, with the changes being reflected in real time via external speakers, headphones, etc.
- the inclusion / exclusion and change of instruments can be done at any time and without any restrictions, although the instant invention also provides an automatic setting which, when activated, utilizes and automatically selects instruments that are available according to the currently selected style. The instant invention will select / deselect and change these instruments in an alternating fashion to generate a pleasing music work.
- Figure 3 provides a summary of a preferred workflow of the instant invention.
- the user will activate the jam mode 300.
- the user will select a style 310, the tempo 320, the tone pitch 330, the individual song parts 340 and the preferred instruments 350.
- these settings do not necessarily need to be defined sequentially before the start of the music generation process. They will be selectable and changeable the whole time that the jam mode is active. This will encourage the user to alternate between multiple ones of the available settings 360.
- the instant invention will automatically incorporate the changed settings into the jam mode - the music generation process will be modified dynamically in real time.
- the instant invention will record 370 the generated music while jam mode is active and store it on a computer.
- the storage might be a hard disc, a flash-based memory device, etc.
- the recording will then be available to the user for further processing if that is desired.
- Figure 4 illustrates a preferred data structure of the instant invention.
- the instant invention utilizes, in an embodiment, an organized database and data structure.
- This database will be provided in some instances by the instant inventors.
- a limited set of styles might be provided initially, with additional styles available on demand and for a fee.
- for additional styles, the order will be initiated by the user, with the new styles being transmitted digitally directly to the user.
- alternatively, the purchased styles might be shipped on a storage medium.
- the user's styles could be stored either locally on the user's computer, or remotely and accessed via a LAN, Wi-Fi, the Internet, etc.
- Each style will preferably be stored and named internally according to a simple naming scheme.
- consider Style A 400, where each style has a specific number of audio loops associated with that style.
- each of these loops (audio loop A 405, audio loop B 410, and audio loop C 415) need not be strictly associated with a single style. It is preferable and possible that an audio loop might be associated with multiple different styles (e.g., style B 420).
- Figure 5 depicts a preferred data structure of one of the individual styles 500 of the instant invention.
- the data record associated with a style 500 will preferably contain information about the number of individual song parts that are available and selectable 510. Additionally, the number and type of individual instruments 520 that are part of the style will preferably be stored in the data structure of each individual style.
- Each style will preferably have a predefined tempo 530. However, it should be noted that once the user selects the style and interacts with the user controls, the predefined tempo might be changed automatically and/or manually by the user.
- each style will have a predefined tone pitch 540 or key that can be modified by the user.
- each style will contain icons 550 and animations 560 that represent the corresponding instrument and/or the particular style. These pictures and animations will preferably be displayed in the graphical user interface as is generally indicated in Figure 5.
- the icons / images will be animated so that the user will be able to see a representation of a musician playing the selected instrument.
- another preferred data value that could be stored with each style is the name 570 of that style, which will be the name that is displayed in the graphical user interface for selection by the user.
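Purely as an illustration, the style record just described might be laid out as below; the reference numerals follow Figure 5, but the field names and types are invented for this sketch:

```python
# Sketch of a per-style record (hypothetical layout; numerals per Figure 5).
from dataclasses import dataclass, field

@dataclass
class Style:
    name: str                 # 570: name displayed in the GUI
    song_parts: list          # 510: selectable parts, e.g. ["Intro", "VAR1", "Fill"]
    instruments: list         # 520: instruments that are part of the style
    default_tempo: float      # 530: predefined tempo, user-changeable
    default_pitch: str        # 540: predefined tone pitch / key, user-changeable
    icons: dict = field(default_factory=dict)       # 550: instrument icons
    animations: dict = field(default_factory=dict)  # 560: playing-musician animations
```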
- Figure 6 depicts a preferred data structure for use with the individual audio loops 600.
- the audio loops, in addition to the audio material, might contain information about the styles 610 with which they have been associated. This might be a plurality of styles, or only one style.
- the audio loop has a specific predefined inherent tempo value 620, which is also stored in the data structure of the audio loop. Additionally, information about the usability of the audio loop as a part 630 (intro, variation 1, variation 2, ending, etc.) in the music generation process will be stored in this data structure.
- Each audio loop will additionally, and preferably, be associated with information regarding the instrument 640 the loop was taken from or created with.
- an important value in the data structure of the audio loops will be the information about the harmony suitability 650 or compatibility of each clip with respect to the others. This quantity will indicate, in a general way, whether or not one audio loop is musically compatible with another.
- the harmony suitability could either be provided by the user and inserted into the data structure, or defined by the creator of the database based on a scale that indicates compatibility with another currently selected audio loop.
- the instant invention will determine the harmony suitability by first analyzing the data values of a selected audio loop and then comparing those values to the data of another audio loop with respect to pitch, tempo, and note scale (e.g., blues, minor, rock, etc.).
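Again purely as an illustration, the loop record of Figure 6 together with a deliberately crude stand-in for the harmony-suitability comparison might look as follows; the patent names the stored fields but not the comparison rule, so the rule below is an assumption:

```python
# Sketch of an audio-loop record (numerals per Figure 6) plus a naive
# compatibility check; the real suitability value 650 is not specified.
from dataclasses import dataclass

@dataclass
class AudioLoop:
    name: str
    styles: set          # 610: one or more styles the loop belongs to
    tempo: float         # 620: inherent tempo of the loop
    parts: set           # 630: usable as intro, variation 1/2, ending, ...
    instrument: str      # 640: instrument the loop was taken from or created with
    pitch_class: int     # 0-11, used by the assumed rule below
    scale: str           # e.g. "blues", "minor", "rock"

def harmony_suitable(a: AudioLoop, b: AudioLoop) -> bool:
    """Assumed rule: same scale and a consonant root interval."""
    interval = (a.pitch_class - b.pitch_class) % 12
    return a.scale == b.scale and interval in (0, 5, 7)
```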
- the user might be provided with the option to modify the filter effects in real time.
- the effects button 242 has been activated which has brought a filter parameter window 810 to the forefront.
- the user has been provided with the option of simultaneously adjusting two parameters by way of a stroke 820 across the face of the open window 810.
- the user will be able to simultaneously modify two parameters, although it should also be clear that a single parameter (or three or more) could be similarly modified.
- the audio center frequency might be modified ('HIGH' to 'LOW') and/or the reverb ('HEAVY' to 'LIGHT').
- the user has elected to increase the center frequency and increase the reverb by virtue of this single stroke 820.
- this approach and variations of it are possible and have been specifically contemplated by the instant inventors.
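For concreteness, a single stroke controlling two effect parameters at once, as in Figure 8, might be mapped like this; the axis assignments and the frequency range are assumptions, not taken from the patent:

```python
# Sketch: map a touch point inside the filter window 810 to two parameters.
def stroke_to_params(x, y, width, height):
    """x axis -> center frequency ('LOW'..'HIGH'), y axis -> reverb ('LIGHT'..'HEAVY')."""
    freq_hz = 200.0 + (x / width) * (8000.0 - 200.0)  # assumed audible range
    reverb_amount = y / height                        # normalized 0.0 .. 1.0
    return freq_hz, reverb_amount
```

Each touch-move event along the stroke 820 would be fed through such a mapping, so both parameters update continuously during the gesture.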
- the instant invention will be used in practice generally as indicated in this figure.
- the jam mode will be activated (step 700). This will initiate the computer program that implements the instant invention on the device selected by the user which might include a desktop computer, a laptop computer, a smart phone, a tablet computer, etc.
- the style, tempo, and musical key of the work will be selected (step 705).
- the instrument bank 270 might include a vocal performance of scat lyrics.
- the selection of style might automatically populate the voice bank with horns, strings, woodwinds, etc., each of which will preferably have been selected to complement the style selected by the user.
- the associated instruments might include claves and/or drums (on percussion), acoustic guitar(s), bass, piano, flute, etc.
- the time signature will often be 12/8 or 6/8, but, in some embodiments it will be 4/4, 2/2, or 2/4.
- the default instruments might be drums, guitar(s) (e.g., electric, slide, or acoustic), bass, harmonica, keyboard, fiddle (violin), etc.
- the time signature would most commonly be 4/4, but other choices are certainly possible.
- the form of the music could follow, by default, the standard 12-bar or 8-bar blues chord progressions, as those terms are known in the art.
- the solo instruments (e.g., lead guitar, harmonica, keyboard, etc.).
- the user will accept the default instrument bank 270 or select alternative instruments or vocals. That is, in some embodiments, the user might prefer, by way of example only, a distorted guitar sound as opposed to an acoustic guitar sound. In such an instance, the user will be able to either accept the instruments/vocals offered or replace any of those with another instrument or vocal sound.
- the user will activate a first instrument (step 715) which will cause it to begin playing according to the parameters previously selected by the user.
- the activated instruments 275, 280, and 285 will preferably be highlighted on the screen so that the user can readily tell which instrument or instruments have been activated and are, thus, in the current mix.
- this will be especially convenient when the instant invention executes on a tablet computer or other device with a touch-sensitive screen. In such environments, the user will merely need to touch the associated icon in order to activate or mute an instrument.
- upon activation of the first instrument, the instant invention will preferably automatically begin a performance of that instrument according to the selected style, at the selected tempo, and in the selected key or according to a selected pitch (step 725), etc.
- the current state of the work will be communicated to the user as it is generated via a local speaker or, if the user is wearing headphones, to the user's headphones.
- This first instrument that is selected will provide the foundation for what is to follow. In many cases, the first instrument that is activated will be the drums, but that is not a requirement.
- the instant invention will, in some embodiments, begin to continuously record the performance (step 730).
- in the first iteration there will typically be a single instrument playing but, subsequently, more instruments will be added as is discussed below.
- the recording will be initiated using all appropriate instruments. In that case, the user will select/deselect and modify the instruments that are currently playing as described below.
- the starting and stopping of the recording will be under the control of the user, and the recording may not commence until after the user has added several instruments to the mix, at which time it will begin automatically. In other cases, no recording will be performed until the user specifically selects that option (e.g., via the on-screen 'Record' button 295 in Figure 2).
- the instant invention will preferably follow the 'YES' branch.
- the instant invention will preferably branch to step 725 and thereby continue to perform and, optionally, record the performance according to step 730.
- a test will be performed to determine whether or not the change that has been made is an indication that the user wishes to stop the music generation process (decision item 738). If the user has decided to end the music creation process (i.e., the 'YES' branch of decision item 738), the instant invention will preferably write whatever recording was made to nonvolatile computer storage / computer-readable media (e.g., magnetic disk, flash memory, CD, DVD, etc.) and stop the music creation process (step 750).
- the user will be given an opportunity to make changes in the performance environment (step 740).
- the user will be given the option of replacing any or all of the items in the instrument bank 270, adjusting the relative volume of each instrument in the mix, adjusting the tempo of the entire work, changing the style of the entire musical work, its key, time signature, etc.
- the user will be given the option of modifying any performance-related parameter in real time while the performance and/or recording is taking place (step 740).
- changed parameters will be instantly reflected in the audio output of the work, where 'instantly' should be understood to mean that the changes will be performed at the next moment when it would make musical sense to do so (e.g., at the next beat, at the start of the next measure, etc.), i.e., at a moment that is musically synchronous with the instruments/style/tempo that are already in the performance.
- any change that the user makes in a performance variable will be immediately reflected in the audio output of the performance.
- the change or changes indicated by the user will be implemented in the musical performance (step 745) in such a way as to harmoniously combine the additional, changed, or removed (muted) instruments/vocals with the current performance.
- This might mean, by way of example, matching the tempo of the current performance, its key, etc.
- the new instrument will not enter the mix until the start of the next full measure of the performance, but that is not a requirement.
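The operating logic of Figure 7 can be condensed into the following sketch; the step numbers refer to the figure, while every function name is hypothetical:

```python
# Sketch of the Figure 7 loop (hypothetical API; step numbers per the figure).
def jam_session(ui, mixer, recorder):
    mixer.configure(ui.get_initial_settings())  # steps 705-715: style, tempo, key
    recorder.start()                            # step 730: continuous recording
    while True:
        mixer.play_next_bar()                   # step 725: perform current mix
        change = ui.poll_user_change()          # decision 735: user input?
        if change is None:
            continue                            # 'NO' branch: keep performing
        if change.is_stop_request():            # decision 738
            recorder.save_to_nonvolatile_storage()  # step 750
            return
        mixer.apply_at_next_bar(change)         # steps 740-745: merge harmoniously
```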
- the instant invention provides a highly creative work method for both novice and professional users when generating music or just enjoying music and the music generation process.
- the instant invention will adapt to the knowledge and professionalism of the user, providing individual options and features for selection that either add complexity (for professionals) or minimize complexity (for novice users).
- the instant invention will enter a random creativity mode, wherein the method will automatically change and replace loops, instruments and pitch. This mode will be entered when the user interaction level is very low.
- the user can link an instrument with the instant invention, and the method will 'listen' to the input from the instrument of the user and will accordingly select song parts, instruments and pitch to therewith generate a music work.
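How the method might 'listen' is left open by the description; one conceivable sketch, assuming MIDI note input and a simple major-scale fit (both assumptions, not taken from the patent), is a running key estimate over the most recent notes:

```python
# Sketch (assumptions: MIDI note numbers in, major-scale fit) of estimating
# the key a user is playing in, so compatible loops and pitch can be chosen.
from collections import Counter

MAJOR_SCALE = {0, 2, 4, 5, 7, 9, 11}   # scale degrees relative to the root

def estimate_key(recent_midi_notes):
    """Returns the root pitch class (0=C .. 11=B) whose major scale
    covers the largest share of the recently played notes."""
    counts = Counter(note % 12 for note in recent_midi_notes)
    def coverage(root):
        return sum(n for pc, n in counts.items() if (pc - root) % 12 in MAJOR_SCALE)
    return max(range(12), key=coverage)
```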
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Acoustics & Sound (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Electrophonic Musical Instruments (AREA)
Description
- The present invention relates generally to the field of multimedia editing and, more generally, to automatically composing musical works with intuitive user interaction.
- The field of music editing and generation has undergone substantial change in recent years. Among the more significant changes is that music generation went from a work-intensive and time-consuming process of learning to play and master a real instrument to one that is based around the use of computerized techniques to generate new music works.
- Despite all the technological advances, computer-driven / digital-data-driven composition has never completely supplanted the traditional way of making music by playing actual instruments. That being said, a number of approaches are available wherein computers and instruments can be used together to help a user compose music. These approaches, however, are typically very limited in that the user enters music into the computer via an instrument and the computer records the music. The computer may additionally provide, for example, background music or supporting music to accompany the user's performance. This accompaniment, however, once initiated, is static, or at least always depends on continuous input by the user to change particulars - like the music itself, or the key, rhythm or pitch - which prevents the user from concentrating on playing the instrument.
- Thus, what is needed is a system and method that supports a user when generating music, wherein the level of support is adjustable by the user. The level of support should be variable and include approaches such as fully automatic, user-driven selection, and real-time automatic accompaniment generation for a user who is playing an instrument. Additionally, what is needed is a system and method that smoothly combines the traditional approach to music creation - with instruments - with a computer-based, music-clip-driven approach.
- Heretofore, as is well known in the media generating and editing industry, there has been a need for an invention to address and solve the above-described problems. Accordingly, it should now be recognized, as was recognized by the present inventors, that there exists, and has existed for some time, a very real need for a system and method that would address and solve the above-described problems.
-
EP1557836 defines composing/editing media clips by automatic selection of themes and of (harmony-)compatible replacement clips and by pointing on-screen on a media track. Criteria are clip type, tempo, and start and ending time. However, no audio loops appear in the description; although Figure 4 shows audio loops, only audio clips are meant, and no real-time interactive music composition is disclosed. -
EP1876583 defines a block sequence compiler with a user interface defining duration and library (= part) selection. A loop playback command with tempo and pitch correction is also disclosed; additionally, repeated playback of a single loop is foreseen. However, there is no disclosure of the selection of instrument, tempo and/or pitch values by the user. - XP031481665: 'AudioCycle: Browsing Musical Loop Libraries', CONTENT BASED MULTIMEDIA INDEXING, 2009 (CBMI 09), SEVENTH INTERNATIONAL WORKSHOP ON, IEEE, PISCATAWAY, NJ, USA, 3 June 2009 (2009-06-03), pages 73-80, discloses a music loop library browser displaying music according to similarity in timbre, harmony and rhythm. Spatial sound is obtained by playing several extracts at the same time, including beat-synchronous loop play and pitch alteration. It defines a database having descriptors (style, instrument) and a data structure having features (timbre, rhythm, tempo, onset positions, pitch, harmony notes) stored as metadata in the music loops library, the loop segments being synchronized at beat locations and having a length of a beat period. However, it does not define music composition; it only discloses simultaneous multi-loop beat-synchronous playback.
-
US2004089141 provides a top-down and interactive auto-composition approach. It defines patterns used as building blocks for real-time interactive music composition, or a user modifying an auto-composed music work in real time. Improvisation is also disclosed, as well as data structures having instrument, chord, tempo, key, style and structure indexed by part. It does not disclose audio loops, nor that individual patterns can be utilized as parts; it discloses sub-blocks or riffs but no loops. It discloses the storage of MIDI loops; however, these are not utilized within the disclosed algorithm. -
US2012297959 defines particular loop replacement according to a score versus a specific harmony matrix. A default instrument is automatically selected for each given audio track. However, these tracks are key-adjusted note by note, not loop by loop. -
US2012312145 defines a music composition automation system combining 'phrases', which can be pre-recorded audio clips, synthesizer loops or other types of audio data sources. Each instrument in the style includes a set of phrases that can be played, associated with a specific song section type using this data structure. The music style library data structure includes the fields: 'phrase to loop', style, re-trigger flag, chord, start, end, genre, instrument and volume. The audio recordings library data structure includes type, tempo, beats per bar, key and chords. It does not disclose a real-time interactive algorithm, jam-session capability, or a recursive algorithm. - EP1666967 discloses automatically creating an emotion-controlled soundtrack for use with an existing video work. An algorithm initiates loop insertions according to emotion tag and style. The selected music loops will be cut, looped, cross-faded, etc., to create a seamless audio soundtrack with additional automatic insertion of fade-ins or fade-outs before and after selected music loops to smooth the audio transitions into / out of a section. This reference discloses clips/loops and tags; it does not, however, disclose a part-suitability tag or field. Additionally, it discloses insertion of clips based on similar beat length, not beat synchronicity.
-
US5877445 defines an audio block sequence compiler selecting segments from a segments library classified in a characteristic compatibility table by duration, suitability for the beginning or end of a sequence, and compatibility with each other block. The compiler generates audio sequences using the characteristic table and user criteria (duration, mood, intensity). It does not, however, define audio loops; it utilizes looping of individual segments. Additionally, there is no tempo selection, no style selection, and no disclosure of an iterative algorithm. - XP032265400, 'Advanced Synchronization of Audio or Symbolic Musical Patterns: An Algebraic Approach', SEMANTIC COMPUTING (ICSC), 2012 IEEE SIXTH INTERNATIONAL CONFERENCE ON, IEEE, 19 September 2012 (2012-09-19), pages 302-309, discloses an 'advanced live looping' approach using synchronization windows for the creation of interactive music pieces, wherein loops are added to an initial loop selection in real time. However, style and audio loop data structures having the fields 'instrument, tempo, part' are not disclosed.
- Before proceeding to a description of the present invention, however, it should be noted and remembered that the description of the invention which follows, together with the accompanying drawings, should not be construed as limiting the invention to the examples (or preferred embodiments) shown and described. This is so because those skilled in the art to which the invention pertains will be able to devise other forms of the invention within the ambit of the appended claims.
- The invention is as defined by the appended
independent claims 1, 11, 12, with specific embodiments defined by dependent claims 2-10. - The expressions "MIDI loops", "audio loops", "audio clips", "sound" are considered here equivalent, specifically in the context of the present description, see
figure 6 and in view of EP1557836 figure 4. - All passages of the description, even if defined as "may be" or "in some embodiments" or as optional, that correspond to the subject-matter of
claim 1 are to be interpreted as non-optional essential features. - Other objects and advantages of the invention will become apparent upon reading the following detailed description and upon reference to the drawings in which:
-
Figure 1 depicts a general working environment of the instant invention. -
Figure 2 illustrates a preferred graphical user interface of the instant invention. -
Figure 3 illustrates a general workflow suitable for use with the instant invention. -
Figure 4 depicts a general layout of the data structure of the audio material suitable for use with the instant invention. -
Figure 5 illustrates one possible data content of the individual selectable styles of the instant invention. -
Figure 6 illustrates one possible data structure and content of the audio loops suitable for use with the instant invention. -
Figure 7 contains a more detailed operating logic suitable for use with the instant invention. -
Figure 8 illustrates one example by which filter settings might be altered in real time. - Referring now to the drawings, wherein like reference numerals indicate the same parts throughout the several views, there is provided a preferred system and method for implementing an intelligent automatic music jam session.
- As is generally indicated in
Figure 1, at least a portion of the instant invention will be implemented in the form of software running on a user's computer 100. Such a computer will have some amount of program memory and hard disc storage (whether internal or accessible via a network) as is conventionally utilized by such units. Additionally, it is possible that an external camera 110 of some sort be utilized with - and will preferably be connectible to - the computer so that video and/or graphic information can be transferred to and from the computer. Preferably the camera 110 will be a digital video camera, although that is not a requirement, as it is contemplated that the user might wish to utilize still images from a digital still camera in the creation of his or her multimedia work. Further, given the modern trend toward incorporation of cameras into other electronic components (e.g., in handheld computers, telephones, laptops, etc.), those of ordinary skill in the art will recognize that the camera might be integrated into the computer or some other electronic device and, thus, might not be a traditional single-purpose video or still camera. Although the camera will preferably be digital in nature, any sort of camera might be used, provided that the proper interfacing between it and the computer is utilized. - Additionally, a
microphone 130 might be utilized so that the user can add voice-over narration to a multimedia work or can control his or her computer via voice-recognition software. A CD or DVD burner 120 could be useful for storing content on writable or rewritable media. Additionally, it is possible that a mobile data storage device 140 might be connected to the computer, such as an mp3 player for example, for storage of individual music clips or other data as needed by the instant invention. Furthermore, in some embodiments the user would bring a smart phone 150 or other touch-based device (e.g., a tablet computer such as a Microsoft® Surface® tablet or an iPad®, or other device with a touch-sensitive display) into communication with the computer in order to, for example, control the computer or exchange data between the computer and the device. - According to an embodiment of the instant invention, the user might also be able to connect instruments such as a
keyboard 160 to the computer to allow for the input and recording of music data directly from the user. - The process of the instant invention will provide a user-friendly, preferably touch-based, graphical user interface via music generation and editing software. The method will preferably utilize MIDI loops or audio clips organized into styles, with these MIDI loops being enhanced with a classification into song parts.
- The loops will also be tagged with data that is then used by the instant invention during the music generation process. These tags represent a classification of the loops: song part, pitch, and melody qualification. A well-organized and tagged database of MIDI loops or audio loops is an essential part of the instant invention. Such a database of MIDI loops, or a selection of individual styles referring to the MIDI loops, will preferably be provided by the instant inventors.
- The type, layout and interaction possibilities of the graphical user interface will be accessible, in some embodiments, by the user with mouse and keyboard. However, the instant invention will be especially useful when used in connection with a touch interface, if the device(s) on which the instant invention is executed provide such a possibility.
- The music generation mode, or jam mode, of the instant invention will react to the user interactions and user input in real time, incorporating the user's activities into the music generation process. This will give the user instantaneous dynamic feedback and a sense of achievement. Each interaction of the user - the selection of a different tone pitch, the selection or de-selection of an instrument, or the definition of a different song part - will be incorporated almost instantly by the instant invention, although in a preferred embodiment only after the next bar is reached.
- The present invention will preferably begin with the initiation by the user of the music generation process - the live jam mode. The live jam mode might be a component part of a general music generation software package or it might be a standalone program.
- In an embodiment, the user will be presented with a
graphical user interface 200 containing controls for all essential settings. All of these controls will be selectable and changeable by the user at any time, both before and during the music generation process, with the selections made by the user being incorporated by the instant invention into the music generation process in real time. - The user will preferably start with the selection of a
music style 210, a style which will then be used in connection with the created music work as long as it is selected. The user will also be able to define and change the tempo 220 of the music work. A volume control 230 will be provided which will allow the user to change the output volume. In some embodiments, the volume control 230 will affect only the voice that is currently selected in the instrument bank 270 (e.g., keyboard 275), thereby adjusting the volume of the selected voice as compared with the other currently active / selected voices 260. In other instances, it could be used to control the total audio volume as heard through an attached speaker or headphones. The overall volume setting will generally not be stored by the instant invention during the music generation process (i.e., preferably the setting of the output volume will not affect the volume of the musical work that is subsequently saved), although in some instances it might be tracked in order to adjust the relative volume of one instrument / part with respect to another during playback. - In addition, in an embodiment there will be a number of controls and settings that allow the user to direct the music generation process, i.e., to influence the live jam mode on a deeper level. The user will be able to define and select
individual song parts 240 which can be combined to create a complete song. That is, in some embodiments the user will be able to signal to the instant invention that he or she wants the music that is being created to transition into the next phase, e.g., from the introduction to the verse. The instant invention will then select and play MIDI or music loops that are compatible with or representative of the selected song section. The instant invention will provide the user with an assortment of different song parts that allow the user to control the song structure. For example, by selecting the 'Intro' option the instant invention will incorporate at least one music or MIDI loop that is tagged with the 'intro' tag and has a tempo, chord structure, etc., compatible with an introduction. Other possibilities include such parts as verse, chorus, bridge, outro, etc., as those terms are known in the music arts. - Returning to
Figure 2 , the instant invention in this embodiment has provided parts labeled 'Ending', 'VAR1', 'Fill' and 'VAR2', selection of which will cause the instant program to insert different music or MIDI loops that contain music that adheres to the selected style and pitch and to the selected song part. For purposes of the instant disclosure, it should be noted that 'VAR1' and 'VAR2' stand for variations and 'Fill' represents music material that is usable as fill material, as those terms are known and understood in the art. In an embodiment, the selection of 'Ending' will insert a music or MIDI loop that provides an ending to the current music piece, allowing the user to end the generated song in a musically pleasing and/or correct fashion. It should be noted that after the selection of the 'VAR1' or 'VAR2' song part, the instant invention will preferably cycle through the audio loop material that is available to prevent the same audio loops from being selected for, by way of example, four consecutive bars. The instant invention will preferably do that for all audio loops, including both loops containing melody based instruments as well as loops containing drum and bass audio material. - For the novice user the selection of the 'automatic' option regarding the song part setting 240 will also be possible. In this variation, the instant invention will preferably generate and select the song parts automatically (e.g., intro, verse, chorus, outro, ending) and the user will be provided with a rough but fully structured song.
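A minimal sketch of this anti-repetition cycling follows (the class name and window size are illustrative assumptions; the disclosure only requires that recently used loops be avoided):

```python
from collections import deque

class LoopCycler:
    """Pick loops while avoiding any loop used within the last `window` bars."""
    def __init__(self, window=4):
        self.recent = deque(maxlen=window)

    def pick(self, candidates):
        fresh = [c for c in candidates if c not in self.recent]
        choice = (fresh or candidates)[0]  # fall back if everything is recent
        self.recent.append(choice)
        return choice

cycler = LoopCycler()
for bar in range(6):
    print(bar, cycler.pick(["loopA", "loopB", "loopC", "loopD", "loopE"]))
```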
- Additionally, and in some embodiments, the instant invention will provide an
effects option 242. Preferably, when the user selects this on-screen icon, an opportunity will be provided to apply, modify, or remove various audio effects (e.g., filters, reverb, flange, etc.). - In some embodiments, when the user selects the
PLAY control 292 the instant invention will automatically start the music generation process. In this example, this will cause the program to utilize the user's selected style, song parts, tone pitch / key and instruments in the generation process. Preferably when the PLAY control 292 is selected, the REC control 295 will be activated too, thereby initiating the recording process. By selecting the REC control 295 again, in some embodiments the user will indicate to the instant invention that the generation and recording of the current music work is to be ended. In this instance, the instant invention will insert a music loop representing an ending in order to complete the music generation process for the current music piece. - A pitch setting 250 will also be provided in some embodiments. This option will allow the user to select individual tone pitch / note values manually or to select an automatic setting wherein the instant invention will automatically generate tone pitch / note changes accordingly. In an embodiment, the currently selected and active tone pitch will be displayed in the graphical user interface. It is also preferable that the real key be displayed to the user, so that a user who is playing along with the instant invention via a connected instrument will be able to play in the correct key. However it is also possible, and within the scope of the instant invention, for the user to ignore the key and play as he sees fit. In that case, the instant invention will preferably select tone pitch / note settings in order to adapt the generated song to the playing of the user, thereby generating a musically pleasing music work. If a specific tone pitch has been selected by the user then the instant invention will generate the music - i.e., it will initiate the jam mode - in the selected tone pitch / key as long as it is selected.
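One way to realize such key handling, sketched minimally below, is a simple semitone shift of the MIDI note numbers of a loop (the lookup table and function are illustrative assumptions; a real implementation would also consider octave placement and mode):

```python
NOTE_TO_SEMITONE = {"C": 0, "D": 2, "E": 4, "F": 5, "G": 7, "A": 9, "B": 11}

def transpose(midi_notes, from_key, to_key):
    """Shift MIDI note numbers so a loop recorded in from_key sounds in to_key."""
    shift = NOTE_TO_SEMITONE[to_key] - NOTE_TO_SEMITONE[from_key]
    return [note + shift for note in midi_notes]

# A C major triad (C4, E4, G4) moved to the user-selected key of E.
print(transpose([60, 64, 67], "C", "E"))  # -> [64, 68, 71]
```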
- In addition to the already described user controls, in an embodiment the user will also be able to define and select the instruments that are to be used in generating the music. In some embodiments, the graphical user interface will provide an
instrument selection bar 260 that displays the current voice selection (i.e., the instrument bank 270). In an embodiment the user will be able to select and de-select individual instruments dynamically, and these changes will be reflected in real time via external speakers, headphones, etc. Preferably, the inclusion / exclusion and change of instruments can be done at any time and without any restrictions, although the instant invention also provides an automatic setting which, when activated, utilizes and automatically selects instruments that are available according to the currently selected style. The instant invention will select / deselect and change these instruments in an alternating fashion to generate a pleasing music work.
- Figure 3 provides a summary of a preferred workflow of the instant invention. In a first preferred step the user will activate the jam mode 300. Next, and preferably, the user will select a style 310, the tempo 320, the tone pitch 330, the individual song parts 340 and the preferred instruments 350. However, it should be noted that these settings do not necessarily need to be defined sequentially before the start of the music generation process. They will be selectable and changeable the whole time that the jam mode is active. This will encourage the user to alternate between multiple ones of the available settings 360. In an embodiment, the instant invention will automatically incorporate the changed settings into the jam mode - the music generation process will be modified dynamically in real time. The instant invention will record 370 the generated music while jam mode is active and store it on a computer. The storage might be a hard disc, a flash based memory device, etc. The recording will then be available to the user for further additional processing if that is desired.
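This workflow can be summarized in code as the following skeleton (a sketch only; the function and parameter names are assumptions, and the real-time audio engine is reduced to placeholder callables):

```python
import itertools

def jam_session(settings, generate_bar, record, max_bars=4):
    """Figure 3 as a loop: each pass reads the *current* settings,
    generates one bar of music, and records it (step 370)."""
    song = []
    for _ in range(max_bars):  # stands in for "while jam mode is active"
        bar = generate_bar(settings)
        record(bar)
        song.append(bar)
    return song

settings = {"style": "rock", "tempo": 120, "pitch": "A", "part": "verse"}
counter = itertools.count(1)
jam_session(settings,
            generate_bar=lambda s: f"bar{next(counter)}:{s['part']}",
            record=lambda bar: print("recorded", bar))
```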
-
Figure 4 illustrates a preferred data structure of the instant invention. The instant invention utilizes, in an embodiment, an organized data structure. This database will be provided in some instances by the instant inventors. In some embodiments, a limited set of styles might be provided initially, with additional styles available on demand and for a fee. Where a user desires additional styles, the order will be initiated by the user, with the new styles being transmitted digitally directly to the user. Of course, it is also possible that the purchased styles might be shipped on a storage medium after purchase. The user's styles could be stored either locally on the user's computer, or remotely and accessed via a LAN, Wi-Fi, the Internet, etc. Each style will preferably be stored and named internally according to a simple naming scheme. For example, see Style A 400, where each style has a specific number of audio loops associated with that style. However, each of these loops (audio loop A 405, audio loop B 410, and audio loop C 415) need not be strictly associated with a single style. It is preferable and possible that an audio loop might be associated with multiple different styles (Style B 420).
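The following fragment sketches the Figure 4 arrangement in code (names are hypothetical): styles reference loops by name, and a single loop may belong to several styles at once:

```python
# Styles map to the audio loops they contain; audio_loop_B is shared.
styles = {
    "Style A": ["audio_loop_A", "audio_loop_B", "audio_loop_C"],
    "Style B": ["audio_loop_B", "audio_loop_D"],
}

def styles_using(loop_name):
    """All styles with which the named loop is associated."""
    return [s for s, loops in styles.items() if loop_name in loops]

print(styles_using("audio_loop_B"))  # -> ['Style A', 'Style B']
```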
- Figure 5 depicts a preferred data structure of one of the individual styles 500 of the instant invention. The data record associated with a style 500 will preferably contain information about the number of individual song parts that are available and selectable 510. Additionally, the number and type of individual instruments 520 that are part of the style will preferably be stored in the data structure of each individual style. Each style will preferably have a predefined tempo 530. However it should be noted that once the user selects the style and interacts with the user controls, the predefined tempo might be changed automatically and/or manually by the user. - Additionally in some embodiments each style will have a
predefined tone pitch 540 or key that can be modified by the user. Further, in an embodiment each style will contain icons 550 and animations 560 that represent the corresponding instrument and/or the particular style. These pictures and animations will preferably be displayed in the graphical user interface as is generally indicated in Figure 5 . In some embodiments, the icons / images will be animated so that the user will be able to see a representation of a musician playing the selected instrument. Another preferred data value that could be stored with each style is the name 570 of that style, which will be the name that is displayed in the graphical user interface for selection by the user.
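Collected into one record, the Figure 5 fields might look like the following sketch (a hypothetical layout; the disclosure does not prescribe field names or types):

```python
from dataclasses import dataclass, field

@dataclass
class Style:
    name: str                # 570: label shown in the GUI
    song_parts: list         # 510: available / selectable song parts
    instruments: list        # 520: instruments belonging to the style
    tempo: int               # 530: predefined tempo, user-changeable
    tone_pitch: str          # 540: predefined tone pitch / key
    icons: dict = field(default_factory=dict)       # 550: per-instrument images
    animations: dict = field(default_factory=dict)  # 560: playing-musician clips

rock = Style("Rock/Pop", ["intro", "verse", "chorus", "ending"],
             ["drums", "guitar", "bass", "keys"], tempo=120, tone_pitch="A")
print(rock.name, rock.tempo)
```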
- Figure 6 depicts a preferred data structure for use with the individual audio loops 600. In this embodiment, the audio loops, in addition to the audio material, might contain information about the styles 610 with which they have been associated. This might be a plurality of styles, or only one style. Furthermore, each audio loop has a specific predefined inherent tempo value 620, which is also stored in the data structure of the audio loop. Additionally, information about the usability of the audio loop as a part 630 (intro, variance 1, variance 2, ending, etc.) in the music generation process will be stored in this data structure. - Each audio loop will additionally, and preferably, be associated with information regarding the instrument 640 the loop was taken from or created by. In some embodiments, an important value in the data structure of the audio loops will be the information about the
harmony suitability 650 or compatibility of each clip with respect to the others. This quantity will indicate, in a general way, whether or not one audio loop is musically compatible with another. The harmony suitability could either be provided by the user and inserted into the data structure, or designed by the creator of the database based on a scale that indicates compatibility with another currently selected audio loop. Additionally, it might be possible for the instant invention to determine the harmony suitability by first analyzing the data values of a selected audio loop and then comparing those values to the data of another audio loop with respect to pitch, tempo, and note scale (e.g., blues, minor, rock, etc.).
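As a minimal sketch of such an analysis (the thresholds and field names are assumptions; the disclosure leaves the exact test open), two loops might be judged compatible when their keys and scales match and their tempi lie within a small tolerance:

```python
def harmony_suitable(loop_a, loop_b, tempo_tolerance=8):
    """Rough compatibility test between two analyzed loops."""
    return (loop_a["key"] == loop_b["key"]
            and loop_a["scale"] == loop_b["scale"]
            and abs(loop_a["tempo"] - loop_b["tempo"]) <= tempo_tolerance)

a = {"key": "A", "scale": "minor", "tempo": 120}
b = {"key": "A", "scale": "minor", "tempo": 124}
print(harmony_suitable(a, b))  # -> True
```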
- In one example, and as is indicated in Figure 8 , the user might be provided with the option to modify the filter effects in real time. In this example, the effects button 242 has been activated, which has brought a filter parameter window 810 to the forefront. In this figure, which would be most representative of the appearance of the instant invention while executing on a tablet computer, the user has been provided with the option of simultaneously adjusting two parameters by way of a stroke 820 across the face of the open window 810. In this instance, the user will be able to simultaneously modify two parameters, although it should also be clear that a single parameter (or three or more) could be similarly modified. In Figure 8 , the audio center frequency might be modified ('HIGH' to 'LOW') and/or the reverb ('HEAVY' to 'LIGHT'). In this example, the user has elected to increase the center frequency and increase the reverb by virtue of this single stroke 820. Clearly, other variations of this approach are possible and have been specifically contemplated by the instant inventors.
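A sketch of such a two-parameter stroke mapping follows (the axis assignment, parameter ranges, and window size are illustrative assumptions): each touch point along the stroke is converted into a centre-frequency value and a reverb amount simultaneously:

```python
def stroke_to_params(x, y, width, height):
    """Map one touch point in the filter window to two effect parameters."""
    freq = 200 + (x / width) * (8000 - 200)  # Hz, 'LOW' -> 'HIGH'
    reverb = y / height                      # 0.0 'LIGHT' -> 1.0 'HEAVY'
    return freq, reverb

# One point along the user's stroke across an 800 x 600 pixel window.
print(stroke_to_params(600, 450, 800, 600))  # -> (6050.0, 0.75)
```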
- Turning next to Figure 7 , according to one embodiment, the instant invention will be used in practice generally as indicated in this figure. As a first step in this embodiment, the jam mode will be activated (step 700). This will initiate the computer program that implements the instant invention on the device selected by the user, which might include a desktop computer, a laptop computer, a smart phone, a tablet computer, etc. Next, in this embodiment, the style, tempo, and musical key of the work will be selected (step 705). - Although the choice of instruments (
Figure 2 ) might be conventional orchestral instruments (e.g., brass, woodwinds, violins or other stringed instruments, drum or other percussion, etc.), in some embodiments the user-specified style will guide the selection of those instruments. Such a selection will automatically populate, in some embodiments, an instrument bank 270 with a selection of alternative instruments that are suitable for the selected style. For example, in the embodiment of Figure 2 , selection of the rock/pop style will populate the instrument bank 270 with sounds from a rock drum set, guitars, electric or acoustic keyboards, etc. In other embodiments, e.g., where jazz is selected as the style, the instrument bank 270 might include a vocal performance of scat lyrics. In still other embodiments, the selection of style might automatically populate the voice bank with horns, strings, woodwinds, etc., each of which will preferably have been selected to complement the style selected by the user. - As a specific example, if the user selects a 'salsa' style the associated instruments might include claves and/or drums (on percussion), acoustic guitar(s), bass, piano, flute, etc. The time signature will often be 12/8 or 6/8, but, in some embodiments it will be 4/4, 2/2, or 2/4.
- As another example, if the user selects a 'blues' style, the default instruments might be drums, guitar(s) (e.g., electric, slide, or acoustic), bass, harmonica, keyboard, fiddle (violin), etc. The time signature would most commonly be 4/4, but other choices are certainly possible. The form of the music could follow, by default, the standard 12 bar or 8 bar blues chord progressions, as those terms are known in the art. The solo instruments (e.g., lead guitar, harmonica, keyboard, etc.) would often be restricted to playing the standard blues scale (e.g., flatted third, fifth and seventh of the major scale), but other variations are certainly possible. The exact solo sequences that will be played in a given instance would be designed to follow the current state of the underlying chord progression and complement it. Those sequences might be provided by the instant invention using standard riffs or sequences, or determined in real time (e.g., randomly or according to some deterministic sequence) according to the desires of the user.
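The salsa and blues defaults just described might be captured in a per-style configuration such as the following (values are illustrative only, mirroring the examples above rather than an exhaustive catalogue):

```python
# Hypothetical per-style defaults; solo_scale lists the semitone offsets of
# the blues scale (root, flatted 3rd, 4th, flatted 5th, 5th, flatted 7th).
STYLE_DEFAULTS = {
    "salsa": {"instruments": ["claves", "drums", "acoustic guitar",
                              "bass", "piano", "flute"],
              "time_signatures": ["12/8", "6/8", "4/4", "2/2", "2/4"]},
    "blues": {"instruments": ["drums", "guitar", "bass", "harmonica",
                              "keyboard", "fiddle"],
              "time_signatures": ["4/4"],
              "default_form": "12-bar blues",
              "solo_scale": [0, 3, 5, 6, 7, 10]},
}
print(STYLE_DEFAULTS["blues"]["solo_scale"])
```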
- As a next
preferred step 710, the user will accept the default instrument bank 270 or select alternative instruments or vocals. That is, in some embodiments, the user might prefer, by way of example only, a distorted guitar sound as opposed to an acoustic guitar sound. In such an instance, the user will be able to either accept the instruments/vocals offered or replace any of those with another instrument or vocal sound. - Next, and preferably, the user will activate a first instrument (step 715) which will cause it to begin playing according to the parameters previously selected by the user. As is indicated in
Figure 2 , the activated instruments will be indicated in the graphical user interface. - Upon activation of the first instrument, the instant invention will preferably automatically begin a performance of that instrument according to the selected style, at the selected tempo, and in the selected key or according to a selected pitch (step 725), etc. In an embodiment, the current state of the work will be communicated to the user as it is generated via a local speaker or, if the user is wearing headphones, to the user's headphones. This first instrument that is selected will provide the foundation for what is to follow. In many cases, the first instrument that is activated will be the drums, but that is not a requirement.
- Next, the instant invention will, in some embodiments, begin to continuously record the performance (step 730). Of course, at the first iteration, there will typically be a single instrument that is playing but, subsequently, more instruments will be added as is discussed below. In some embodiments, for example, if the user has elected automatic generation of a music work, the recording will be initiated using all appropriate instruments. In that case, the user will select/deselect and modify the instruments that are currently playing as described below.
- Returning to the example of
Figure 7 , in an embodiment, the starting and stopping of the recording will be under the control of the user and the recording may not commence until after the user has added several instruments to the mix, at which time it will begin automatically. In other cases, no recording will be performed until the user specifically selects that option (e.g., via the on-screen 'Record' button 295 in Figure 2 ). - Next, if the user has made a change in the settings (decision item 735), the instant invention will preferably follow the 'YES' branch. On the other hand, if no change has been made in the settings (i.e., the 'NO' branch of decision item 735), the instant invention will preferably branch to step 725 and thereby continue to perform and, optionally, record the performance according to
step 730. - If there has been a change in the settings per the 'YES' branch of
decision item 735, in some embodiments, a test will be performed to determine whether or not the change that has been made is an indication that the user wishes to stop the music generation process (decision item 738). If the user has decided to end the music creation process (i.e., the 'YES' branch of decision item 738), the instant invention will preferably write whatever recording was made to nonvolatile computer storage / computer readable media (e.g., magnetic disk, flash memory, CD, DVD, etc.) and stop the music creation process (step 750). - On the other hand, if there is no indication that the user wishes to stop (the 'NO' branch of decision item 738), the user will be given an opportunity to make changes in the performance environment (step 740). In this instance, and according to some embodiments, the user will be given the option of replacing any or all of the items in the
instrument bank 270, adjusting the relative volume of each instrument in the mix, adjusting the tempo of the entire work, changing the style of the entire musical work, its key, time signature, etc. In short, the user will be given the option of modifying any performance-related parameter in real time while the performance and/or recording is taking place (step 740). As is discussed in greater detail below, changed parameters will be instantly reflected in the audio output of the work, where 'instantly' should be understood to mean that the changes will be performed at the next moment when it would make musical sense to do so (e.g., at the next beat, at the start of the next measure, etc.), i.e., at a moment that is musically synchronous with the instruments/style/tempo that are already in the performance. Said another way, in an embodiment, any change that the user makes in a performance variable will be immediately reflected in the audio output of the performance. - Next, and preferably, the change or changes indicated by the user will be implemented in the musical performance (step 745) in such a way as to harmoniously combine the additional, changed, or removed (muted) instruments/vocals with the current performance. This might mean, by way of example, matching the tempo of the current performance, its key, etc. Preferably, the new instrument will not enter the mix until the start of the next full measure of the performance, but that is not a requirement. Of course, it is not a requirement that the modifications be introduced only at measure boundaries and, in some embodiments, the next beat (or 1/8th note, 1/16th note, etc.) could be used as an entry point. Any point of entry might be used so long as the entry is at a musically synchronous moment so as to create a harmonious combined work. Those of ordinary skill in the art will readily be able to devise other methods of smoothly adding new instruments to the output mix.
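For illustration, the next musically synchronous entry time might be computed as in the following sketch (the function name and the fixed-tempo assumption are mine, not the disclosure's):

```python
def next_entry_time(now_s, tempo_bpm, beats_per_bar=4, subdivision="bar"):
    """Earliest bar, beat, or eighth-note boundary after time now_s."""
    beat = 60.0 / tempo_bpm
    grid = {"bar": beats_per_bar * beat,
            "beat": beat,
            "eighth": beat / 2}[subdivision]
    return (int(now_s / grid) + 1) * grid

# At 120 BPM in 4/4, a change requested at t = 3.1 s waits for the bar
# boundary at 4.0 s, but only until 3.25 s if eighth-note entries are allowed.
print(next_entry_time(3.1, 120, subdivision="bar"))     # -> 4.0
print(next_entry_time(3.1, 120, subdivision="eighth"))  # -> 3.25
```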
- In summary, the instant invention provides a highly creative work method for both novice and professional users when generating music or simply enjoying music and the music generation process. The instant invention will adapt to the knowledge and professionalism of the user, providing individual options and features for selection that either add complexity (for professionals) or minimize complexity (for novice users).
- Of course, many modifications and extensions could be made to the instant invention by those of ordinary skill in the art. For example, in one preferred embodiment the instant invention will enter a random creativity mode, wherein the method will automatically change and replace loops, instruments and pitch. This mode will be entered when the user interaction level is very low. In another preferred embodiment the user can link an instrument with the instant invention and the method will 'listen' to the input from the instrument of the user and will accordingly select song parts, instruments and pitch to therewith generate a music work.
- Thus, the present invention is well adapted to carry out the objects and attain the ends and advantages mentioned above as well as those inherent therein.
Claims (12)
- A method for automatically composing musical works with intuitive user interaction, comprising the steps of:
a. accessing an audio loop database containing a plurality of audio loops (600), each of said plurality of audio loops (600) being associated with at least one of a music style (610), a harmony suitability value (650), an information about the usability of the audio loop as a song part (630), a tempo value (620) and an information regarding the at least one instrument (640) the loop was taken from or created by;
b. requiring a user to select a music style (210, 310, 705), said music style (210) being associated with a specific number of audio loops (600), a plurality of instruments (640), a plurality of song parts (240, 340), a tempo value (220, 320), a predefined tone pitch value (250, 330) related to the instruments and key related to the overall music score, a plurality of icons, or images (550), animations (560) and a name (570);
c. requiring the user to select a tempo value (220, 320) and a tone pitch value (250, 330), thereby determining a tempo value of the musical work and a tone pitch value of the musical work;
d. assembling an instrument bank (270), wherein said instrument bank (270) is comprised of one or more of said plurality of audio loops (600), each of said assembled one or more audio loops (600) having at least one of said at least one associated music style (310) compatible with said selected music style (310);
e. presenting to the user on a display device a representation of one or more instruments (260) with said selected music style, a representation of a plurality of individual sequential song parts (240), and a representation of a pitch value (250) and tempo (220);
f. requiring the user to select from said displayed representation of said instruments at least one instrument, therewith selecting from said instrument bank (270) an initial audio loop from among said associated one or more audio loops (600);
g. algorithmically modifying said selected initial audio loop (600) at least according to said tempo value (220, 320) and said key, thereby producing a modified initial audio loop (600);
h. initiating a performance (725) of said selected modified initial audio loop (600) in said selected music style (210, 310);
i. creating an audible representation of said performance of said selected modified initial audio loop (600) for the user;
j. requiring (735) the user to select from said displayed representation of instruments (260, 350) at least one of another instrument and another song part (240, 340) and a tone pitch value (250, 330) and a tempo value (220, 320) and a style (210, 310), therewith selecting from said instrument bank (270) another audio loop (600), wherein said selection of said another loop (600) is depending on the harmony suitability value (650), said information about the usability of the audio loop as a song part (240, 340) and the tempo value (220, 320) associated with each audio loop (600);
k. algorithmically modifying (740, 745) said selected another audio loop (600) at least according to said tempo value (220, 320) or said instrument selection, or said song part selection (240, 340) or said tone pitch value (250, 330);
l. adding said modified different audio loop (600) to said performance of said selected modified initial audio loop (600) in real time during said performance of said selected modified initial audio loop (600) to create a combined performance, wherein said modified different audio loop (600) is added at the next bar, musically synchronous with said performance of said modified initial audio loop (600) and corresponding with said information about the usability as a song part (630);
m. creating (725) an audible representation of said combined performance for the user;
n. performing at least steps (j) (725) through (l) (745) a plurality of times for a plurality of different instruments (640), therewith automatically composing musical works.
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, wherein said audio loops (600) are selected from a group consisting of a horn, a guitar, a vocal performance, a drum, a harmonica, and a violin.
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, further comprising the step of:
n. storing at least a portion of said composed musical works on a computer readable medium.
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, wherein said harmony suitability value (650) indicates whether or not one audio loop (600) is musically compatible with another.
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, wherein said harmony suitability value (650) is either provided by the user, designed by the creator of the data structure, or determined by analysis of selected audio loops (600).
- A method for automatically composing musical works with intuitive user interaction according to Claim 5, wherein said analysis determines musical values of selected audio loops (600) and compares these determined musical values to determined musical values of other audio loops (600).
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, wherein said information about the usability of the audio loop (600) as a part (630) defines the usability of each audio loop (600) for individual parts of said musical works.
- A method for automatically composing musical works with intuitive user interaction according to Claim 7, wherein said individual parts of said musical works are selected from a group consisting of intro, variance and ending.
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, wherein said selection of another instrument (640), a tempo value (530) and a predefined tone pitch value (540) is carried out automatically after selection of an automatic mode.
- A method for automatically composing musical works with intuitive user interaction according to Claim 1, wherein said intuitive user interaction is based on an adjustable user interaction level.
- An apparatus configured to perform all the steps of the method of any of the previous claims.
- A computer program adapted to perform all the steps of the method of any of the previous claims.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/720,552 US10496250B2 (en) | 2011-12-19 | 2012-12-19 | System and method for implementing an intelligent automatic music jam session |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2793222A1 (en) | 2014-10-22
EP2793222B1 (en) | 2018-06-06
Family
ID=49999665
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP13197613.6A | Method for implementing an automatic music jam session | 2012-12-19 | 2013-12-17
Country Status (1)
Country | Link |
---|---|
EP (1) | EP2793222B1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230360619A1 (en) * | 2022-05-05 | 2023-11-09 | Lemon Inc. | Approach to automatic music remix based on style templates |
USD1071957S1 (en) | 2022-12-07 | 2025-04-22 | Hyph Ireland Limited | Display screen with graphical user interface |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20240194173A1 (en) * | 2022-12-07 | 2024-06-13 | Hyph Ireland Limited | Method, system and computer program for generating an audio output file |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050132293A1 (en) * | 2003-12-10 | 2005-06-16 | Magix Ag | System and method of multimedia content editing |
JP4626376B2 (en) * | 2005-04-25 | 2011-02-09 | ソニー株式会社 | Music content playback apparatus and music content playback method |
US20100322042A1 (en) * | 2009-06-01 | 2010-12-23 | Music Mastermind, LLC | System and Method for Generating Musical Tracks Within a Continuously Looping Recording Session |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5877445A (en) * | 1995-09-22 | 1999-03-02 | Sonic Desktop Software | System for generating prescribed duration audio and/or video sequences |
US20040089141A1 (en) * | 2002-11-12 | 2004-05-13 | Alain Georges | Systems and methods for creating, modifying, interacting with and playing musical compositions |
EP1666967A1 (en) * | 2004-12-03 | 2006-06-07 | Magix AG | System and method of creating an emotional controlled soundtrack |
US20070261537A1 (en) * | 2006-05-12 | 2007-11-15 | Nokia Corporation | Creating and sharing variations of a music file |
US20120297959A1 (en) * | 2009-06-01 | 2012-11-29 | Matt Serletic | System and Method for Applying a Chain of Effects to a Musical Composition |
US20120312145A1 (en) * | 2011-06-09 | 2012-12-13 | Ujam Inc. | Music composition automation including song structure |
Non-Patent Citations (3)
Title |
---|
"E-35 MIDI Intelligent Synthesizer Owners's Manual", 1 January 1991 (1991-01-01), Germany, pages 1 - 70, XP055134872, Retrieved from the Internet <URL:http://media.rolandus.com/manuals/E-60_50_OM.pdf> [retrieved on 20140814] * |
DUPONT S ET AL: "AudioCycle: Browsing Musical Loop Libraries", CONTENT-BASED MULTIMEDIA INDEXING, 2009. CBMI '09. SEVENTH INTERNATIONAL WORKSHOP ON, IEEE, PISCATAWAY, NJ, USA, 3 June 2009 (2009-06-03), pages 73 - 80, XP031481665, ISBN: 978-1-4244-4265-2 * |
FLORENT BERTHAUT ET AL: "Advanced Synchronization of Audio or Symbolic Musical Patterns: An Algebraic Approach", SEMANTIC COMPUTING (ICSC), 2012 IEEE SIXTH INTERNATIONAL CONFERENCE ON, IEEE, 19 September 2012 (2012-09-19), pages 302 - 309, XP032265400, ISBN: 978-1-4673-4433-3, DOI: 10.1109/ICSC.2012.11 * |
Also Published As
Publication number | Publication date |
---|---|
EP2793222A1 (en) | 2014-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10496250B2 (en) | System and method for implementing an intelligent automatic music jam session | |
US11776518B2 (en) | Automated music composition and generation system employing virtual musical instrument libraries for producing notes contained in the digital pieces of automatically composed music | |
US11314936B2 (en) | System and method for assembling a recorded composition | |
CN111971740B (en) | Method and system for generating audio or MIDI output files using harmonic chord diagrams | |
US20120014673A1 (en) | Video and audio content system | |
CN106708894B (en) | Method and device for configuring background music for electronic book | |
US20050144016A1 (en) | Method, software and apparatus for creating audio compositions | |
Manzo et al. | Interactive composition: Strategies using Ableton live and max for live | |
GB2602118A (en) | Generating and mixing audio arrangements | |
EP2793222B1 (en) | Method for implementing an automatic music jam session | |
WO2025120371A4 (en) | Digital music composition, performance and production studio system network and methods | |
WO2005057821A2 (en) | Method, software and apparatus for creating audio compositions | |
US9905208B1 (en) | System and method for automatically forming a master digital audio track | |
Rando et al. | How do Digital Audio Workstations influence the way musicians make and record music? | |
Diaz | Analysis of Sampling Techniques by J Dilla in Donuts | |
Alspach | Electronic Music Subgenres for Music Providers | |
Nahmani | Logic Pro-Apple Pro Training Series: Professional Music Production | |
US20240194170A1 (en) | User interface apparatus, method and computer program for composing an audio output file | |
US20240153475A1 (en) | Music management services | |
SHI et al. | The Acceptance of the Fairlight CMI in Japan and its Influence on the Japanese Music Scene | |
Augspurger | Transience: an album-length recording for solo percussion and electronics | |
Souvignier | Loops and grooves: The musician's guide to groove machines and loop sequencers | |
Friedman | FL Studio Cookbook | |
Miller | “The Sound of Silence”: A Comparative Analysis of the Recordings by Simon and Garfunkel and Disturbed | |
Smith | A New Foundation of Music: Sampling and Its Impact on the Creative Process |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAB | Information related to the publication of an a document modified or deleted |
Free format text: ORIGINAL CODE: 0009199EPPU |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20131217 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
R17P | Request for examination filed (corrected) |
Effective date: 20150420 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
17Q | First examination report despatched |
Effective date: 20150625 |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: MAGIX GMBH & CO. KGAA |
|
RAP1 | Party data changed (applicant data changed or rights of an application transferred) |
Owner name: BELLEVUE INVESTMENTS GMBH & CO. KGAA |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/40 20060101ALN20161026BHEP Ipc: G10H 1/00 20060101AFI20161026BHEP Ipc: G10H 1/36 20060101ALN20161026BHEP |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/00 20060101AFI20161110BHEP Ipc: G10H 1/40 20060101ALN20161110BHEP Ipc: G10H 1/38 20060101ALN20161110BHEP Ipc: G10H 1/36 20060101ALN20161110BHEP |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/36 20060101ALN20170606BHEP Ipc: G10H 1/40 20060101ALN20170606BHEP Ipc: G10H 1/38 20060101ALN20170606BHEP Ipc: G10H 1/00 20060101AFI20170606BHEP |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/40 20060101ALN20171010BHEP Ipc: G10H 1/00 20060101AFI20171010BHEP Ipc: G10H 1/38 20060101ALN20171010BHEP Ipc: G10H 1/36 20060101ALN20171010BHEP |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/38 20060101ALN20171113BHEP Ipc: G10H 1/40 20060101ALN20171113BHEP Ipc: G10H 1/36 20060101ALN20171113BHEP Ipc: G10H 1/00 20060101AFI20171113BHEP |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/00 20060101AFI20171130BHEP Ipc: G10H 1/40 20060101ALN20171130BHEP Ipc: G10H 1/36 20060101ALN20171130BHEP Ipc: G10H 1/38 20060101ALN20171130BHEP |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R079 Ref document number: 602013038513 Country of ref document: DE Free format text: PREVIOUS MAIN CLASS: G06F0017300000 Ipc: G10H0001000000 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/00 20060101AFI20171219BHEP Ipc: G10H 1/38 20060101ALN20171219BHEP Ipc: G10H 1/36 20060101ALN20171219BHEP Ipc: G10H 1/40 20060101ALN20171219BHEP |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20180126 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: G10H 1/40 20060101ALN20180118BHEP Ipc: G10H 1/00 20060101AFI20180118BHEP Ipc: G10H 1/38 20060101ALN20180118BHEP Ipc: G10H 1/36 20060101ALN20180118BHEP |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP Ref country code: AT Ref legal event code: REF Ref document number: 1006928 Country of ref document: AT Kind code of ref document: T Effective date: 20180615 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 602013038513 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180906 Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180906 |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: FP |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180907 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MK05 Ref document number: 1006928 Country of ref document: AT Kind code of ref document: T Effective date: 20180606 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20181006 Ref country code: AT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: IT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 602013038513 Country of ref document: DE |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
26N | No opposition filed |
Effective date: 20190307 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: LU Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20181217 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 |
|
REG | Reference to a national code |
Ref country code: BE Ref legal event code: MM Effective date: 20181231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20181231 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20181217 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20180606 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20131217 Ref country code: MK Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20180606 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: NL Payment date: 20231219 Year of fee payment: 11 Ref country code: FR Payment date: 20231025 Year of fee payment: 11 Ref country code: DE Payment date: 20231219 Year of fee payment: 11 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IE Payment date: 20240307 Year of fee payment: 11 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20240110 Year of fee payment: 11 Ref country code: CH Payment date: 20240101 Year of fee payment: 11 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 602013038513 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MM Effective date: 20250101 |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20241217 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20250101 |