
GB2460496A - Music device with contact sensitive sound creation regions - Google Patents


Info

Publication number
GB2460496A
GB2460496A (Application GB0810323A)
Authority
GB
United Kingdom
Prior art keywords
user
arrangement
sound creation
user interface
sound
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB0810323A
Other versions
GB0810323D0 (en)
Inventor
Mark John Lee Percival
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
INTELLIGENT MAT SYSTEMS Ltd
Original Assignee
INTELLIGENT MAT SYSTEMS Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by INTELLIGENT MAT SYSTEMS Ltd filed Critical INTELLIGENT MAT SYSTEMS Ltd
Priority to GB0810323A
Publication of GB0810323D0
Publication of GB2460496A
Status: Withdrawn

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • G10H 1/0008 Associated control or indicating means
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 15/00 Teaching music
    • G09B 15/02 Boards or like means for providing an indication of notes
    • G09B 15/04 Boards or like means for providing an indication of notes with sound emitters
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 1/00 Details of electrophonic musical instruments
    • G10H 1/32 Constructional details
    • G10H 1/34 Switch arrangements, e.g. keyboards or mechanical switches specially adapted for electrophonic musical instruments
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H 2220/096 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith using a touch screen
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2220/00 Input/output interfacing specifically adapted for electrophonic musical tools or instruments
    • G10H 2220/091 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith
    • G10H 2220/101 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical creation, edition or control of musical data or parameters
    • G10H 2220/121 Graphical user interface [GUI] specifically adapted for electrophonic musical instruments, e.g. interactive musical displays, musical instrument icons or menus; Details of user interactions therewith for graphical editing of a musical score, staff or tablature
    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10H ELECTROPHONIC MUSICAL INSTRUMENTS; INSTRUMENTS IN WHICH THE TONES ARE GENERATED BY ELECTROMECHANICAL MEANS OR ELECTRONIC GENERATORS, OR IN WHICH THE TONES ARE SYNTHESISED FROM A DATA STORE
    • G10H 2240/00 Data organisation or data communication aspects, specifically adapted for electrophonic musical tools or instruments
    • G10H 2240/171 Transmission of musical instrument data, control or status information; Transmission, remote access or control of music data for electrophonic musical instruments
    • G10H 2240/281 Protocol or standard connector for transmission of analog or digital data to or from an electrophonic musical instrument
    • G10H 2240/285 USB, i.e. either using a USB plug as power supply or using the USB protocol to exchange data

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Acoustics & Sound (AREA)
  • Business, Economics & Management (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Electrophonic Musical Instruments (AREA)

Abstract

A music device 10, for example a music teaching aid, comprises a user interface device 12 such as a floor mat which includes a plurality of sound creation regions 54 each associated with at least one sound. Each sound creation region 54 includes a sensor arrangement (53) for sensing when said region 54 is contacted by a user, preferably comprising: two conductive components, separated by an insulator having apertures to allow electrical connection of the conductive components upon application of pressure. A processing arrangement 16 is interactively associated with the user interface device 12, and is capable of being associated with a speaker arrangement 14 to cause the speaker arrangement 14 to emit a sound associated with the sound creation region 54 contacted by the user. The processing arrangement 16 may be configured to provide instructions to the user via a display arrangement 30 such as a television screen or computer monitor.

Description

INTELLECTUAL PROPERTY OFFICE
Application No. GB0810323.6 RTM Date: 24 September 2009
The following terms are registered trademarks and should be read as such wherever they occur in this document: Promethean, Bluetooth, Windows. The Intellectual Property Office is an operating name of the Patent Office. www.ipo.gov.uk

Music Device

This invention relates to music devices. More particularly, but not exclusively, this invention relates to music devices incorporating data processing arrangements.
Embodiments of the invention relate to music teaching devices, for example for teaching users who may be children.
Computerised systems are available that can be used to teach music. However, very few systems are available that allow active and interactive learning.
According to a first aspect of this invention, there is provided a music device comprising: a user interface device comprising a plurality of sound creation regions, each sound creation region having at least one sound associated therewith, and each sound creation region comprising a sensor arrangement for sensing when said region is contacted by a user; a processing arrangement interactively associated with the user interface device and with a speaker arrangement to cause the speaker arrangement to emit a sound associated with the sound creation region contacted by the user.
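The first aspect can be illustrated with a minimal sketch: each sound creation region has a sound associated with it, and contact with a region causes the processing arrangement to emit that sound via the speaker arrangement. The patent does not prescribe an implementation; all names, region identifiers, and note labels below are illustrative only.

```python
# Illustrative mapping from sound creation regions to their associated
# sounds. The region IDs and note names here are assumptions, not from
# the patent; the stave lines/spaces simply suggest plausible entries.
REGION_SOUNDS = {
    "line_38": "E4",        # bottom stave line (treble clef mode)
    "space_39": "F4",
    "line_40": "G4",
    "percussion_56a": "drum",
}

def on_region_contacted(region_id, emit):
    """Called when a sensor arrangement reports contact with a region.

    `emit` stands in for whatever drives the speaker arrangement; the
    sound associated with the contacted region (if any) is emitted.
    """
    sound = REGION_SOUNDS.get(region_id)
    if sound is not None:
        emit(sound)
    return sound
```

A real device would replace `emit` with a call into its audio output; here it can be any callable, which also keeps the sketch testable.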
A display arrangement may be provided to display the instructions to the user. The display arrangement may comprise a computer monitor, television screen, smart board, Promethean board, film screen, or other suitable display arrangement. The processing arrangement may be configured to provide instructions to the user via the display arrangement. Alternatively, or in addition, the processing arrangement may be configured to provide instructions to the user verbally via the speaker arrangement.
The processing arrangement may be configured to provide instructions to the user to contact a selected one or more of said sound creation regions. Alternatively, or in addition, the processing arrangement may be configured to record the musical notes corresponding to the sounds associated with the sound creation regions contacted by the user. The recording of the aforesaid sounds may be such that the musical notes are displayed on the display arrangement by way of standard music notation.
The music device may include a processing arrangement. The processing arrangement may comprise a data processing arrangement, which may be an electronic data processing arrangement, such as a computer console. The processing arrangement may comprise means for storing data relating to the sound or sounds associated with each sound creation region.
The processing arrangement may be configured to determine whether the sound creation region contacted by the user corresponds to the selected sound creation region the processing arrangement has instructed the user to contact. The processing arrangement may be configured to provide messages via the display arrangement and/or the speaker arrangement reporting upon the progress of the user.
Thus, in one embodiment, where the user contacts the selected sound creation region, the processing arrangement may provide a message on the display arrangement and/or emitted by the speaker arrangement to the effect that the selected sound creation region has been contacted. Also in this embodiment, where the user contacts a sound creation region other than the selected sound creation region, the processing arrangement may provide a message on the display arrangement and/or emitted by the speaker arrangement to the effect that the selected sound creation region has not been contacted.
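The progress feedback described above reduces to a simple comparison between the selected region and the contacted region, with a message routed to the display and/or speaker arrangement. A minimal sketch (the message wording and function name are assumptions, not from the patent):

```python
# Sketch of the feedback step: compare the region the user contacted
# with the region the program instructed them to contact, and produce
# a progress message for the display and/or speaker arrangement.
def feedback(selected_region, contacted_region):
    if contacted_region == selected_region:
        return "Well done: you contacted the right region!"
    return "Not quite: that was not the selected region. Try again."
```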
The processing arrangement may comprise a data processing unit, a memory, and suitable operating software, which may be installed on a memory device, such as a hard drive or flash memory. The data processing unit may be a CPU.
In one embodiment, the processing arrangement may include connection means to connect the user interface device to the processing arrangement. The connection of the user interface device to the processing arrangement may be by suitable cabling, in which case the cabling may be inserted into suitable ports, such as USB ports, in the user interface device and the processing arrangement. Alternatively, the connector may be a wireless connector, such as Bluetooth, ZigBee or Wi-Fi.
According to a second aspect of this invention, there is provided a user interface device for use with a music device, the user interface device comprising a plurality of sound creation regions, and each sound creation region comprises a sensor arrangement for sensing when said region is contacted by a user.
Each sound creation region may have at least one sound associated therewith.
The user interface device according to the first or second aspects of the invention may, in one embodiment, comprise a substantially flat surface. The sound creation regions may be defined or indicated by substantially flat surface pressure sensitive regions. In one embodiment, the pressure sensitive regions may be sensitive to pressure when pressed by, for example, the user's foot or hand. It will be appreciated that the pressure sensitive regions can be sensitive to pressure applied by any part of the user's body.
In one embodiment, the user interface device may be separate from the processing arrangement, and may comprise a mat upon which the user can step. In this embodiment, the sound creation regions are configured to sense when the user steps thereon.
The mat may be of a suitable size to be arranged on a floor, as a floor mat, so that the user can step on selected regions, or jump from one selected region to another.
Desirably, the mat has a dimension that exceeds 1 metre.
Each sensor arrangement may comprise first and second electrically conductive components, and an electrically insulating component between the first and second electrically conductive components. The electrically insulating component may define at least one aperture to allow electrical connection between the first and second electrically conductive components when said sound creation region is contacted. In one embodiment, the electrically insulating component may define a plurality of apertures.
The first and second electrically conductive components may comprise any suitable components that can conduct electricity, for example a metallic foil, or a liquid conductor. The metallic foil may be any suitable metallic foil, such as aluminium foil and/or tin foil. The electrically insulating component may comprise an electrically insulating, resilient plastics material, such as a foam material.
The first and second electrically conductive components may be in the form of first and second electrically conductive layers. The electrically insulating component may comprise an electrically insulating layer, between said first and second layers.
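The layered sensor arrangement described above behaves as a momentary switch: pressure pushes the two conductive layers together through the apertures in the insulating layer, closing an electrical circuit. A processing arrangement could detect contact by polling each region's circuit, sketched below with a hypothetical `read_circuit` callable standing in for whatever input hardware is actually used:

```python
# The sandwich sensor (conductor / apertured insulator / conductor)
# closes a circuit only while the region is being pressed. This sketch
# polls every region and reports which circuits are currently closed.
# `read_circuit(region_id) -> bool` is an assumed hardware interface.
def scan_regions(region_ids, read_circuit):
    """Return the IDs of regions whose circuit is currently closed."""
    return [r for r in region_ids if read_circuit(r)]
```

In use, the processing arrangement would call `scan_regions` repeatedly and treat a newly closed circuit as a contact event.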
The user interface device may comprise a cushioning member to provide cushioning for the user when the user contacts the sound creation regions. The cushioning member may comprise a foam material.
In one embodiment, the sound creation regions may be arranged on top of the cushioning member.
The user interface device may comprise a cover member which may extend over the sound creation regions. The cover member may comprise a sleeve that desirably extends over the sensor arrangement and may extend around the cushioning member.
In another embodiment, the user interface device may comprise a pad, which may be suitable to be arranged on a support surface, such as a table. The pad may have a dimension that is less than 1 metre. Conveniently, the pad may be generally of A4 size. In this embodiment, the processing arrangement and the user interface device may constitute a single unit. The processing arrangement and the user interface device may be held by a casing. The speaker arrangement may be held in the casing.
If desired, the music device may include connection facilities, such as connection ports or wireless connectivity to allow the music device to be connected to an external device, such as a display arrangement or another processing arrangement, for example a computer.
The user interface device may comprise control regions for controlling the music playing system. Each control region may comprise a sensor arrangement, which may be substantially the same as the sensor arrangement described above.
In one embodiment, the user interface device may comprise a touch sensitive arrangement for the sound creation regions and the control regions. The touch sensitive arrangement may comprise a touch sensitive screen.
The processing arrangement may be configured to effect running control of the music device, for example by selectively stopping, pausing, fast forwarding, rewinding, and playing the music device.
According to a third aspect of this invention, there is provided a method of using a music device comprising: providing a user interface device comprising a plurality of sound creation regions, each sound creation region having a sound associated therewith, and each sound creation region comprising a sensor arrangement for sensing when said region is contacted by a user; interactively associating a processing arrangement with the user interface device and a speaker arrangement, whereby when one of the sound creation regions is contacted by the user, the processing arrangement causes the speaker arrangement to emit the sound associated with the sound creation region contacted by the user.
The method may comprise providing instructions to the user via the processing arrangement to contact a selected one or more of said sound creation regions.
Alternatively, or in addition, the method may comprise recording musical notes corresponding to sounds associated with selected sound creation regions contacted by the user. The method may comprise recording the aforesaid notes such that the notes can be displayed on the display arrangement by way of standard music notation.
The instructions may comprise instructing the user to contact a plurality of said sound creation regions in a selected sequence. The instructions may further comprise instructing the user to contact each selected sound creation region for a respective selected approximate period of time. Thus, in one embodiment, the processing arrangement may instruct the user to play a tune.
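A tune instruction as described above amounts to a sequence of (region, approximate hold time) pairs, and checking the user's performance only requires approximate timing. A sketch under those assumptions (the tolerance value and names are illustrative):

```python
# Sketch of checking a played tune against the instructed one: each
# entry is a (region_id, seconds_held) pair, and timing is compared
# only approximately, since the patent asks for a "selected approximate
# period of time". The 0.5 s tolerance is an assumed default.
def tune_followed(instructed, performed, tolerance=0.5):
    """Return True if `performed` matches `instructed` region-for-region
    with each hold time within `tolerance` seconds."""
    if len(instructed) != len(performed):
        return False
    return all(
        r1 == r2 and abs(t1 - t2) <= tolerance
        for (r1, t1), (r2, t2) in zip(instructed, performed)
    )
```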
The method may comprise storing data in the processing arrangement relating to the sound associated with each sound creation region.
The method may further comprise connecting a display arrangement to the processing arrangement to display the instructions to the user. The display arrangement may comprise a computer monitor, television screen, smart board, Promethean board, film screen, or other suitable display arrangement. The method may comprise providing instructions to the user via the display arrangement.
Alternatively, or in addition, the method may comprise providing instructions to the user verbally via the speaker arrangement.
The method may comprise determining whether the sound creation region contacted by the user corresponds to the selected sound creation region the processing arrangement has instructed the user to contact. The method may comprise providing messages via the display arrangement and/or the speaker arrangement reporting upon the progress of the user.
Thus, in one embodiment, where the user contacts the selected sound creation region, the method may comprise providing a message on the display arrangement and/or emitted by the speaker arrangement to the effect that the selected sound creation region has been contacted. Also in this embodiment, where the user contacts a sound creation region other than the selected sound creation region, the method may comprise providing a message on the display arrangement and/or emitted by the speaker arrangement to the effect that the selected sound creation region has not been contacted.
The user interface device may comprise any substantially flat surface, and the sound creation regions may comprise pressure sensitive regions. In one embodiment, the method may comprise sensing pressure when the pressure sensitive region is pressed by, for example, the user's foot or hand.
In one embodiment, the user interface device may comprise a mat and the method may comprise instructing the user to step on the selected sound creation region on the mat. In this embodiment, the method may comprise sensing that pressure is applied to a sound creation region when the user steps thereon.
Each sensor arrangement may comprise first and second electrically conductive components, and an electrically insulating component between the first and second electrically conductive components. The electrically insulating component may define at least one aperture to allow electrical connection between the first and second electrically conductive components when said sound creation region is contacted. The method may comprise electrically connecting the first and second electrically conductive components to one another through the, or some of the, apertures. In one embodiment, the electrically insulating component defines a plurality of apertures.
The first and second electrically conductive components may comprise any suitable components that can conduct electricity, for example a metallic foil or a liquid conductor. The metallic foil may be any suitable metallic foil, such as aluminium foil or tin foil. The electrically insulating component may comprise a plastics material, such as a foam material.
The first and second electrically conductive components may comprise first and second electrically conductive layers. The electrically insulating component may comprise an electrically insulating layer.
The user interface device may comprise a cushioning member to provide cushioning for the user when the user steps on the user interface device. The cushioning member may comprise a foam material.
In one embodiment, the sound creation regions may be arranged on top of the cushioning member.
The user interface device may comprise a cover member which may extend over the sound creation regions. The cover member may comprise a sleeve that desirably extends around the sound creation regions and the cushioning member.
In another embodiment, the user interface device may comprise a pad, which may be suitable to be arranged on a support surface, such as a table. The pad may have a dimension that is less than 1 metre. Conveniently, the pad may be generally of A4 size. In this embodiment, the processing arrangement and the user interface device may constitute a single unit. The processing arrangement and the user interface device may be held by a casing. The speaker arrangement may be held in the casing.
If desired, the music device may include connection facilities, such as connection ports or wireless connectivity to allow the music device to be connected to an external device, such as a display arrangement or another processing arrangement, for example a computer.
The user interface device may comprise control regions for controlling the music device. Each control region may comprise a sensor arrangement, which may be as described above.
In one embodiment, the user interface device may comprise a touch sensitive arrangement for the sound creation regions and the control regions. The touch sensitive arrangement may comprise a touch sensitive screen.
The processing arrangement may be configured to effect running control of the music device, for example by selectively stopping, pausing, fast forwarding, rewinding, and playing the music device. The method may comprise effecting said running control of the music device via the control regions on the user interface device.
Embodiments of the invention will now be described by way of example only, with reference to the accompanying drawings, in which:
Figure 1 is a diagrammatic view of an embodiment of a music device;
Figure 2 is a diagrammatic view of another embodiment of a music device;
Figure 3 is a part sectional top plan view of a contact region of a user interface device;
Figure 4 is a view along the lines IV-IV in Figure 3;
Figure 5 is a sectional side view of the user interface device;
Figure 6 is a top view of another embodiment of a music device;
Figure 7 is a top view of the music device shown in Figure 6, but without the markings; and
Figure 8 is a sectional view from above of the music device shown in Figures 6 and 7 showing the internal components.
Figure 1 shows an embodiment of a music device 10, comprising a user interface device 12, a speaker arrangement 14 and a data processing arrangement 16. The embodiment shown in Figure 1 is suitable for teaching music to children or other users.
In the embodiment shown in Figure 1, the user interface device 12 comprises a mat 18 upon which the user can step or jump. The construction of the user interface device 12 is described in full below.
The data processing arrangement 16 comprises a console 20, which comprises: a processing unit, in the form of a CPU; a memory in the form of a RAM; a hard disk upon which suitable operating software is installed, such as Windows (TM) XP. In the embodiment shown, the console 20 is external to, and separate from the user interface device 12.
The processing arrangement 16 is provided with suitable input sockets, which may be in the form of USB sockets 22. Data storage devices, in the form of first and second USB memory sticks 24 and 26, are provided for insertion into the respective USB sockets 22.
The first USB memory stick 24 is provided with the computer programs for running the music device 10. The second USB memory stick 26 is provided to store information about the progress of each of the users or participants using the music device 10.
A hand held control device 28 is provided to control the music device 10, for example, to selectively play, stop, pause, rewind and fast forward the computer program running on the data processing arrangement 16.
The processing arrangement 16 is connected to a suitable display arrangement 30, which may be, for example, a computer monitor, a television screen, a smart board, a Promethean board, film screen, or any other suitable display arrangement. The processing arrangement 16 is also connected to the speaker arrangement 14, which comprises a pair of speakers 32, 34.
The program running on the processing arrangement 16 provides instruction to the user who is using the music device 10 via the display arrangement 30 and/or the speakers 32, 34. Feedback on the progress of the user can also be provided by the program via the display arrangement 30 and/or the speakers 32, 34.
The user interface device 12 comprises a substantially flat surface 36, and a music stave is marked on the flat surface 36. The music stave is a standard music stave having five lines designated respectively 38, 40, 42, 44 and 46. Similarly, the music stave has four spaces between the lines, the spaces being designated respectively 39, 41, 43 and 45.
The five lines and the four spaces 38 to 46 represent the different notes of the music stave, as would be understood by those familiar with musical notation. For example, where the music stave is designated by a treble clef, the notes of the five lines and four spaces 38 to 46 are E, F, G, A, B, C, D, E, F respectively, and where the music stave is designated by a bass clef, the notes of the five lines and four spaces 38 to 46 are G, A, B, C, D, E, F, G, A respectively.
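The note mapping just stated can be captured directly as a lookup keyed by the clef mode and the stave position, which is essentially what the first control regions switch between. The note names come straight from the description; the function name and mode strings are illustrative:

```python
# Note names for the five lines and four spaces 38 to 46, bottom to
# top, exactly as stated in the description for each clef mode.
POSITIONS = [38, 39, 40, 41, 42, 43, 44, 45, 46]
TREBLE = dict(zip(POSITIONS, ["E", "F", "G", "A", "B", "C", "D", "E", "F"]))
BASS = dict(zip(POSITIONS, ["G", "A", "B", "C", "D", "E", "F", "G", "A"]))

def note_for(position, clef):
    """Look up the note for a stave line/space in the current clef mode."""
    return (TREBLE if clef == "treble" else BASS)[position]
```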
In addition to the music stave, the user interface device 12 also shows a ledger line 48 above the music stave, and a ledger line 50 below the music stave. A space 47 is provided between the music stave and the upper ledger line 48, and a space 49 is provided between the music stave and the lower ledger line 50. The notes represented by the ledger lines 48, 50 and the spaces 47, 49 would be understood by those familiar with musical notation.
The user interface device 12 also includes separate sound creation regions 52 at each of the lines and spaces 38 to 50 of the music stave and the ledger lines. Each sound creation region comprises a sensor arrangement 53 (see Figures 3 and 4), and the sensor arrangements 53 are individually connected to the data processing arrangement 16 so that when one of the sound creation regions 52 is contacted by the user, the contact is sensed by the respective sensor arrangement 53 and the appropriate note represented by the respective sound creation region 52 is emitted by the speaker arrangement 14.
Thus, a user contacting, by touching or pressing, any of the lines or spaces 38 to 50 thereby applies pressure to one of the sound creation regions 52. The sensor arrangement 53 thereby sends a signal to the processing arrangement 16 which causes a note to be emitted via the speaker arrangement 14.
Each sensor arrangement 53 of the sound creation regions 52 at the lines 38, 40, 42, 44, 46, 48, 50 is provided directly under the respective line, and each sensor arrangement 53 of the sound creation regions 52 at the spaces 39, 41, 43, 45, 47, 49 is provided directly under the respective space, as shown in Figure 1 by areas delimited by the broken lines 52A.
In addition, the user interface device 12 has three further sound creation regions 54 arranged at circular regions 56 along the left-hand side of the user interface device 12. The further sound creation regions 54 are delimited in Figure 1 by broken lines 54A.
Each of the circular regions 56 represents, for example, a percussion region at which the user can create percussive sounds by stepping on the further sound creation regions 54 at any of the circular regions 56.
In addition, the user interface device 12 comprises a plurality of control regions to provide control of various functions of the music device 10. There are two types of control region, namely first control regions for controlling whether the music stave is in the treble clef mode or the bass clef mode, and second control regions for controlling the program running on the data processing arrangement 16. As can be seen from Figures 1 and 2, the user interface device 12 includes a representation of a treble clef 58, and a representation of a bass clef 60.
The first control regions are generally designated 70 and 72 respectively. The first control region 70 for controlling the treble clef 58 is arranged under the representation of the treble clef 58 on the user interface device 12, and the first control region 72 for controlling the bass clef 60 is arranged under the representation of the bass clef 60.
When the user steps on the first control region 70 for controlling the treble clef 58, the music stave is switched to treble clef mode. Illumination means can be provided to illuminate the treble clef 58 to indicate that the music stave is in the treble clef mode. When the user steps on the first control region 72 for controlling the bass clef 60, the music stave is switched to bass clef mode, and illumination means is provided to illuminate the bass clef 60 to indicate that the music stave is in the bass clef mode.
The five second control regions are generally designated 74, 76, 78, 80 and 82 respectively, and operate a stop function, a play function, a pause function, a fast forward function, and a rewind function. The second control regions 74 to 82 are used to control the program providing instructions to the user so that, for example, it is possible to pause the program, to rewind a program to repeat a particular stage, or to fast forward to another stage.
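The five second control regions map one-to-one onto the five transport functions named above, so handling them is a simple dispatch. The region-to-function mapping follows the description; the handler protocol and player object are assumptions for illustration:

```python
# Dispatch for the five second control regions 74 to 82, mapped to the
# stop, play, pause, fast forward and rewind functions named in the
# description. The `player` object and its method names are assumed.
TRANSPORT = {
    74: "stop",
    76: "play",
    78: "pause",
    80: "fast_forward",
    82: "rewind",
}

def on_control_contacted(region, player):
    """Invoke the player method named for the contacted control region."""
    action = TRANSPORT.get(region)
    if action is not None:
        getattr(player, action)()
    return action
```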
Referring to Figure 2, there is shown a second embodiment of the invention, which is generally the same as the embodiment shown in Figure 1, and the features in Figure 2 which correspond to features in Figure 1 have been designated with the same reference numerals as in Figure 1. The embodiment shown in Figure 2 differs from the embodiment shown in Figure 1 in that there are two user interface devices designated respectively by the numerals 12 and 112. The user interface device 12 is the same as the user interface device 12 shown in Figure 1 and is referred to in the embodiment shown in Figure 2 as the master user interface device 12, and the other user interface device 112 is referred to as the slave user interface device 112.
As can be seen in Figure 2, the master user interface device 12 includes the second control regions 74 to 82 to allow control of the program. The slave user interface device 112 does not include the second control regions 74 to 82, and the further sound creation regions 54 are arranged on the opposite side of the slave user interface device 112 to their position on the master user interface device 12.
The display arrangement 30 has been omitted from Figure 2 for clarity but would be present in use, as would be appreciated by those skilled in the art.
Referring to Figures 3 and 4, there is shown the construction of one of the sensor arrangements 53. Figure 3 shows the sensor arrangement 53 from above, and is a part sectional top view of the sensor arrangement 53. Figure 4 shows a sectional side view along the line IV-IV in Figure 3.
Referring to Figures 3 and 4, the sensor arrangement 53 comprises an upper layer 80 formed of a metallic foil material or a liquid conductor, a lower layer 82 also of a metallic foil material or a liquid conductor, and an intermediate layer 84 of an electrically insulating material, such as a plastics or foam material. The intermediate layer 84 defines a plurality of apertures 86 arranged in a regular array.
The electrically insulating intermediate layer 84 maintains the upper and lower layers 80, 82 apart from each other. When the user presses onto the sound creation region, contact is made between the upper and lower layers 80, 82 via the apertures 86. This contact completes an electrical circuit, and a signal is sent to the data processing arrangement 16.
The control regions are constructed in the same way as the sound creation regions described above.
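Electrically, the layered arrangement described above behaves as a momentary contact switch: pressing a region brings the upper and lower layers 80, 82 together through the apertures 86, completing a circuit, and the data processing arrangement 16 receives a signal. The patent does not specify how the processing arrangement handles these signals; the following sketch is purely illustrative, and the region names, the `read_contact` stand-in, and the note mapping are all assumptions made for the example.

```python
# Illustrative scanning loop for contact-switch sound creation regions.
# REGION_NOTES and read_contact are assumptions; the patent describes
# only the layered-switch hardware, not any firmware.

REGION_NOTES = {
    "line_38": "E4", "space_39": "F4", "line_40": "G4",
    "space_41": "A4", "line_42": "B4", "space_43": "C5",
}

def read_contact(region, pressed):
    # Stand-in for the electrical contact made through the apertures 86
    # when the upper and lower layers 80, 82 touch.
    return pressed.get(region, False)

def scan(pressed, previously_down):
    """Return note-on events for regions newly pressed on this scan."""
    events = []
    for region, note in REGION_NOTES.items():
        down = read_contact(region, pressed)
        if down and region not in previously_down:
            events.append(("note_on", note))
        if down:
            previously_down.add(region)
        else:
            previously_down.discard(region)
    return events

down = set()
print(scan({"line_38": True}, down))  # first press -> [('note_on', 'E4')]
print(scan({"line_38": True}, down))  # held down -> [] (no repeat event)
```

A real device would debounce the contacts and run this scan continuously; the sketch shows only the edge-detection logic, so that a region held down does not retrigger its note.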
Figure 5 shows a sectional side view of the user interface device 12, which comprises an outer sleeve 90 upon which the markings for the stave, the treble and bass clef, and the other markings shown on the user interface device 12 in Figure 1 are provided. The outer sleeve 90 has an upper face 90A and a lower face 90B. The sensor arrangements 53 of the sound creation regions 52, 54 and the control regions 74 to 82 are attached to the underside of the upper face 90A. Only one of the sound creation regions 52 is shown in Figure 5. A resilient cushioning member 92 is provided under the sound creation regions 52, 54 and the control regions 74 to 82. The cushioning member 92 is sufficiently resilient to prevent injury to the users when stepping or jumping from one region of the user interface device 12 to another.
The connection of the user interface device 12 to the data processing arrangement 16 can be made by cabling 94, as shown in Figures 1 and 2. Alternatively, the connection can be made by a wireless form of connection, such as Bluetooth, ZigBee, Wi-Fi, infrared or other suitable wireless form of connection. Similarly, the connection between the data processing arrangement 16 and the display arrangement 30 can be made either by cabling or by any suitable wireless form of connection.
The connection of the master user interface device 12 with the slave user interface device 112, as shown in Figure 2, can be made by cabling, or by any suitable wireless form of connection.
In use, the user interface device 12 is connected to the data processing arrangement 16 which, in turn, is connected to the speaker arrangement 14 and the display arrangement 30. The computer program is loaded onto the data processing arrangement 16 by inserting the USB memory stick 24 into one of the USB sockets 22. When the program has been loaded onto the data processing arrangement 16, the other USB memory stick 26 can be inserted into the other of the USB sockets 22 to load onto the data processing arrangement details of the users who are to be using the music device 10. As explained above, the USB sockets 22 are external to the master and slave user interface devices 12, 112.
If desired, the details of users stored on the USB memory stick 26 can be downloaded onto another computer (not shown) by plugging the USB memory stick 26 into a USB port of the other computer, and thereafter downloading the data from the memory stick 26.
When the program has been set to run, the processing arrangement 16 provides instructions to the user to carry out specified operations on the user interface device 12. The instructions may be provided verbally via the speaker arrangement 14 and/or via the display arrangement 30.
The instructions provided to the user may take the form of a simple tune for the user to follow, the notes of which are displayed on the display arrangement 30 on a music stave. The user's task will be to step on the appropriate spaces or lines on the music stave on the user interface device 12 to create sounds corresponding to the notes of the respective spaces or lines of the music stave, so that the notes so created correspond to those of the tune displayed on the display arrangement 30.
The data processing arrangement 16 will recognise whether the user has correctly followed the program instructions on the user interface device 12, or whether any mistakes have been made. The program may be written such that the user is encouraged to improve.
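The recognition step described above amounts to comparing the sequence of notes the user produces against the target tune. The patent does not prescribe any particular comparison logic; the `check_attempt` helper below is an invented, minimal sketch of one way it might be done.

```python
def check_attempt(target, attempt):
    """Compare the notes the user stepped on against the target tune.

    Returns (correct_count, mistakes), where mistakes lists the
    positions at which the attempt diverges from the target.
    This comparison scheme is an illustrative assumption only.
    """
    # Positions where the played note differs from the target note.
    mistakes = [i for i, (want, got) in enumerate(zip(target, attempt))
                if want != got]
    # Missing notes (attempt too short) or extra notes (attempt too
    # long) also count as mistakes.
    mistakes += list(range(min(len(target), len(attempt)),
                           max(len(target), len(attempt))))
    correct = len(target) - len([m for m in mistakes if m < len(target)])
    return correct, mistakes

tune = ["C4", "D4", "E4", "C4"]
print(check_attempt(tune, ["C4", "D4", "F4", "C4"]))  # -> (3, [2])
```

A progress-reporting or encouragement feature, as mentioned in the description, could then be driven from the ratio of correct notes to target length.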
The program has facilities to store information relating to a large number of users.
Information on the users can be stored on the second USB memory stick 26.
There is thus described an effective musical teaching system which can be used to teach the theory of music and which will enable users to play and compose music and rhythm in an enjoyable and stimulating way.
Various modifications can be made without departing from the scope of the invention, for example, the user interface device may be in the form of a wall hanging. Alternatively, the user interface device may be of any size, and, for example, may be a pad, which may be for use on a table top, or may be a hand operated device.
In a further modification, the sound creation regions may include illumination means to allow the sound creation regions to be illuminated when pressed. This would render the music device suitable for use with pre-school children, and for those with disabilities.
In some circumstances or embodiments, the processing arrangement may cause the sound creation regions to be illuminated by the illumination means to indicate the steps to be taken. The illumination of the sound creation regions may be switched off only if the steps are taken correctly.
In another modification, further sound creation regions may be provided on the music stave to allow for sharps and flats to be introduced into the notes played.
Alternatively, or in addition, the hand controller 28 may be configured to allow sharps and flats to be introduced. Alternatively, or in addition, the hand controller 28 may be configured to add resonance to the sound emitted, or to control the sound, for example to make the sound seem to be from different instruments.
In another modification, the music device may be configured to allow the playing of music and/or composition of music by contacting the appropriate sound creation regions in a desired order and for desired respective periods of time. Where the music device is configured to allow composition of music, the processing arrangement may be configured to display via the display arrangement music notation showing the music so composed. Facility may be provided to allow such composed music to be stored.
A further embodiment of the music device, shown in Figures 6, 7 and 8, is generally designated 210. The music device 210 is a hand-held or laptop device and comprises a casing 211, a user interface device 212, a speaker arrangement 214, and a data processing arrangement 216 (see Figure 8). The data processing arrangement 216 may also incorporate a sound card.
The user interface device 212 comprises a touch sensitive screen 213, which is provided with, or through which can be seen, a plurality of markings, namely a music stave, having the five lines 38, 40, 42, 44 and 46, and the spaces 39, 41, 43 and 45 between the lines, as described above. The markings also include ledger lines 48 and 50 above and below the music stave, with spaces 47 and 49 between the music stave and the ledger lines 48 and 50, and upper and lower spaces 51A and 51B above and below the ledger lines 48 and 50 respectively.
In addition, the markings include a treble clef 58 and a bass clef 60, which operate in the same way as described above.
The user interface device further includes circular markings 254, the purpose of which is explained below.
The user interface device 212 also includes separate sound creation regions 252 at each of the lines and the spaces 38 to 46 of the music stave, as well as at the ledger lines 48, 50, at the spaces 47, 49 between the music stave and the ledger lines 48, 50, and at the upper and lower spaces 51A, 51B above and below the ledger lines 48, 50.
The user interface device also includes further sound creation regions 256 at the circular markings 254.
In addition to the sound creation regions 252, 256, the user interface device 212 also includes control regions, namely first control regions 270 and 272 for controlling the treble clef 58 and the bass clef 60 respectively, and second control regions, in the form of control buttons 280, on the side of the casing for controlling the operation of the data processing arrangement 216.
Each of the sound creation regions 252, 256 and the first and second control regions 270, 272 comprises a sensor arrangement in the form of a touch sensor 253 on the touch sensitive screen 213, as shown in Figure 7.
The sound creation regions 252 at the spaces 51A and 51B above and below the ledger lines 48 and 50 also function as control regions, controlling whether the music device 210 is in treble clef mode or bass clef mode.
When the music device 210 is in treble clef mode, the user can contact the sound creation region 252 at the lower space 51B to change the mode to bass clef mode.
This contact is sensed by the touch sensor 253 at the lower space 51B, which causes the processing arrangement 216 to transmit an appropriate signal to the speaker arrangement 214 to emit the note associated with the sound creation region at the lower space 51B. At the same time, the processing arrangement 216 changes the mode of the music device 210 to the bass clef mode.
Similarly, when the music device 210 is in bass clef mode, the user can contact the sound creation region 252 at the upper space 51A, to change the mode to the treble clef mode. This contact is sensed by the touch sensor 253 at the upper space 51A, and the processing arrangement 216 thereby causes the speaker arrangement 214 to emit the note associated with the sound creation region 252 at the upper space 51A. At the same time, the processing arrangement 216 changes the mode of the music device 210 to the treble clef mode.
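The dual role of the spaces 51A and 51B, sounding a note and simultaneously switching the clef mode, can be expressed as a small state machine. The sketch below is illustrative only; the note placeholders and method names are assumptions, not part of the specification.

```python
class ClefMode:
    """Illustrative sketch of the treble/bass mode toggle driven by
    the spaces 51A and 51B in the touch-screen embodiment."""

    def __init__(self):
        self.mode = "treble"

    def touch(self, region):
        events = []
        if region == "space_51B":
            # The note always sounds; the mode changes only if needed.
            events.append(("note_on", "lower-space note"))
            if self.mode == "treble":
                self.mode = "bass"
        elif region == "space_51A":
            events.append(("note_on", "upper-space note"))
            if self.mode == "bass":
                self.mode = "treble"
        return events

device = ClefMode()
device.touch("space_51B")
print(device.mode)  # bass
device.touch("space_51A")
print(device.mode)  # treble
```

The key behaviour captured here is that touching either space always emits its note, while the mode change occurs only when the device is in the opposite clef mode.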
In use, the data processing arrangement 216 provides instructions via the speakers 214A and 214B for the user to contact the touch sensitive screen 213 by touching it at the touch sensor 253 of the respective sound creation regions 252 on the music stave. The touching of the touch sensitive screen 213 is transmitted to the data processing arrangement 216, which, in turn, causes notes to be emitted by the speakers 214A and 214B that correspond to the notes on the music stave at the regions of the touch sensitive screen 213 touched by the user.
In addition to the music stave markings, further markings for sharps, flats and naturals are also provided. These markings comprise a sharps column 296 comprising a vertical array of the sharps notation, a naturals column 298 comprising a vertical array of the naturals notations, and a flats column 300 comprising a vertical array of the flats notations. The sharps column 296, the naturals column 298, and the flats column 300 include illuminators which can be illuminated when sharps, naturals and/or flats are needed by the music being taught or played. The illuminators may comprise LEDs. Thus, when, for example, the sharps column 296 is illuminated, the sound emitted by the speaker arrangement 214 when one of the sound creation regions 252 is contacted will be the note corresponding to the sound creation region 252 contacted, raised by a semi-tone.
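Raising or lowering a note by a semitone when an accidental column is illuminated is straightforward if notes are handled internally as MIDI note numbers, since one MIDI step equals one semitone. The patent does not state that MIDI is used; the sketch below is an illustrative assumption.

```python
# Semitone adjustment for the sharps/flats/naturals columns, sketched
# with MIDI note numbers (one MIDI step = one semitone). The use of
# MIDI here is an illustrative assumption, not part of the patent.

ACCIDENTAL_OFFSET = {"sharp": +1, "natural": 0, "flat": -1}

def sounded_note(base_midi, accidental):
    """Return the note number actually sounded for a region's base
    note when the given accidental column is illuminated."""
    return base_midi + ACCIDENTAL_OFFSET[accidental]

# Middle C (MIDI 60) with the sharps column lit sounds C#4 (61).
print(sounded_note(60, "sharp"))    # 61
print(sounded_note(60, "flat"))     # 59
print(sounded_note(60, "natural"))  # 60
```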
If the user wants the music device 210 to play percussion, the user can contact the sound creation regions 256 at the circular markings 254 by touching them. This action results in the data processing arrangement 216 causing the speakers 214A, 214B to emit percussive sounds.
The music device 210 also includes USB ports 282 to allow the music device 210 to be connected to, for example, a computer or other device. A metronome setting region 284 is provided which allows a user to set a metronomic rhythm for the music.
The metronome setting region 284 allows the user to set the speed of the metronome.
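Setting a metronome speed reduces to converting beats per minute into the interval between ticks. A minimal sketch of this conversion:

```python
def tick_interval_seconds(bpm):
    """Seconds between metronome ticks for a given beats per minute."""
    if bpm <= 0:
        raise ValueError("bpm must be positive")
    return 60.0 / bpm

print(tick_interval_seconds(120))  # 0.5
print(tick_interval_seconds(60))   # 1.0
```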
The music device 210 further includes a scale mode setting region 286 to set whether the scale of the music stave is the C major or the chromatic scale.
Third control regions 257 are provided to control the metronome setting region 284 and the scale mode setting region 286. Each of the third control regions comprises a touch sensor 253.
The music device may be battery operated, in which case a battery holder 288 is provided to hold a battery 289 (see Figure 8). If desired, the battery 289 may be rechargeable, and a DC charging input socket 290 may be provided to allow a suitable charger to be connected to the music device 210. A headphone socket 292 can be provided to allow a pair of headphones to be connected to the music device 210.
If desired, the music device 210 can be connected to the console 20 of the first embodiment by a cable, or by a wireless transmitting device 294, which may provide a wireless link, such as Bluetooth, ZigBee, or Wi-Fi. The music device can, if desired, be connected to a separate computer (not shown) by the same modes of connection.
A memory device 302, such as a flash memory device is provided to store data, for example data relating to the use of the music device 210.
An on-off button 304 is also provided to switch the music device 210 on or off.
Figure 8 also shows a central connection member 306 to which all the touch sensors 253 are connected. The central connection member 306 is connected by an electrical connection 308 to the processing arrangement 216. Also shown in Figure 8 are the electrical connections 310 which connect the various other components of the music device 210 to the processing arrangement 216.

Claims (23)

  1. A music device comprising: a user interface device comprising a plurality of sound creation regions, each sound creation region having at least one sound associated therewith, and each sound creation region comprising a sensor arrangement for sensing when said region is contacted by a user; and a processing arrangement interactively associated with the user interface device, and capable of being associated with a speaker arrangement to cause the speaker arrangement to emit a sound associated with the sound creation region contacted by the user.
  2. A music device according to Claim 1, including a display arrangement to display instructions to the user, wherein the processing arrangement is configured to provide instructions to the user via the display arrangement.
  3. A music device according to Claim 2, wherein the display arrangement is selected from the group comprising a computer monitor, a television screen, a smart board, a Promethean board, and a film screen.
  4. A music device according to Claim 1, wherein the processing arrangement is configured to provide instructions to the user to contact a selected one or more of said sound creation regions.
  5. A music device according to Claim 1, wherein the processing arrangement is configured to record the musical notes corresponding to the sounds associated with the sound creation regions contacted by the user, the recording of the aforesaid sounds being such that the musical notes can be displayed on a display arrangement by way of standard music notation.
  6. A music device according to Claim 1, wherein the processing arrangement comprises a data processing arrangement comprising means for storing data relating to the sound or sounds associated with each sound creation region.
  7. A music device according to Claim 6, wherein the processing arrangement is configured to determine whether the sound creation region contacted by the user corresponds to the selected sound creation region that the processing arrangement has instructed the user to contact, and wherein the processing arrangement is configured to provide messages reporting upon the progress of the user.
  8. A user interface device for use with a music device as claimed in Claim 1, the user interface device comprising a plurality of sound creation regions, wherein each sound creation region has at least one sound associated therewith, and each sound creation region comprises a sensor arrangement for sensing when said region is contacted by a user.
  9. A user interface device according to Claim 8, comprising a substantially flat surface wherein the sound creation regions are defined or indicated by substantially flat surface pressure sensitive regions.
  10. A user interface device according to Claim 8, which is separate from the processing arrangement, and comprises a floor mat upon which the user can step.
  11. A user interface device according to Claim 8, wherein each sensor arrangement comprises first and second electrically conductive components, and an electrically insulating component between the first and second electrically conductive components, the electrically insulating component defining at least one aperture to allow electrical connection between the first and second electrically conductive components when said sound creation region is contacted.
  12. A user interface device according to Claim 11, comprising a cover member which extends over the sound creation regions, and a cushioning member below the sound creation regions, the cover member comprising a sleeve that extends over the sensor arrangements and extends around the cushioning member.
  13. A user interface device according to Claim 8 comprising a pad, wherein the processing arrangement and the user interface device constitute a single unit, and the user interface device further includes a casing, wherein the processing arrangement and the user interface device are held by the casing.
  14. A user interface device according to Claim 8 comprising control regions for controlling the music device, each control region comprising a further sensor arrangement for sensing contact thereon by the user.
  15. A user interface device according to Claim 14, wherein each further sensor arrangement comprises first and second electrically conductive components, and an electrically insulating component between the first and second electrically conductive components, the electrically insulating component defining at least one aperture to allow electrical connection between the first and second electrically conductive components when said control region is contacted.
  16. A user interface device according to Claim 8, wherein the processing arrangement is configured to effect running control of the music device, for example by selectively stopping, pausing, fast forwarding, rewinding, and playing the music device.
  17. A method of using a music device comprising: providing a user interface device comprising a plurality of sound creation regions, each sound creation region having a sound associated therewith, and each sound creation region comprising a sensor arrangement for sensing when said region is contacted by a user; and interactively associating a processing arrangement with the user interface device and a speaker arrangement, whereby when one of the sound creation regions is contacted by the user, the processing arrangement causes the speaker arrangement to emit the sound associated with the sound creation region contacted by the user.
  18. A method according to Claim 17, comprising providing instructions to the user via the processing arrangement to contact a selected one or more of said sound creation regions.
  19. A method according to Claim 17 comprising recording musical notes corresponding to sounds associated with selected sound creation regions contacted by the user, wherein the aforesaid notes are recorded such that the notes can be displayed on a display arrangement by way of standard music notation.
  20. A method according to Claim 18 comprising providing instructions to the user to contact a plurality of said sound creation regions in a selected sequence, and for a respective selected approximate period of time.
  21. A method according to Claim 20 comprising storing data in the processing arrangement relating to the sound associated with each sound creation region.
  22. A method according to Claim 20 comprising determining whether the sound creation region contacted by the user corresponds to the selected sound creation region the processing arrangement has instructed the user to contact, and providing messages reporting upon the progress of the user.
  23. A method according to Claim 20 wherein the user interface device comprises a mat, and the method comprises instructing the user to step on the selected sound creation region on the mat, and sensing the pressure applied thereto when the user steps on a sound creation region.
GB0810323A 2008-06-06 2008-06-06 Music device with contact sensitive sound creation regions Withdrawn GB2460496A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB0810323A GB2460496A (en) 2008-06-06 2008-06-06 Music device with contact sensitive sound creation regions

Publications (2)

Publication Number Publication Date
GB0810323D0 GB0810323D0 (en) 2008-07-09
GB2460496A true GB2460496A (en) 2009-12-09

Family

ID=39638256

Family Applications (1)

Application Number Title Priority Date Filing Date
GB0810323A Withdrawn GB2460496A (en) 2008-06-06 2008-06-06 Music device with contact sensitive sound creation regions

Country Status (1)

Country Link
GB (1) GB2460496A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016138601A1 (en) * 2015-03-04 2016-09-09 Pontificia Universidad Catolica De Chile Electronic musical device
US9721552B2 (en) 2014-03-18 2017-08-01 O.M.B. Guitars Ltd. Floor effect unit
WO2023156841A1 (en) * 2022-02-17 2023-08-24 Ratsimaholizanany Fafy Iankinana Chromatic bass musical instrument played diagonally with the feet

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1535008A (en) * 1976-03-08 1978-12-06 Nep Co Ltd Step-on type tone scale play device
GB2033129A (en) * 1978-06-20 1980-05-14 Matsushita Seiko Kk Apparatus for generating musical scale sound by footsteps thereon
US4924743A (en) * 1989-04-13 1990-05-15 Tsai Chao Hsiung Musical dancing block set
US5739455A (en) * 1996-12-17 1998-04-14 Poon; Yiu Cheung Electronic guitar music simulation system
US5841051A (en) * 1995-08-17 1998-11-24 M. H. Segan Limited Partnership Apparatus for providing musical instruction
GB2349736A (en) * 1999-05-01 2000-11-08 Leary Laurence O Interactive music display device
US20020046638A1 (en) * 2000-07-28 2002-04-25 Glenda Wright Interactive music, teaching system, method and system
US20020088337A1 (en) * 1996-09-26 2002-07-11 Devecka John R. Methods and apparatus for providing an interactive musical game
US20080072156A1 (en) * 1996-07-10 2008-03-20 Sitrick David H System and methodology of networked collaboration





Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)