WO2009071919A1 - Controller - Google Patents
- Publication number
- WO2009071919A1 (PCT/GB2008/004044)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- control
- vibrational
- controller
- user
- signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
Definitions
- the present invention relates to a controller and to methods of operation of such a controller.
- the invention has particular, but not exclusive, relevance to controllers that allow a user to control portable electronic devices such as media players.
- portable electronic devices have been controlled using an array of switches or buttons, each of these being dedicated in the sense that they have a single function. More recently, portable electronic devices have been provided with a controller in the form of a multifunctional user interface that provides different functions depending on how it is used. An example of this is a touch screen, such as that used in certain portable media players of Apple, Inc. (formerly Apple Computer, Inc.).
- Touch screens and touch pads typically operate using capacitive sensing to detect the location of a user's finger or thumb (collectively referred to herein as a user's digits). Some touch screens are capable of sensing the location of more than one of a user's digits at a time on the screen, and providing the required functionality based on the way in which the user's digits move on the screen.
- Antonacci et al (Antonacci, F., Gerosa, L., Sarti, A., Tubaro, S., and Valenzise, G. "Sound-based classification of objects using a robust fingerprinting approach" Proceedings of EUSIPCO 2007: http://suono.como.polimi.) disclose a human-machine interface (HMI).
- the technique uses digital audio fingerprinting, which is computationally demanding.
- the classification technique is based on a distance metric, in comparison with a training set of signals.
- the signals assessed include scratching signals, collected from arbitrary objects such as a display board.
- Baudisch et al (Baudisch, P., Sinclair, M., and Wilson, A. "Soap: a pointing device that works in mid-air"
- UIST '06 Proceedings of the 19th annual ACM symposium on User interface software and technology, ACM Press (New York, NY, USA, 2006, 43-46) disclose a computer pointing device that is held in a user's hand, not in contact with a flat surface.
- a core part moves relative to an outer shell.
- the shell includes fabric and may stretch elastically in use. This device may provide some tactile feedback to the user.
- Bornand et al (Bornand, C., Camurri, A., Castellano, G., Catheline, S., Crevoisier, A., Roesch, E., Scherer, K., and Volpe, G. "Usability evaluation and comparison of prototypes of tangible acoustic interfaces" Proc. of ENACTIVE05: www.taichi.cf.ac.uk/files/108 FinalPaper.pdf)
- O'Modhrain and Essl (O'Modhrain, S., and Essl, G. "Pebblebox and crumblebag: tactile interfaces for granular synthesis"
- NIME '04 Proceedings of the 2004 conference on New interfaces for musical expression, National University of Singapore (Singapore, 2004) , 74-79) disclose haptic controllers for computer-based musical instruments.
- the core of the technology in this document is to use granular or pebble- shaped material, and to detect the sounds produced by their interaction.
- Williamson et al disclose inertial sensing to detect motion of a control device, the user actively moving the device in order to elicit an output.
- the device responds by providing an auditory output.
- the example given in the document is of a user shaking a mobile phone, and if the phone "contains" a message, the phone emits a metallic clanking noise to indicate the presence of a message.
- Ronkainen et al (Ronkainen, S., Hakkila, J., Kaleva, S.,
- Colley, A., Linjama, J., "Tap input as an embedded interaction method for mobile devices" TEI'07, 15-17 Feb 2007, Baton Rouge, LA, USA, 263-270) disclose the use of tapping, detected by an accelerometer, to provide a user interface to control a device (e.g. a mobile phone).
- Other movements are also possible, such as slapping and shaking.
- this document discloses the use of different movements and contacts with the device, triggering different control functions. Feedback is provided to the user in the form of vibration.
- Hummels et al (Hummels, C., Overbeeke, K.C.J., Klooster, S., "Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction" Pers Ubiquit Comput (2007) 11:677-690) disclose a controller for a music player, in which a furry control surface is provided on the controller. The position of the user's finger on the furry control surface is detected via capacitive sensing.
- a useful controller for a device may be based on a combination of tactile input and vibrational sensing. This means that the user can more easily control the device when it is out of sight, e.g. in a pocket, and the tactile feedback sensations experienced by the user are a close representation of the vibrations sensed by the controller. This is considered to be a first development of the present invention.
- the present invention provides a controller including at least one control surface for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface, at least one of the control surface and the user-operated surface being a textured surface, and processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, corresponding to the first characteristic class of vibrational signal.
- the present invention provides a device including a controller according to the first aspect, the controller being operable to control one or more functions of the device.
- the present invention provides a use of a controller according to the first aspect to control a device, the use including a user interacting with the controller to provide a translational contact motion between the user-operated surface and the control surface.
- the present invention provides a method of operation of a controller according to the first aspect, the method including the steps of receiving first vibrational data corresponding to a first vibration signal, classifying the first vibrational data into a first characteristic class of vibrational data, and outputting control data corresponding to the first characteristic class of vibrational data.
- the present invention provides a computer program for carrying out the method of the fourth aspect.
- the computer program may, for example, be loaded onto a computer system.
- the computer program may, for example, be stored on a storage medium such as a computer disk.
- the present invention provides a system including a controller having at least one control surface for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface, at least one of the control surface and the user-operated surface being a textured surface, and the system providing control means for classifying the first vibrational signal into a first characteristic class of vibrational signal and thereby outputting a control signal corresponding to the first characteristic class of vibrational signal.
- the present invention provides an external cover layer for a controller, the cover being separate or separable from a body of the controller, the external cover layer having a control surface for operation by a user, the control surface being a textured surface.
- the body of the controller includes at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface and processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, corresponding to the first characteristic class of vibrational signal.
- the vibrational sensor may be included as part of the external cover layer.
- the present invention provides a kit of one or more external cover layers according to the seventh aspect and the body of the controller as set out with respect to the seventh aspect.
- the present invention provides a system including a controller according to the first or sixth aspect and a device including the user-operated surface, for translational contact motion with the control surface.
- control surface is textured.
- user-operated surface may additionally or alternatively be textured, in order to provide the vibrational signals to the controller.
- the user-operated surface is a surface of an object which the user can control to move relative to the control surface.
- the user-operated surface may be a surface of the user himself or herself, e.g. skin.
- the user's clothing may be used.
- the user uses one or more digits of one or both hands to interact with the controller. "Digit" here is intended to include fingers and/or thumbs. However, it is not necessarily excluded that other parts of the user may interact with the device, e.g. parts of the face or head, non-digit parts of the hand (e.g. palm), etc.
- the controller may be operated most successfully with any part of the body that can provide the user with suitable feedback as to which part of the controller is being touched.
- a particularly preferred mode of operation of the controller involves the user using his or her finger or thumb nail(s) to interact with the controller.
- the user-operated surface need not be a part of the user or a part of the user's clothing. Additionally or alternatively, the user-operated surface may be all or a part of the surface of another device. This is of particular interest where the controller needs to identify the device with which it is in translational contact motion. For example, when the user-operated surface includes texture, the texture may identify the device, via suitable encoding. Thus, rubbing (for example) of the user-operated surface against the control surface allows the controller (or ancillary equipment) to identify the device by the vibrational signals caused by the rubbing. This may be used, for example, to identify the device to the controller and may therefore take the place of, or be additional to, security procedures to identify the device to the controller.
- the device also includes a vibrational sensor in order to detect the vibrational signals for classification.
- the vibrational signals sensed in the controller and in the device are substantially identical.
- the controller includes a plurality of control surfaces.
- each of the plurality of control surfaces is a textured surface.
- the texture of the control surfaces may differ between the control surfaces. Suitable preferred textures are set out below. References to the texture of the control surface(s) are to be read as applying also to the possible texture(s) at the user-operated surface(s).
- the texture of the at least one control surface is one that can be discerned by a user.
- a user is capable of discriminating between the texture of the at least one control surface and another surface of the controller that is other than the control surface.
- Such other surfaces may be substantially smooth, or may be another control surface.
- the texture of the at least one control surface is preferably a systematic variation in the height profile of the control surface from an average height of the control surface. Suitable patterns of variation may be generated algorithmically.
- the systematic variation of the height profile of the control surface from an average height of the 0.5 cm² area has an amplitude of at least 0.1 mm. More preferably this amplitude is at least 0.2 mm, at least 0.3 mm, at least 0.4 mm or about 0.5 mm.
- the texture includes one or more of: ridges, troughs, dimples, pimples, pillars, rods. Ridges and troughs may be considered to be line-type texture. Dimples, pimples, pillars and rods may be considered to be island-type texture. Other suitable examples of each type of texture will be apparent. More than one class of each type of texture may be present. For example, there may be coarse line-type texture overlaid with relatively finer line-type texture.
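- As a rough illustration of how such a height profile might be generated algorithmically, the following sketch builds a line-type (sawtooth ridge) texture and an island-type (pimple) texture using the amplitude and spacing figures mentioned above; the function names and formulas are illustrative assumptions, not the method of this disclosure.

    import numpy as np

    def ridge_texture(width_mm=20.0, depth_mm=20.0, pitch_mm=1.0,
                      amplitude_mm=0.5, samples_per_mm=10):
        """Line-type texture: asymmetric (sawtooth) ridges along one axis."""
        x = np.arange(0, width_mm, 1.0 / samples_per_mm)
        y = np.arange(0, depth_mm, 1.0 / samples_per_mm)
        X, _ = np.meshgrid(x, y)
        phase = (X / pitch_mm) % 1.0        # position within one ridge period
        return amplitude_mm * phase         # slow rise, sharp drop: asymmetric

    def pimple_texture(width_mm=20.0, depth_mm=20.0, pitch_mm=1.0,
                       amplitude_mm=0.5, samples_per_mm=10):
        """Island-type texture: a square lattice of rounded pimples."""
        x = np.arange(0, width_mm, 1.0 / samples_per_mm)
        y = np.arange(0, depth_mm, 1.0 / samples_per_mm)
        X, Y = np.meshgrid(x, y)
        return amplitude_mm * (np.cos(np.pi * X / pitch_mm) ** 2 *
                               np.cos(np.pi * Y / pitch_mm) ** 2)

    # height = ridge_texture()  # 2-D array of heights in mm, one value per sample point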
- When line-type texture is present, in the form of ridges and/or troughs for example, these may be arrayed as straight ridges and/or troughs. They may be curved ridges and/or troughs. They may be substantially parallel to each other. Alternatively, they may be arrayed at least partially in a fan-type array, or wheel-spoke array.
- the ridges and/or troughs, when viewed in cross section in a direction in which the ridges and/or troughs extend (at least locally), may be substantially symmetrical about a maximum or minimum height of the ridge and/or trough. However, preferably they are substantially asymmetrical when viewed in this way.
- the ridges and/or troughs may provide substantially different vibrational signals, depending on the direction of translational contact motion between the control surface and the user.
- the texture may include island-type texture, the islands being areas of locally raised profile and/or areas of locally depressed profile, substantially surrounded by areas of non-raised or non-depressed profile. The islands may be arranged in a lattice array.
- the lattice array may have translational and/or rotational symmetry.
- the individual islands themselves may have rotational symmetry (e.g. 2-fold, 3-fold, 4-fold, 5-fold, 6-fold or higher).
- it is preferred that the individual islands have only 1-fold rotational symmetry. This asymmetry of the shape of the islands allows the islands to provide substantially different vibrational signals, depending on the direction of translational contact motion between the control surface and the user.
- the islands may have a tear-drop shape when viewed in plan view.
- the islands, when viewed in cross-section perpendicular to the plan view, may have a square, rounded, triangular, curved triangular or conic section shape.
- the islands, when viewed in this direction may be substantially asymmetrical, for a similar reason as explained above with respect to the line texture above.
- the spacing between adjacent lines or islands in the texture may be of the same order as the height of the texture.
- the centre-to-centre spacing between adjacent areas of maximum height may be 0.1 mm or more. More preferably this centre-to-centre spacing is at least 0.2 mm, at least 0.3 mm, at least 0.4 mm, at least 0.5 mm, at least 0.6 mm, at least 0.7 mm, at least 0.8 mm, at least 0.9 mm, or about 1 mm, or higher.
- the spacing of the texture may vary across the control surface. This may be a gradual variation. Alternatively, this may be a step-wise variation. For example, for the case of line-type texture, step-wise variation in spacing between adjacent lines may be achieved by systematically providing lines of different length in the texture. In this way, a first band of texture may be defined on the control surface having a first spacing and a second band of texture may be defined on the control surface (e.g. adjacent to the first band) having a second spacing, different to the first spacing.
- the controller includes two or more control surfaces, each control surface having a different texture. This allows the different control surfaces to provide substantially different vibrational signals when operated by the user, depending on the texture of the control surfaces. There may be three, four, five or more such control surfaces.
- the two or more control surfaces may be substantially coplanar.
- the controller has a body form and the two or more control surfaces are provided on non- coplanar surfaces of the body.
- the control surfaces are provided on different faces of the body.
- the control surfaces may be provided on different parts of the curved face, or on the curved face and on a different face. This allows the controller to be more easily operated by the user.
- at least one of the control surfaces has a convex shape.
- One or more of the control surfaces may have a concave shape.
- a single control surface may have a region of concavity and a region of convexity.
- control surfaces may be integrally formed with the body wall.
- the control surfaces may be moulded into the body wall. In this way, the control surfaces may be efficiently manufactured. This also allows the control surfaces to be formed of the same material as the body wall.
- the body may enclose a space. This space may be used to house the vibrational sensor and/or the processing means and/or the control means.
- the control surface may be formed on an external layer that is separable from the body.
- the control surface may for example be replaceable using a different external layer having a differently textured control surface.
- the control surface may be formed of a rigid material such as metal or alloy, rigid plastics or ceramics.
- the control surface may be formed of a resilient material, such as a synthetic or natural rubber material.
- the user may interact with the control surface by one or more of scratching, rubbing, tapping, stroking.
- the vibrational sensor may for example be a microphone, such as a contact microphone. Piezo microphones are suitable, for example.
- the vibrational sensor is preferably coupled to an internal wall of the body. In this way, the body may directly transmit vibrational signals from the control surfaces to the vibrational sensor.
- the coupling is preferably via a thin coupling layer, e.g. of thickness 5 mm or less. Suitable materials for this coupling layer include Bingham plastics, most preferably a Bingham plastic that provides adhesion between the vibrational sensor and the wall of the body. Such materials provide good signal quality for the vibrational sensor, and can assist in providing robustness against interference from air-borne sound. Alternatively, an adhesive sheet may be used.
- the processing means preferably receives an input signal from the vibrational sensor.
- this signal may correspond to vibrational signals across a range of frequencies.
- the processing means is operable to limit the bandwidth of the data corresponding to these signals for onward processing to a band at 10 kHz and below, more preferably 8 kHz and below, 6 kHz and below, 4 kHz and below and most preferably 2 kHz and below. This is because the most useful vibrational frequencies tend to be in the lower frequencies. Reducing the bandwidth in this way can reduce the load on later processing stages in the controller or system without losing a large amount of useful information.
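- A minimal sketch of this band-limiting step, assuming the raw vibration signal is available as a NumPy array; the filter order, sample rates and 2 kHz cut-off used here are illustrative choices rather than values specified above.

    import numpy as np
    from scipy import signal

    def band_limit(vibration, fs_in=16000, cutoff_hz=2000, fs_out=4000):
        """Low-pass filter to cutoff_hz, then resample to the reduced rate."""
        sos = signal.butter(4, cutoff_hz, btype="low", fs=fs_in, output="sos")
        filtered = signal.sosfiltfilt(sos, vibration)
        n_out = int(len(filtered) * fs_out / fs_in)
        return signal.resample(filtered, n_out)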
- the classification of the vibrational signals into the characteristic classes of vibrational signals may take place within the body of the controller, so that the output of the controller is a control signal. Alternatively, this classification may take place elsewhere in the system, such as in a device to be controlled. Preferably the classification takes place in control means, operable to output a control signal based on the classification of the vibrational signal.
- the control means includes a store for storing training data.
- the training data includes training data sets corresponding to the different classes of vibrational signal, e.g. provided by the control surfaces of the controller under one or more types of interaction with a user. At least one of the training data sets is preferably associated with a non-control input such as miscellaneous noise.
- the control means is operable to carry out a first classification on the vibrational signal (after processing).
- This first classification may be a relatively rough classification, intended to be performed quickly, such as in real time.
- the output of the first classification may be a first classification output stream.
- the first classification output stream is subjected to a second (or further) classification.
- the second classification is intended to smooth out fluctuations in the first classification output stream.
- the output of the second classification may be a second classification output stream.
- the second classification output stream is used to determine the nature of a control signal to be applied to the device controlled by the controller.
- the control signal may be a discrete event control (e.g. "track number", "next track" or "previous track" for a music player) or it may be a continuous (or quasi-continuous) value control (e.g. "volume up" or "volume down" for a music player).
- the controller may include active feedback means.
- the controller may include active audio, visual and/or vibrating means, operable in the event that it is required to provide active feedback to the user.
- Feedback such as an active vibration may provide an indication to the user that a task has been completed, or is in progress, for example.
- the active feedback may be provided at substantially the same time as the interaction of the user with the controller. However, it is preferred that the active feedback is delayed with respect to the interaction of the user with the controller. This delay may be at least 5 ms, for example. It may be longer, e.g. at least 50, 100, 200, 300, 400 or 500 ms. It is envisaged that the delay would not be greater than 5 seconds.
- the feedback is specific to the task being undertaken by the user, or the task that the user is instructing the controller (or a device controlled by the controller) to undertake, or a state of the controller (or of a device controlled by the controller) .
- the controller may include further sensing means, e.g. for sensing movement of the controller.
- the controller may include at least one accelerometer (e.g. one-axis, two-axis or three-axis accelerometers) and/or at least one angular rate sensor (e.g. one-axis, two-axis or three-axis angular rate sensors).
- the controller may include at least one magnetometer (e.g. a one-axis, two-axis or three-axis magnetometer). This allows the controller or associated device to determine the orientation of the controller with respect to a magnetic field, e.g. the earth's magnetic field.
- the accelerometer allows the controller to determine its orientation with respect to the direction of gravity, when held steady.
- the magnetometer may then determine the direction of the earth's magnetic field at the device. This allows a bearing angle to be determined.
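- A sketch of how such a bearing angle could be derived from the accelerometer and magnetometer readings, using the standard tilt-compensated compass equations; the axis conventions and sensor scaling assumed here are illustrative and are not specified in this disclosure.

    import numpy as np

    def bearing_degrees(accel, mag):
        """accel, mag: 3-element (x, y, z) readings in the controller's frame."""
        ax, ay, az = accel / np.linalg.norm(accel)
        mx, my, mz = mag / np.linalg.norm(mag)
        roll = np.arctan2(ay, az)
        pitch = np.arctan2(-ax, ay * np.sin(roll) + az * np.cos(roll))
        # Project the magnetic field vector onto the horizontal plane.
        bx = (mx * np.cos(pitch) + my * np.sin(pitch) * np.sin(roll)
              + mz * np.sin(pitch) * np.cos(roll))
        by = my * np.cos(roll) - mz * np.sin(roll)
        return np.degrees(np.arctan2(-by, bx)) % 360.0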
- other position-determining means may be provided such as a GPS locator or similar.
- the controller may include one or more touch sensing means.
- the touch sensing means may operate by capacitive sensing.
- Such sensing means may be operable to sense touch and translational contact movement of a user on the controller.
- the external surface of the controller has few apertures leading into the inside of the controller.
- the controller may be used in environments in which apertures in the controller could become clogged with extraneous matter such as dust, dirt, sand, etc.
- the use of the textured control surfaces, and/or other non-apertured controls such as the touch sensing means, allows the controller to be formed with few apertures.
- the controller may be substantially sealed. In this way, electronics within the controller may be protected from the external environment, and thus the controller may be more robust.
- the external casing of the controller, including the control surfaces, may additionally protect the internal components of the controller from adverse electrical interference or electromagnetic interference.
- the device is preferably a portable device such as a mobile communications device and/or a portable media player such as a music player and/or video player.
- the device may be used for the remote control of a second device.
- the device may operate as a computer mouse (or other graphical user interface pointing and indicating device) for a computer.
- the device may operate as a handheld media controller for audio, video or web applications, for example.
- the device is hand-held.
- One particularly suitable application is a wireless headset, e.g. for mobile communications .
- Where the controller does not itself include the control means, but rather the control means is included in the device to be controlled, preferably there is provided a communications link between the controller and the device. This link may be a wireless link, for example a Bluetooth link.
- the present inventors have realised that the first development may be modified in order to provide touch sensing in place of vibrational sensing, the texture of the control surfaces providing non-visual feedback to the user of the control surface being operated.
- the present invention provides a controller including at least first and second control surfaces for operation by a user, the control surfaces each being a textured surface, having different texture to each other, the controller including at least one touch sensor for detecting a touch of a user-operated surface on one or more of the control surfaces and processing means for processing a touch signal corresponding to the touch to enable classification of the touch signal into a first characteristic class of touch signal corresponding to the first control surface or into a second characteristic class of touch signal corresponding to the second control surface, from which a control signal is derivable.
- the second to ninth preferred aspects of the first development correspondingly apply to the second development, with the proviso that in the second development it is not necessary to provide or sense vibrational signals, but that instead touch signals are provided and sensed.
- the controller may provide vibrational feedback to the user, based for example on the sensed signals. Such feedback may be used to indicate to the user that a signal has been (or is being) received or that a function has been (or is being) carried out.
- the touch sensor may be common to both the first and second control surfaces.
- the touch sensor may conveniently be a capacitive sensor.
- the preferred and/or optional features set out with respect to the first development may be applied either singly or in any combination to the second development.
- one or more deformable and/or displaceable control elements may be used in a similar manner to the control surface(s) of the first development, in order to generate classifiable vibrational signals for use in generating corresponding control signals.
- the present invention provides a controller including at least one control element for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation of the control element, the controller further including processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, the control signal corresponding to the first characteristic class of vibrational signal.
- the present invention provides a method of operating a controller, the controller including at least one control element, at least one vibrational sensor and processing means, the method including the step of the user deforming or displacing the control element by causing a contact action on the control element to take place, the vibrational sensor detecting a first vibrational signal caused by at least one of: the deformation of the control element; and a recovery from the deformation of the control element, wherein the processing means processes the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, the control signal corresponding to the first characteristic class of vibrational signal.
- the present invention provides a mobile electronic device including a controller according to the first aspect.
- the present invention provides a system including a controller having at least one control element for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element, the system providing control means for classifying the first vibrational signal into a first characteristic class of vibrational signal and thereby outputting a control signal corresponding to the first characteristic class of vibrational signal.
- the present invention provides a use of a controller according to the first aspect or of a system according to the fourth aspect to control a device, the use including a user interacting with the controller to deform or displace the control element to cause a first vibrational signal.
- the present invention provides a method of operating a controller according to the first aspect or a system according to the fourth aspect, the method including the steps of receiving first vibrational data corresponding to a first vibration signal, classifying the first vibrational data into a first characteristic class of vibrational data, and outputting control data corresponding to the first characteristic class of vibrational data.
- the present invention provides an external cover layer for a controller, the cover being separate or separable from a body of the controller, the external cover layer having at least one control element, the control element being suitable for providing a detectable first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element.
- the at least one vibrational sensor is included as part of the external cover layer, the vibrational sensor being for detecting said first vibrational signal.
- the present invention provides a kit of one or more external cover layers according to the eighth aspect and a controller according to the first aspect, or a mobile electronic device according to the third aspect.
- the preferred and/or optional features set out with respect to the first development and/or second development may be applied either singly or in any combination to the third development.
- the contact action which causes deformation or displacement of the control element is direct or indirect contact of the control element by the user. It is preferred that the control element is directly contacted by the user.
- the contact may be indirect contact, for example via one or more intermediate means, such as an outer wall of the controller or device.
- the contact between the user and the control element provides haptic feedback to the user.
- the haptic feedback takes the form of the user sensing by touch at least a component of the deformation and/or displacement of the control element.
- the control element may be pressed, stroked, twisted, scratched, pulled or rubbed, for example by the user's hand(s).
- the deformation takes place directly by the user's hands.
- the user may hold the controller and cause the control element to be deformed by manipulation of the control element against a different surface, e.g. a part of the user other than the hand(s), or a wall, door, window, seat, floor, etc.
- the controller includes a plurality of control elements.
- the shape of the control elements differs between the control elements.
- the at least one control element includes, at least in part, one of a concave and a convex shape.
- the control element preferably has a first configuration before deformation or displacement and a second, different configuration during deformation or displacement.
- the second configuration has, at least in part, a concave shape .
- the control element may comprise a flexible wall member.
- the control element may have a three-dimensional shaped form.
- the control element may have a first configuration before deformation and a second, different configuration during deformation.
- both the first and second configurations are characteristic configurations.
- the control element may be bistable, deformation of the control element from the first configuration allowing a second configuration to be reached, the second configuration being energetically relatively stable compared with other configurations, for the time that the user continues to apply a deformation force.
- An example of a bistable control element is a deformable dome. Such a dome typically has a first configuration in which the centre of the dome is upstanding.
- Deformation of the dome by pressing the centre of the dome causes the wall of the dome to deform locally. This is elastic deformation. With the application of sufficient pressure, the dome deforms to a partially inverted dome shape (the second configuration). This occurs quickly once the dome is deformed beyond a threshold deformation, the second configuration of the dome being a stable configuration for as long as the pressure on the dome is maintained. Release of the pressure on the dome allows the dome to spring back from the second configuration to the first configuration.
- the present inventors have noted that the deformation from the first configuration to the second configuration provides a characteristic vibrational signal. Similarly, release of the dome from the second configuration to the first configuration also provides a characteristic vibrational signal, which may be the same as or different to the vibrational signal caused by deformation from the first configuration to the second configuration.
- deformable shapes can be used for the control elements. Such shapes may also have bistability (or, in general, multistability) . This is advantageous, because such multistable shapes more easily provide characteristic vibrational signals as they spring between first and second (and optionally third, fourth, etc.) configurations. Such characteristic vibrational signals may be classified with relative ease, thereby more surely determining the correct control signal corresponding to the vibrational signal.
- the characteristic vibrational signals provided by multistable shapes tend to have a low dependency on the speed of displacement and/or the pressure applied by the user.
- multistable shapes are preferred because they can provide the user with suitable tactile feedback, indicating to the user by touch that the control element has been suitably operated without need for the user to look at the controller.
- This applies particularly to control elements that click between the first and second configurations.
- Suitable (e.g. multistable) control elements may be formed of a material having a relatively small thickness in at least one direction.
- suitable multistable control elements may be formed by shaping a sheet material (e.g. by pressing, moulding, etc.).
- deformation of the control element from the first configuration to the second configuration may occur by locally elastically bending portions of the control element.
- the amount of elastic strain experienced at these portions may be significant.
- the maximum elastic bending strain experienced during the deformation may be at least 0.01%, more preferably at least 0.02%, more preferably at least 0.05%, more preferably at least 0.1%, more preferably at least 0.15% and more preferably at least 0.2%.
- the maximum elastic bending strain may be about 0.5%.
- the amount of bending strain experienced locally by the control element will depend on the first and second configurations of the control element, and certain embodiments of the invention may not experience bending strains within the ranges set out above.
- the amount of displacement applied to the control element during operation of the device requires that at least a portion of the control element is displaced by at least 0.1 mm, more preferably at least 0.2 mm, more preferably at least 0.5 mm, more preferably at least 0.8 mm, more preferably at least 1 mm, more preferably at least 1.5 mm, more preferably at least 2 mm.
- the amount of displacement experienced by the control element will depend on the first and second configurations of the control element (where the control element is multistable), and certain embodiments of the invention may not experience displacements within the ranges set out above.
- control element may be multistable.
- Other shapes for the control elements may provide suitable characteristic (and thus classifiable) vibrational signals.
- a sliding control element may be provided which at or near a first limit of its travel provides the classifiable vibrational signal.
- the sliding control element may provide a second classifiable vibrational signal at or near a second limit of its travel (e.g. in the reverse direction).
- the sliding control element need not necessarily spring back to an initial position after operation by the user. Instead, the user may choose when to move the sliding control element towards the first limit of travel and, independently, when to move the sliding control element towards the second limit of travel.
- suitable shapes provide the user with suitable feedback (e.g. tactile or haptic feedback) to indicate to the user that the control element has been suitably operated without need for the user to look at the controller.
- the controller may have more than one control element. For example, two, three, four, five, six, seven, eight, nine, ten, eleven, twelve, thirteen, fourteen, fifteen, sixteen, seventeen, eighteen, nineteen or twenty or more control elements may be provided.
- the control elements each provide a corresponding vibrational signal, different from and classifiable from the vibrational signals caused by operation of the other control element (s). The benefit of this is that the controller can then detect and classify the different vibrational signals in order to determine the appropriate control signal.
- a mobile music player such as an MP3 player may have 10 or fewer control elements, e.g. 5 or 6.
- a mobile communications device such as a mobile telephone may require more control elements, typically more than 10, e.g. 12-16 control elements.
- each element need not be electronically wired to the processor. Instead, the vibrational signals may be detected by the vibrational sensor via a (preferably solid) common vibration transmission path.
- By using a (preferably solid) common vibration transmission path, the construction of the controller may be simplified compared with a controller having electronic control buttons.
- Fig. 1 shows a plan view of a controller according to an embodiment of the invention.
- Fig. 2 shows a rear side view of the controller of Fig. 1.
- Fig. 3 shows a front side view of the controller of Fig. 1.
- Fig. 4 shows a perspective view of the controller of Fig. 1.
- Fig. 5 shows a left side view of the controller of Fig. 1.
- Fig. 6 shows a right side view of the controller of Fig. 1.
- Fig. 7 shows a plan view of a controller according to another embodiment of the invention.
- Fig. 8 shows a rear side view of the controller of Fig. 7.
- Fig. 9 shows a reverse plan view of the controller of Fig. 7.
- Fig. 10 shows a left side view of the controller of Fig. 7.
- Fig. 11 shows a right side view of the controller of Fig. 7.
- Fig. 12 shows a schematic cross sectional view through a control surface for use in an embodiment of the invention.
- Figs. 13-16 show greyscale images of algorithmically-generated texture for use with embodiments of the invention.
- Fig. 17 shows a spectrogram of vibrational signals recorded in an example of operation of an embodiment of the invention.
- Figs. 18 to 21 show vibrational classification results derived from the vibrational signals of Fig. 17.
- Fig. 22 shows a representation of the probability of the input signal of Fig. 17 corresponding to a "next track" input signal.
- Fig. 23 shows a representation of the probability of the input signal of Fig. 17 corresponding to a "previous track" input signal.
- Fig. 24 shows the control of volume with time due to the input signal of Fig. 17.
- Fig. 25 shows the control of track number with time due to the input signal of Fig. 17.
- Fig. 26 shows a representation of the input signal of Fig. 17.
- Figs. 27-29 show views of an alternative controlling mechanism for an embodiment of the present invention.
- Figs. 30-32 show the controlling mechanism of Figs. 27-29 when the dome is deformed elastically.
- Fig. 33 shows the time series and Fig. 34 shows the spectrogram for a first dome, on press and release.
- Fig. 35 shows the time series and Fig. 36 shows the spectrogram for a second dome, on press and release.
- Figs. 1 to 6 show a controller 10 according to one embodiment of the invention. Similar features are identified in these drawings with the same reference numerals.
- the controller 10 has an overall configuration similar to a partially flattened egg shape, and this provides curved surfaces for location of the control surfaces.
- Controller 10 has five differently-textured control surfaces 12, 14, 16, 18 and 20 visible in Fig. 1.
- Control surface 12 has no texture that is visible in Fig. 1, but the texture is shown partially in Fig. 4, consisting of a series of radially-arranged ridges 12a, centred on central feature 22. These ridges have a saw-tooth structure (preferably a smooth saw-tooth structure).
- the gap between the ridges varies uniformly as the user follows a ridge radially outwardly, due to the angle between the ridges. Thus, the user can better infer the position of his or her finger/thumb on this control surface.
- Control surface 12 is located in a lenticular depression at the upper surface of the controller 10.
- Control surface 14 has a texture consisting of a series of rounded dimples 14a, arranged concentrically around control surface 12, but partially raised with respect to control surface 12.
- Control surface 16 is located at one curved side surface of the controller 10.
- the texture of this control surface includes a regular array of pyramidal protrusions 16a. As shown more clearly in Fig. 2, the array includes rows 16b of protrusions, the spacing between adjacent protrusions in each row varying from row to row, substantially systematically.
- Control surface 18 is located at the front side surface of the controller 10.
- the texture consists of an array of rounded pimples 18a.
- the array is shown more clearly in Fig. 3.
- the array is substantially elliptical, concentrically arranged on the apex point 24 of the front surface of the controller.
- Control surface 20 is located on the right side surface of the controller 10. It is shown more clearly in Fig. 6.
- the texture of control surface 20 comprises an interleaved array of raised lines 20a, 20b, 20c, of differing length. Lines 20a are relatively short, lines 20b are of medium length and lines 20c are relatively long.
- the lines are arranged so as to provide control surface 20 with three substantially parallel laterally-extending bands 20d, 20e, 20f of differing textures.
- the line-to-line spacing in band 20d is relatively small, the line-to-line spacing in band 20e is medium, and the line-to-line spacing in band 20f is relatively large.
- Each band can therefore provide a different vibrational signal when in translational contact motion with the user-operated surface.
- the result is the provision of step-wise variation of the control surface, defining these three adjacent bands of line texture.
- These textures can, for example, be used for zoom and position control respectively.
- the controller 10 has a casing formed in two parts, joined around an equatorial line 26. Various apertures are formed in the casing, for an on-off switch 28, a data or power port 30, and other apertures 32, 34 as required.
- a controller according to Figs. 1-6 was designed and manufactured in order to provide a prototype device for testing.
- the prototype was designed in Solidworks™ (a 3D computer aided design software package) and manufactured using SLA (stereolithography) resin 3D-printing technology.
- the precision of the printing was 0.1 mm (a tolerance of 'medium', per DIN ISO 2768).
- the body case of the controller 10 encloses an internal space (not shown) to house various components (not shown) . This is discussed in more detail below.
- the main aim of the case texture design is to provide a rich set of textures on the different control surfaces which can be easily recognised and accessed in a range of conditions, at least by touch, which fit appropriately with the form of the controller device.
- the textures used can be varied to provide different audio and vibration responses, and to invite different styles of interaction (rubbing back and forth, stroking, scratching, picking with finger nail etc) .
- the vibrations generated by the user acting on the textures should be as easy to classify as possible.
- the texture is composed of individual elements such as lines, dots, dimples or other geometric forms.
- the elements of the texture can be designed so that stroking the texture in different directions will give significantly different sounds.
- the material used for the case also has a significant effect.
- Different texture types, spacings and texture gradients offer different types of interface control.
- control surfaces 12-20 provide different constraints, and therefore encourage different types of stroking action, allowing the generation of very different vibrational signals for later classification.
- the controller 50 has an overall shape that provides both curved and substantially flat surfaces for location of the control surfaces .
- Control surfaces 52, 54, 56, 58, 60, 62 are provided for the controller 50.
- Similar to control surface 12 of Fig. 1, the texture of control surface 62 is not shown in Fig. 7. However, as shown in Figs. 8 and 10, it consists of a partial radial array of saw-tooth ridges 62a (typically smooth saw-tooth ridges).
- Control surface 52 is an array of rounded pimples 52a spaced regularly but having different sizes. Control surface 52 is formed substantially in a saddle shape.
- Control surface 54 includes an array of island texture, including tear-drop shaped islands 54a, arrayed in a substantially radial pattern.
- the tear-drop shaped islands are asymmetrical.
- the height of the islands above the remainder of the control surface is about 0.5 mm.
- Control surface 58 also includes an array of island texture in the form of hexagonal pyramidal islands 58a, regularly arrayed in a lattice formation.
- the hexagonal islands 58a are substantially symmetrical and have similar dimensions to the tear-drop islands 54a.
- Control surface 56 includes an array of ridges 56a. The spacing between the ridges is larger at one end of control surface 56 than at the other. This allows these different parts of the control surface to generate different vibrational signals in operation.
- Fig. 12 shows a partial enlarged sectional view through ridges 72 of a control surface 70.
- the ridges are asymmetrical, so that scratching or stroking the ridges in one direction provides a different vibrational response to the opposite direction.
- Figs. 13-16 show greyscale images of algorithmically-generated texture for use with embodiments of the invention.
- Fig. 13 shows ridges having a relatively coarse spacing, Fig. 14 shows a finer texture of ridges, Fig. 15 shows raised pimples and Fig. 16 shows dimples.
- the coarse texture allows the user to orient themselves on the device, and provides some component of the vibration signal.
- the finer texture provides the remainder of the vibrational signal, allowing a richer signal to be generated, thus potentially conveying more information.
- the prototype uses a Bluetooth SHAKE (Sensing Hardware Accessory for Kinesthetic Expression) inertial sensor pack for sensing, as described in Williamson et al.
- the SHAKE model SK6 is a small form-factor wireless sensor-pack with integrated rechargeable battery, approximately the same size as a matchbox. It features a tri-axis accelerometer, tri-axis angular rate sensor, tri-axis magnetometer, dual channel analog inputs, dual channel capacitive sensing and an internal vibrating motor. Communications are over a Bluetooth serial port profile.
- SHAKE includes a powerful DSP engine, allowing real time linear phase sample rate conversion.
- the vibrations caused by a user interacting with the shell are captured with a low cost film-style piezo contact microphone (the PZ-01, obtained from Images SI Inc., 109 Woods of Arden Road, Staten Island NY 10312, USA, see: http://www.imagesco.com/catalog/sensors/film.html#pz-08 accessed 22 November 2007), which is attached to the inner surface of the body with a thin (1 mm) malleable layer of Bingham plastics material (Blu-Tack™ in this case) covering 50% of the surface of the microphone.
- This provides significantly better signal quality than with a rigid mounting (there is an improvement of almost 12 dB in sensitivity and a much increased high frequency response). It also offers excellent robustness to interference from air-borne sound.
- a custom expansion module was designed for the SHAKE that includes a high impedance microphone data acquisition circuit and a vibration driver suitable for driving a linear vibration actuator. Since the purpose of the contact microphone is to sense the vibrations of the enclosure that surrounds the SHAKE, we limited the bandwidth to 2kHz as there is little useful information above this frequency and it reduces the load of further processing stages.
- Once the audio signal has been acquired by the custom expansion module, it is digitized and passed to the SHAKE microcontroller where it is filtered, re-sampled, µ-law encoded and packaged to be sent to the host device (in this case a laptop computer running music playing software) over the Bluetooth radio using the serial port profile.
- the effective resolution of the microphone signal once received by the host is 13 bits and the -3 dB bandwidth is 1.5 kHz.
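- A sketch of standard µ-law companding (µ = 255), which compresses each sample to 8 bits while retaining roughly 13-14 bits of effective dynamic range; whether the SHAKE firmware uses exactly this variant is an assumption.

    import numpy as np

    MU = 255.0

    def mu_law_encode(x):
        """x: float samples in [-1, 1] -> 8-bit companded codes."""
        y = np.sign(x) * np.log1p(MU * np.abs(x)) / np.log1p(MU)
        return np.round((y + 1.0) * 127.5).astype(np.uint8)

    def mu_law_decode(code):
        """8-bit codes -> reconstructed float samples in [-1, 1]."""
        y = code.astype(np.float64) / 127.5 - 1.0
        return np.sign(y) * np.expm1(np.abs(y) * np.log1p(MU)) / MU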
- the sensed vibrations are classified in real-time, with signals from rubbing different areas of the device assigned to discrete classes.
- This structure is well suited to real-time audio and vibrotactile feedback which can be a function of instantaneous classifications.
- the incoming audio is windowed and transformed into a suitable feature space.
- the signal is windowed with a Hamming window, 512 samples long
- the classification stream therefore has a rate of 64 classifications a second.
- the Fourier transform of the windowed signal is taken, and the phase component discarded, leaving only the magnitude spectrum.
- the spectrum is then rebinned so that bins are four times their original size.
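- A minimal sketch of this feature extraction stage (512-sample Hamming window, magnitude spectrum, re-binning into bins four times the original size); the hop size needed to produce 64 classifications per second and any normalisation are assumptions.

    import numpy as np

    WINDOW = 512

    def feature_vector(frame):
        """frame: 512 vibration samples -> re-binned magnitude spectrum."""
        windowed = frame * np.hamming(WINDOW)
        mag = np.abs(np.fft.rfft(windowed))    # 257 magnitude bins
        mag = mag[:256]                        # drop the Nyquist bin
        return mag.reshape(-1, 4).sum(axis=1)  # 64 coarser bins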
- the feature vectors are classified by a multi-layer perceptron, with 64 hidden units.
- the low computational and memory requirements of such a model produce very fast classification performance, suitable for implementation on mobile devices.
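- A sketch of this classification stage, using scikit-learn's multi-layer perceptron with 64 hidden units as an illustrative implementation choice; no particular library is specified here, and the class labels shown are hypothetical.

    from sklearn.neural_network import MLPClassifier

    # One label per 1/64 s feature vector, e.g. "front_clockwise",
    # "right_dimples", "tip_fingernail", "noise" (hypothetical names).
    clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)

    # X_train: (n_frames, 64) feature vectors as computed above.
    # clf.fit(X_train, y_train)
    # frame_labels = clf.predict(X_test)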
- Five different classes are trained and stored in a store in the SHAKE. These classes are: scratching circular front clockwise, scratching dimples on right side, scratching tip with fingernail and a miscellaneous noise class.
- Each class is trained on 120 seconds of input data, with a range of speeds of motion, and a variety of grip postures and pressures. The way the device is held may affect the body resonances of the exterior shell, and thus the vibrational signals picked up by the vibrational sensor.
- the miscellaneous noise class includes recordings of the device being manipulated in the hands, being placed in a pocket, picked up and replaced on a table and other background disturbances. We also tested sensitivity to loud noises near the device, but these had negligible effect.
- the classifier was trained on 26880 examples, and tested on 11520 unseen pairs, and identifies the different regions of the device with 75% accuracy for these five classes, based on 1/64th of a second of data. Although this seems relatively low, the high rate of classifications (64/s) means that simple integrators can aggregate evidence from the stream of instantaneous classifications into useful control signals. In the higher-level classification, the output stream of the low-level instantaneous classification is used as input to a simple dynamic system. This smooths out the fluctuations in the classifier.
- the dynamic system can support discrete events and continuous values. Complex recurrent classifiers can be used if desired. Here, for discrete events, the system functions as a leaky integrator, which triggers an event once the integrated value crosses a predetermined threshold. After this threshold is crossed, the integrator is inhibited for a short period.
- Continuous outputs (such as the volume control in the following discussion) are directly integrated and then clipped to the appropriate range.
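- A sketch of this higher-level dynamic system: a leaky integrator over the stream of instantaneous classifications, triggering a discrete event at a threshold and inhibiting itself briefly afterwards. The leak rate, threshold and inhibition period are illustrative values, not ones given here.

    class LeakyIntegrator:
        def __init__(self, leak=0.9, threshold=10.0, inhibit_frames=32):
            self.leak = leak
            self.threshold = threshold
            self.inhibit_frames = inhibit_frames
            self.value = 0.0
            self.inhibit = 0

        def update(self, evidence):
            """evidence: 1.0 if this frame was classified into the target class, else 0.0."""
            self.value = self.leak * self.value + evidence
            if self.inhibit > 0:
                self.inhibit -= 1
                return False
            if self.value >= self.threshold:        # discrete event triggered
                self.value = 0.0
                self.inhibit = self.inhibit_frames  # suppress immediate re-triggering
                return True
            return False

    # Continuous outputs (e.g. volume) instead integrate the evidence directly
    # and clip the running value to the permitted range.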
- the style of interaction with the controller is typically one where the device is held in one hand, and can either be activated by thumb and fingers of that hand, or in a bimanual fashion using both hands.
- the user scratches or rubs the device along its various control surfaces and this generates changes in the interaction.
- Since the control surfaces may have different textures, there can be a mapping between these and equivalent key-presses. While possible, and in some cases useful, this is not the primary interaction mechanism envisaged. Stroking motions feel quite different to button-pushes, and are more appropriate for linking to gradual changes in values, such as volume control, zooming, browsing, etc. They are also useful for pushing, pulling and probing actions, and because of the drag in the texture, are a good fit to stretching actions (e.g. zooming).
- the intention of using this style of interaction is that the user can navigate through a high-dimensional state space, generating incremental changes in state, being pulled or pushed by their stroking actions.
- the fact that there may be many different textures allows control of multiple degrees of freedom in this manner.
- the structure allows both discrete increments, when the user 'picks' at a single textural feature of a control surface, and continuous ones, where the user brushes through several such features.
- partial completion of a stroke can give initial preview information about the consequences of continuing that action. When the user then continues the stroke, the threshold is reached, and the associated action is performed.
- the controller may therefore have an in-built pager motor in the SHAKE module, and an additional VBW32 actuator [http://tactaid.com accessed 21 November 2007] for higher-frequency components.
- the augmentation of the raw texture with application-specific sound and vibration makes this more feasible, and thus we partition the classification component into multiple levels, so that we can provide instantaneous augmented feedback.
- The augmentation permits the component textures of a specific device to be made to appear as a range of different media, which invite different styles of interaction, at different rates and rhythms.
- the user can potentially learn the affordances of the controller just by manipulating it, and feeling the changing responses to stroking actions, where each mode of the system might be associated with subtle changes in the response behaviour of the system.
- the controller was implemented as a user interface for a music player, which is controlled by scratch-based interaction with appropriate mappings from control surfaces to controls.
- the use case scenario is a user walking, listening to their music player, and controlling the volume and track choice while the controller is in their jacket pocket.
- the major actions used are start/stop (controlled by tapping) , volume adjustment and track change.
- Each of the classified outputs is fed to an integrator.
- The output of this integrator is either used directly (for volume control), or is thresholded to activate events (for track changes). This results in reliable control, even though the underlying classification may have regular glitches.
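- As a usage illustration, and assuming the LeakyEventIntegrator and ClippedIntegrator classes from the earlier sketch are in scope, one possible wiring of the per-frame classifier outputs to the player controls is shown below. The class names, gains, thresholds and the player interface are hypothetical.

```python
# Hypothetical mapping from classified vibration streams to player controls,
# reusing the integrator sketches above.  Values and class names are
# illustrative only, not taken from the patent.

volume = ClippedIntegrator(gain=0.02, lo=0.0, hi=1.0, start=0.5)
next_track = LeakyEventIntegrator(threshold=4.0)
prev_track = LeakyEventIntegrator(threshold=4.0)

def on_frame(probs, player):
    """probs: dict of instantaneous class probabilities for one ~1/64 s frame."""
    # Continuous control: "volume up" minus "volume down" gives a signed stream.
    player.set_volume(volume.update(probs["volume_up"] - probs["volume_down"]))
    # Discrete controls: thresholded, inhibited integrators gate track changes,
    # so occasional misclassified frames do not trigger spurious events.
    if next_track.update(probs["next_track"]):
        player.next()
    if prev_track.update(probs["prev_track"]):
        player.previous()
```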
- the textures are easily navigated by the user by touch alone, and the system was tested with five different users, who were able to use it without problems, despite the system being calibrated for a single user.
- Figs. 17-26 show data for a session in which the user flicks forward two tracks of music, lowers then raises the volume, then flicks back two tracks.
- Fig. 17 shows a photograph of a spectrogram of the vibrational signals picked up by the piezo microphone.
- Fig. 26 shows the corresponding amplitude waveform of the signal.
- Figs. 18 to 21 show classification events for four respective classifications.
- Figs 22 and 23 show the integrated values from the classification events, which approximate to the probability of the input signal corresponding to a "next track" input and a "previous track" input respectively.
- the controller may be fitted with capacitive sensors, in order to sense touch. Such sensors may be separate from or coincident with the control surfaces of the controller.
- the example described above demonstrated that it is possible to robustly classify stroking movements on the control surfaces of the controller, using vibration sensor information alone.
- the tactile feedback from the physical case can be augmented with context-dependent audio and vibration feedback.
- the texture provides immediate feedback to the user about the likely consequences of their actions, and they can be used in an eyes-free context, such as in the user's pocket.
- the controller also provides a research tool. Operation of the inertial sensors of the SHAKE allows the exploration of combinations of stroking movements with gross motor activity, such as shaking or twisting the device. Use of magnetometers for bearing allows the controller to be used for pointing at objects in mobile spatial interaction settings, where the rubbing can then be used to tease out properties of the content being pointed at.
- the controller also provides the possibility of initiating a link between a controller and a corresponding device.
- For example, consider the controller 10 of Figs. 1-6 and the controller 50 of Figs. 7-11. It is possible to rub a control surface of controller 10 against a control surface of controller 50.
- the vibrational signal generated in each controller will be similar, but will strongly depend on the control surface used for the rubbing interaction.
- Such information may be used, for example, in place of Bluetooth pairing security information (typically such security information is inputted manually into each device to be paired, which is cumbersome).
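- One way this might work in practice is sketched below; the fingerprinting scheme, band count, similarity threshold and the assumption that the recording is longer than a few hundred samples are all illustrative, not taken from the patent.

```python
import numpy as np

# Hedged sketch: each device reduces the vibration it recorded during the rub
# to a coarse spectral fingerprint, and the pairing is accepted only if the
# two fingerprints agree closely.  Parameters are illustrative assumptions.

def spectral_fingerprint(signal, bands=32):
    """Coarse, normalised log-magnitude spectrum of the recorded rub.

    Assumes the recording contains at least ~2*bands samples.
    """
    signal = np.asarray(signal, dtype=float)
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    edges = np.linspace(0, len(spectrum), bands + 1, dtype=int)
    banded = np.array([spectrum[a:b].mean() for a, b in zip(edges[:-1], edges[1:])])
    banded = np.log1p(banded)
    return banded / (np.linalg.norm(banded) + 1e-12)

def fingerprints_match(fp_a, fp_b, threshold=0.9):
    """Cosine-similarity test between the two devices' fingerprints."""
    return float(np.dot(fp_a, fp_b)) >= threshold
```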
- Figs. 27-29 show views of an alternative controlling mechanism for an embodiment of the present invention.
- The mechanism includes a sheet of material 100 (e.g. a metallic sheet or a plastics sheet) in which a deformable dome 102 is formed.
- the deformable dome is itself a permanently plastically deformed area of the sheet 100.
- Fig. 28 shows a plan view of the sheet 100 and dome 102.
- Fig. 29 shows a cross sectional view along lines X- X in Fig. 28.
- the dome is not yet deformed elastically.
- the dome is a control element, deformation of the dome providing classifiable sounds for providing a corresponding control signal.
- Figs. 30-32 show the sheet 100 and dome 102 when the dome is deformed elastically, e.g. by being pressed by a digit of a user.
- the central part 104 of the dome 102 is depressed downwardly into and possibly through the plane of the sheet 100. This is accommodated by elastic bending deformation of the side walls 106, 108 of the dome 102.
- the elastic bending deformation of these side walls will be at least 0.5% and may be significantly more.
- the dome is prevented from increasing its outer circumference (i.e. "spreading out") due to being constrained by the remainder of sheet 100.
- During depression of central portion 104 of the dome, therefore, there comes a point at which it is more energetically favourable for the centre of the dome to be depressed still further, due to the constraints of the sheet.
- This is accommodated as shown in Fig. 32 by the dome taking up an M-shaped configuration in cross section.
- the dome is bistable. Energetically, its most favoured configuration is as shown in Figs 27-29. However, when depressed past a certain point, it moves to the second most favoured configuration as shown in Figs. 30-32.
- When the dome moves to the configuration shown in Fig. 32, it emits a click sound. This sound is characteristic of the dome. Also, when the dome is released, it emits another, different, characteristic click sound. These sounds can be classified in a similar manner to the discussion above for other vibrational sounds.
- the present inventors have confirmed that slightly different shaped or sized domes can provide different, classifiable sounds on press and release.
- a pair of flexible metallic domes of slightly different size were mounted on a firm surface.
- The domes had the same overall configuration, but the diameter of one was about 1.5 times the diameter of the other.
- a piezo microphone was also attached to the firm surface. On press and release of the domes, the clicks from the domes were recorded with minimal noise from the piezo microphone, which essentially only picks up vibrations in the surface and is insensitive to airborne sound.
- Fig. 33 shows the time series and Fig. 34 shows the spectrogram for the first dome, on press and release.
- Fig. 35 shows the time series and Fig. 36 shows the spectrogram for the second dome, on press and release.
- the dome response is different for the different domes. These differences are significant enough to allow each sound to be classifiable with respect to the others.
- the signals can be classified by transforming them to the frequency domain via FFT and then running them through a simple neural network. This can be trained with a number of example clicks from each of the flexible domes. In order to minimize the training required, the click events can be automatically detected via simple thresholding. Each event can then be aligned so that there are no offsets in the timing of the signals.
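- A minimal sketch of such a pipeline is given below, under assumptions: click events are detected by simple amplitude thresholding, aligned to their onsets, transformed to the frequency domain with an FFT, and classified. A nearest-centroid classifier is used here in place of the simple neural network mentioned above; the frame length, threshold and feature normalisation are illustrative.

```python
import numpy as np

# Hedged sketch of the click-classification procedure: threshold-based event
# detection, onset alignment, FFT magnitude features, and a nearest-centroid
# classifier standing in for the simple neural network mentioned in the text.
# FRAME and ONSET_THRESHOLD are illustrative values.

FRAME = 512              # samples kept from the onset of each click
ONSET_THRESHOLD = 0.1    # amplitude at which a click is deemed to start

def detect_onset(signal):
    """Index of the first sample whose magnitude crosses the threshold, or None."""
    above = np.abs(np.asarray(signal, dtype=float)) >= ONSET_THRESHOLD
    return int(np.argmax(above)) if above.any() else None

def click_features(signal):
    """Align the click to its onset and return its normalised magnitude spectrum."""
    onset = detect_onset(signal)
    if onset is None or onset + FRAME > len(signal):
        return None
    frame = np.asarray(signal[onset:onset + FRAME], dtype=float) * np.hanning(FRAME)
    mag = np.abs(np.fft.rfft(frame))
    return mag / (np.linalg.norm(mag) + 1e-12)

class CentroidClickClassifier:
    """One mean spectrum per dome; new clicks are labelled by nearest centroid."""
    def __init__(self):
        self.centroids = {}

    def train(self, label, example_signals):
        feats = [f for f in (click_features(s) for s in example_signals) if f is not None]
        self.centroids[label] = np.mean(feats, axis=0)

    def classify(self, signal):
        f = click_features(signal)
        if f is None:
            return None
        return min(self.centroids, key=lambda k: np.linalg.norm(f - self.centroids[k]))
```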
- Embodiments are also envisaged in which the control element is other than a dome.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A controller has at least one control element for operation by a user, for example by being deformed or displaced by the user's digits. The controller has a vibrational sensor for detecting a first vibrational signal caused by deformation or displacement of the control element due to a contact action caused by a user and/or a recovery from such a deformation or displacement of the control element. The controller may also have a textured surface for being scratched or stroked by the user to cause corresponding vibrational signals. The controller further includes processing means for processing the vibrational signal to enable classification into corresponding characteristic classes of vibrational signal from which a control signal is derivable. The controller may be used, for example, to control a music player or a mobile telephone.
Description
CONTROLLER
The present invention relates to a controller and to methods of operation of such a controller. The invention has particular, but not exclusive, relevance to controllers that allow a user to control portable electronic devices such as media players.
In the past, portable electronic devices have been controlled using an array of switches or buttons, each of these being dedicated in the sense that they have a single function. More recently, portable electronic devices have been provided with a controller in the form of multifunctional user interface that provides different functions depending on how it is used. An example of this is a touch screen, such as that used in certain portable media players of Apple, Inc. (formerly Apple Computer, Inc.) .
Touch screens and touch pads> typically operate using capacitive sensing to detect the location of a user' s finger or thumb (collectively referred to herein as a user's digits). Some touch screens are capable of sensing the location of more than one user' s digit at a time on the screen, and providing the required functionality based on the way in which the user's digits move on the screen.
Antonacci et al (Antonacci, F., Gerosa, L., Sarti, A., Tubaro, S., and Valenzise, G. "Sound-based classification of objects using a robust fingerprinting approach" Proceedings of EUSIPCO 2007: http : //suono . como .polimi . it/upload/ent3/l/fingerprinting -pdf accessed 5 December 2007] ) describe a system for classifying different sounds that may be sensed by a HMI (human-machine interface) . The technique uses digital audio fingerprinting, which is computationally demanding. The classification technique is based on a distance metric, in comparison with a training set of signals. The signals assessed include scratching signals, collected from arbitrary objects such as a display board.
Baudisch et al (Baudisch, P., Sinclair, M., and Wilson, A. "Soap: a pointing device that works in mid-air" UIST '06: Proceedings of the 19th annual ACM symposium on User interface software and technology, ACM Press (New York, NY, USA, 2006, 43-46) disclose a computer pointing device that is held in a user's hand, not in contact with a flat surface. In this device, a core part moves relative to an outer shell. The shell includes fabric and may stretch elastically in use. This device may provide some tactile feedback to the user.
Bornand et al (Bornand, C, Camurri, A., Castellano, G., Catheline, S., Crevoisier, A.,Roesch, E., Scherer, K., and Volpe, G. "Usability evaluation and comparison of prototypes of tangible acoustic interfaces" Proc. Of ENACTIVE05 www.taichi.cf.ac.uk/files/108 FinalPaper.pdf
[accessed 5 December 2007]) disclose the use of tangible acoustic interfaces. This document aims to use everyday objects as control devices, e.g. table tops. The sound inputs include tapping at different locations, e.g. to control a media player. The aim of this document is to determine the location of the sound input. The location is used to determine, in effect, the position of a cursor on a display.
O'Modhrain and Essl (O'Modhrain, S., and Essl, G. "Pebblebox and crumblebag: tactile interfaces for granular synthesis"
NIME '04: Proceedings of the 2004 conference on New interfaces for musical expression, National University of Singapore (Singapore, 2004), 74-79) disclose haptic controllers for computer-based musical instruments. The core of the technology in this document is to use granular or pebble-shaped material, and to detect the sounds produced by their interaction.
Williamson et al (Williamson, J., Murray-Smith, R., and Hughes, S. "Shoogle: Multimodal excitatory interaction on mobile devices" CHI '07: Proceedings of the SIGCHI conference
on human factors in computing systems, ACM Press (New York, NY, USA, 2007), 121-124) disclose inertial sensing to detect motion of a control device, the user actively moving the device in order to elicit an output. The device responds by providing an auditory output. The example given in the document is of a user shaking a mobile phone, and if the phone "contains" a message, the phone emits a metallic clanking noise to indicate the presence of a message.
Ronkainen et al (Ronkainen, S., Hakkila, J., Kaleva, S.,
Colley, A., Linjama, J., "Tap input as an embedded interaction method for mobile devices" TEI'07, 15-17 Feb 2007, Baton Rouge, LA, USA, 263-270) disclose the use of tapping, detected by an accelerometer, to provide a user interface to control a device (e.g. mobile phone) . Other movements are also possible, such as slapping and shaking. Thus, this document discloses the use of different movements and contacts with the device, triggering different control functions. Feedback is provided to the user in the form of vibration.
Hummels, C, et al (Hummels, C, Overbeeke, K.C.J. , Klooster, S., "Move to get moved: a search for methods, tools and knowledge to design for expressive and rich movement-based interaction" Pers Ubiquit Comput (2007) 11:677-690) disclose a controller for a music player, in which a furry control surface is provided on the controller. The position of the
user's finger on the furry control surface is detected via capacitive sensing.
The present inventors have realised that a useful controller for a device may be based on a combination of tactile input and vibrational sensing. This means that the user can more easily control the device when it is out of sight, e.g. in a pocket, and the tactile feedback sensations experienced by the user are a close representation of the vibrations sensed by the controller. This is considered to be a first development of the present invention.
In a first preferred aspect of the first development, the present invention provides a controller including at least one control surface for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface, at least one of the control surface and the user- operated surface being a textured surface, and processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, corresponding to the first characteristic class of vibrational signal.
In a second preferred aspect of the first development, the present invention provides a device including a controller according to the first aspect, the controller being operable to control one or more functions of the device.
In a third preferred aspect of the first development, the present invention provides a use of a controller according to the first aspect to control a device, the use including a user interacting with the controller to provide a translational contact motion between the user-operated surface and the control surface.
In a fourth preferred aspect of the first development, the present invention provides a method of operation of a controller according to the first aspect, the method including the steps of receiving first vibrational data corresponding to a first vibration signal, classifying the first vibrational data into a first characteristic class of vibrational data, and outputting control data corresponding to the first characteristic class of vibrational data.
In a fifth preferred aspect of the first development, the present invention provides a computer program for carrying out the method of the fourth aspect. The computer program may, for example, be loaded onto a computer system. The computer
program may, for example, be stored on a storage medium such as a computer disk.
In a sixth preferred aspect of the first development, the present invention provides a system including a controller having at least one control surface for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface, at least one of the control surface and the user-operated surface being a textured surface, and the system providing control means for classifying the first vibrational signal into a first characteristic class of vibrational signal and thereby outputting a control signal corresponding to the first characteristic class of vibrational signal.
In a seventh preferred aspect, the present invention provides an external cover layer for a controller, the cover being separate or separable from a body of the controller, the external cover layer having a control surface for operation by a user, the control surface being a textured surface.
In this seventh aspect, it is preferred that the body of the controller includes at least one vibrational sensor for detecting a first vibrational signal corresponding to a
translational contact motion between the control surface and a user-operated surface and processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, corresponding to the first characteristic class of vibrational signal. Alternatively, the vibrational sensor may be included as part of the external cover layer.
In an eighth preferred aspect, the present invention provides a kit of one or more external cover layers according to the seventh aspect and the body of the controller as set out with respect to the seventh aspect.
In a ninth preferred aspect, the present invention provides a system including a controller according to the first or sixth aspect and a device including the user-operated surface, for translational contact motion with the control surface.
Preferred and/or optional features of the invention will now be set out. These are applicable singly or in any combination with any aspect of any development of the invention, unless the context demands otherwise.
It is preferred that the control surface is textured. However, the user-operated surface may additionally or
alternatively be textured, in order to provide the vibrational signals to the controller.
Preferably, the user-operated surface is a surface of an object which the user can control to move relative to the control surface. For example, the user-operated surface may be a surface of the user himself or herself, e.g. skin. However, the user's clothing may be used. Most preferably, the user uses one or more digits of one or both hands to interact with the controller. "Digit" here is intended to include fingers and/or thumbs. However, it is not necessarily excluded that other parts of the user may interact with the device, e.g. parts of the face or head, non-digit parts of the hand (e.g. palm), etc. In practical use, the controller may be operated most successfully with any part of the body that can provide the user with suitable feedback as to which part of the controller is being touched. A particularly preferred mode of operation of the controller involves the user using his or her finger or thumb nail(s) to interact with the controller.
The user-operated surface need not be a part of the user or a part of the user's clothing. Additionally or alternatively, the user-operated surface may be all or a part of the surface of another device. This is of particular interest where the controller needs to identify the device with which it is in
translational contact motion. For example, when the user-operated surface includes texture, the texture may identify the device, via suitable encoding. Thus, rubbing (for example) of the user-operated surface against the control surface allows the controller (or ancillary equipment) to identify the device by the vibrational signals caused by the rubbing. This may be used, for example, to identify the device to the controller and may therefore take the place of, or be additional to, security procedures to identify the device to the controller. Of particular interest here is the pairing of devices, e.g. via a wireless link such as Bluetooth, which typically requires the entry of identification numbers (PINs) or other codes manually in order to achieve pairing. It is preferred that the device also includes a vibrational sensor in order to detect the vibrational signals for classification. Typically, the vibrational signals sensed in the controller and in the device are substantially identical.
Preferably, the controller includes a plurality of control surfaces. Preferably, each of the plurality of control surfaces is a textured surface. The texture of the control surfaces may differ between the control surfaces. Suitable preferred textures are set out below.
References to the texture of the control surface(s) are to be read as applying also to the possible texture(s) at the user-operated surface(s).
Preferably the texture of the at least one control surface is one that can be discerned by a user. For example, preferably a user is capable of discriminating between the texture of the at least one control surface and another surface of the controller that is other than the control surface. Such other surfaces may be substantially smooth, or may be another control surface .
The texture of the at least one control surface is preferably a systematic variation in the height profile of the control surface from an average height of the control surface. Suitable patterns of variation may be generated algorithmically. For example, for a portion of control surface of at least 0.5 cm2 area (in plan view), preferably the systematic variation of the height profile of the control surface from an average height of the 0.5 cm2 area has an amplitude of at least 0.1 mm. More preferably this amplitude is at least 0.2 mm, at least 0.3 mm, at least 0.4 mm or about 0.5 mm.
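By way of illustration only (this is not the texture shown in the figures), a height profile of the kind described could be generated algorithmically as sketched below. The saw-tooth form, the gradually widening pitch and all dimensions are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of algorithmic texture generation as a height map: asymmetric
# saw-tooth ridges whose pitch widens gradually across the surface, so that
# both position and stroking direction change the vibration produced.  All
# dimensions are illustrative; a real part would be exported to the CAD or
# 3D-printing toolchain at the required resolution.

def sawtooth_ridge_texture(width_mm=20.0, depth_mm=10.0, resolution_mm=0.05,
                           amplitude_mm=0.5, min_pitch_mm=0.6, max_pitch_mm=1.2):
    """Return a 2-D height map (in mm) of ridges with gradually varying pitch."""
    x = np.arange(0.0, width_mm, resolution_mm)
    y = np.arange(0.0, depth_mm, resolution_mm)
    # Pitch grows linearly with x; integrating 1/pitch gives the ridge phase.
    pitch = min_pitch_mm + (max_pitch_mm - min_pitch_mm) * x / width_mm
    phase = np.cumsum(resolution_mm / pitch)
    # Saw-tooth profile: gradual rise, sharp drop at each ridge (asymmetric,
    # so the vibrational signal differs with stroking direction).
    profile = amplitude_mm * (phase % 1.0)
    return np.tile(profile, (len(y), 1))            # ridges run along y
```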
Preferably the texture includes one or more of: ridges, troughs, dimples, pimples, pillars, rods. Ridges and troughs
may be considered to be line-type texture. Dimples, pimples, pillars and rods may be considered to be island-type texture. Other suitable examples of each type of texture will be apparent. More than one class of each type of texture may be present. For example, there may be coarse line-type texture overlaid with relatively finer line-type texture.
When line-type texture is present, in the form of ridges and/or troughs for example, these may be arrayed as straight ridges and/or troughs. They may be curved ridges and/or troughs. They may be substantially parallel to each other. Alternatively, they may be arrayed at least partially in a fan-type array, or wheel-spoke array. The ridges and/or troughs, when viewed in cross section in a direction in which the ridges and/or troughs extend (at least locally) , may be substantially symmetrical about a maximum or minimum height of the ridge and/or trough. However, preferably they are substantially asymmetrical when viewed in this way. One reason for this is that the ridges and/or troughs may provide substantially different vibrational signals, depending on the direction of translation contact motion between the control surface and the user.
When the texture includes island-type texture, preferably the islands (which may be areas of locally raised profile and/or areas of locally depressed profile, substantially surrounded
by areas of non-raised or non-depressed profile) are systematically arrayed. This may be in a regular lattice array. The lattice array may have translational and/or rotational symmetry. The individual islands themselves may have rotational symmetry (e.g. 2-fold, 3-fold, 4-fold, 5-fold, 6-fold or higher). However, it is preferred that the individual islands have only 1-fold rotational symmetry. This asymmetry of the shape of the islands allows the islands to provide substantially different vibrational signals, depending on the direction of translational contact motion between the control surface and the user. For example, the islands may have a tear-drop shape when viewed in plan view. The islands, when viewed in cross-section perpendicular to the plan view, may have a square, rounded, triangular, curved triangular or conic section shape. In some cases, the islands, when viewed in this direction, may be substantially asymmetrical, for a similar reason as explained above with respect to the line texture.
The spacing between adjacent lines or islands in the texture may be of the same order as the height of the texture. For example, where the lines or islands in the texture have areas of maximum height, the centre-to-centre spacing between adjacent areas of maximum height may be 0.1 mm or more. More preferably this centre-to-centre spacing is at least 0.2 mm, at least 0.3 mm, at least 0.4 mm, at least 0.5 mm, at least
0.6 mm, at least 0.7 mm, at least 0.8 mm, at least 0.9 mm, or about 1 mm, or higher.
The spacing of the texture may vary across the control surface. This may be a gradual variation. Alternatively, this may be a step-wise variation. For example, for the case of line-type texture, step-wise variation in spacing between adjacent lines may be achieved by systematically providing lines of different length in the texture. In this way, a first band of texture may be defined on the control surface having a first spacing and a second band of texture may be defined on the control surface (e.g. adjacent to the first band) having a second spacing, different to the first spacing. An advantage of this is that the first and second bands may provide substantially different vibrational signals when operated by the user, depending on the spacing of the texture.
Preferably the controller includes two or more control surfaces, each control surface having a different texture. This allows the different control surfaces to provide substantially different vibrational signals when operated by the user, depending on the texture of the control surfaces. There may be three, four, five or more such control surfaces.
The two or more control surfaces may be substantially coplanar. However, more preferably, the controller has a body
form and the two or more control surfaces are provided on non-coplanar surfaces of the body. For example, if the body has flat faces, preferably the control surfaces are provided on different faces of the body. If the body has at least one curved face, the control surfaces may be provided on different parts of the curved face, or on the curved face and on a different face. This allows the controller to be more easily operated by the user. Preferably at least one of the control surfaces has a convex shape. One or more of the control surfaces may have a concave shape. A single control surface may have a region of concavity and a region of convexity.
Where the controller includes a body, the control surfaces may be integrally formed with the body wall. For example, the control surfaces may be moulded into the body wall. In this way, the control surfaces may be efficiently manufactured. This also allows the control surfaces to be formed of the same material as the body wall.
The body may enclose a space. This space may be used to house the vibrational sensor and/or the processing means and/or the control means .
The control surface may be formed on an external layer that is separable from the body. The control surface may for example be replaceable using a different external layer having a
differently textured control surface. The present inventors have realised that this is an independent feature of this development, and is the reason for the definition of the seventh and eighth preferred aspects, above.
The control surface may be formed of a rigid material such as metal or alloy, rigid plastics or ceramics. Alternatively, the control surface may be formed of a resilient material, such as a synthetic or natural rubber material.
The user may interact with the control surface by one or more of scratching, rubbing, tapping, stroking.
The vibrational sensor may for example be a microphone, such as a contact microphone. Piezo microphones are suitable, for example. Where the controller includes a body, the vibrational sensor is preferably coupled to an internal wall of the body. In this way, the body may directly transmit vibrational signals from the control surfaces to the vibrational sensor. The coupling is preferably via a thin coupling layer, e.g. of thickness 5 mm or less. Suitable materials for this coupling layer include Bingham plastics, most preferably a Bingham plastic that provides adhesion between the vibrational sensor and the wall of the body. Such materials provide good signal quality for the vibrational sensor, and can assist in providing robustness against
interference from air-borne sound. Alternatively, an adhesive sheet may be used.
The processing means preferably receives an input signal from the vibrational sensor. As will be understood, this signal may correspond to vibrational signals across a range of frequencies. Preferably the processing means is operable to limit the bandwidth of the data corresponding to these signals for onward processing to a band at 10 kHz and below, more preferably 8 kHz and below, 6 kHz and below, 4 kHz and below and most preferably 2 kHz and below. This is because the most useful vibrational frequencies tend to be in the lower frequencies. Reducing the bandwidth in this way can reduce the load on later processing stages in the controller or system without losing a large amount of useful information.
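A minimal sketch of such a bandwidth-limiting step is given below: the sensor stream is low-pass filtered to the most preferred 2 kHz band and then decimated, reducing the load on later stages. The input sample rate, filter length and the windowed-sinc design are assumptions for illustration.

```python
import numpy as np

# Hedged sketch of the bandwidth-limiting step: low-pass filter the sensor
# stream to 2 kHz with a windowed-sinc FIR filter, then decimate.  The input
# sample rate and filter length are illustrative assumptions.

def limit_bandwidth(samples, fs_in=16000, cutoff_hz=2000.0, taps=101):
    """Low-pass to cutoff_hz and decimate to roughly 2*cutoff_hz."""
    samples = np.asarray(samples, dtype=float)
    # Windowed-sinc low-pass filter kernel (Hamming window).
    n = np.arange(taps) - (taps - 1) / 2.0
    h = np.sinc(2.0 * cutoff_hz / fs_in * n) * np.hamming(taps)
    h /= h.sum()                                  # unity gain at DC
    filtered = np.convolve(samples, h, mode="same")
    step = int(fs_in // (2 * cutoff_hz))          # e.g. 16 kHz -> 4 kHz
    return filtered[::step], fs_in // step
```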
The classification of the vibrational signals into the characteristic classes of vibrational signals may take place within the body of the controller, so that the output of the controller is a control signal. Alternatively, this classification may take place elsewhere in the system, such as in a device to be controlled. Preferably the classification takes place in control means, operable to output a control signal based on the classification of the vibrational signal.
Preferably the control means includes a store for storing training data. Preferably the training data includes training data sets corresponding to the different classes of vibrational signal, e.g. provided by the control surfaces of the controller under one or more type of interaction with a user. At least one of the training data sets is preferably associated with a non-control input such as miscellaneous noise .
Preferably the control means is operable to carry out a first classification on the vibrational signal (after processing) . This first classification may be a relatively rough classification, intended to be performed quickly, such as in real time. The output of the first classification may be a first classification output stream.
Preferably the first classification output stream is subjected to a second (or further) classification. The second classification is intended to smooth out fluctuations in the first classification output stream. The output of the second classification may be a second classification output stream.
Preferably the second classification output stream is used to determine the nature of a control signal to be applied to the device controlled by the controller. The control signal may be a discrete event control (e.g. "track number" or "next
track" or "previous track" for a music player) or it may be a continuous (or quasi-continuous) value control (e.g. "volume up" or "volume down" for a music player".
The controller may include active feedback means. For example, the controller may include active audio, visual and/or vibrating means, operable in the event that it is required to provide to the user an active feedback. Feedback such as an active vibration may provide an indication to the user that a task has been completed, or is in progress, for example. The active feedback may be provided at substantially the same time as the interaction of the user with the controller. However, it is preferred that the active feedback is delayed with respect to the interaction of the user with the controller. This delay may be at least 5 ms, for example. It may be longer, e.g. at least 50, 100, 200, 300, 400 or 500 ms. It is envisaged that the delay would not be greater than 5 seconds. Preferably the feedback is specific to the task being undertaken by the user, or the task that the user is instructing the controller (or a device controlled by the controller) to undertake, or a state of the controller (or of a device controlled by the controller).
The controller may include further sensing means, e.g. for sensing movement of the controller. For example, the controller may include at least one accelerometer (e.g. one-
axis, two-axis or three-axis accelerometers) and/or at least one angular rate sensor (e.g. one-axis, two-axis or three-axis angular rate sensors) . The controller may include at least one magnetometer (e.g. one-axis, two-axis or three-axis magnetometer) . This allows the controller or associated device to determine the orientation of the controller with respect to a magnetic field, e.g. the earth's magnetic field. The accelerometer allows the controller to determine its orientation with respect to the direction of gravity, when held steady. The magnetometer may then determine the direction of the earth's magnetic field at the device. This allows a bearing angle to be determined. Additionally or alternatively other position-determining means may be provided such as a GPS locator or similar.
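One way such a bearing could be derived is sketched below, under assumptions about sign conventions and axis layout (the accelerometer is taken to read the upward reaction to gravity when the controller is held steady, and a "forward" axis is chosen for the device); this is a standard tilt-compensated compass calculation, not a method specified in the patent.

```python
import numpy as np

# Hedged sketch of deriving a bearing angle from the accelerometer and
# magnetometer readings described above.  Axis conventions and the choice of
# "forward" axis are assumptions.

def bearing_degrees(accel, mag, forward=(0.0, 1.0, 0.0)):
    """Tilt-compensated bearing (0-360 degrees) from 3-axis accel and mag vectors."""
    accel, mag, forward = (np.asarray(v, dtype=float) for v in (accel, mag, forward))
    down = -accel / np.linalg.norm(accel)           # gravity direction
    east = np.cross(down, mag)
    east /= np.linalg.norm(east)                    # horizontal, pointing east
    north = np.cross(east, down)                    # horizontal, pointing north
    # Project the device's forward axis onto the horizontal plane.
    fwd_h = forward - np.dot(forward, down) * down
    angle = np.degrees(np.arctan2(np.dot(fwd_h, east), np.dot(fwd_h, north)))
    return angle % 360.0
```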
The controller may include one or more touch sensing means. For example, the touch sensing means may operate by capacitive sensing. Such sensing means may be operable to sense touch and translational contact movement of a user on the controller.
For some applications, it is preferred that the external surface of the controller has few apertures leading into the inside of the controller. For example, in some circumstances, the controller may be used in environments in which apertures in the controller could become clogged with extraneous matter
such as dust, dirt, sand, etc. The use of the textured control surfaces, and/or other non-apertured controls such as the touch sensing means, allows the controller to be formed with few apertures. For example, the controller may be substantially sealed. In this way, electronics within the controller may be protected from the external environment, and thus the controller may be more robust. Where the external casing of the controller (including the control surfaces) is formed from electrically conducting material, the internal components of the controller may additionally be protected from adverse electrical interference or electromagnetic interference.
The device is preferably a portable device such as a mobile communications device and/or a portable media player such as a music player and/or video player. The device may be used for the remote control of a second device. Thus, the device may operate as a computer mouse (or other graphical user interface pointing and indicating device) for a computer. Alternatively, the device may operate as a handheld media controller for audio, video or web applications, for example. Preferably the device is hand-held. One particularly suitable application is a wireless headset, e.g. for mobile communications .
Where the controller does not itself include the control means, but rather the control means is included in the device to be controlled, preferably there is provided a communications link between the controller and the device. This link may be a wireless link, for example a Bluetooth link.
The present inventors have realised that the first development may be modified in order to provide touch sensing in place of vibrational sensing, the texture of the control surfaces providing non-visual feedback to the user of the control surface being operated.
Accordingly, in a first preferred aspect of the second development of the present invention, the present invention provides a controller including at least first and second control surfaces for operation by a user, the control surfaces each being a textured surface, having different texture to each other, the controller including at least one touch sensor for detecting a touch of a user-operated surface on one or more of the control surfaces and processing means for processing a touch signal corresponding to the touch to enable classification of the touch signal into a first characteristic class of touch signal corresponding to the first control surface or into a second characteristic class of touch signal
corresponding to the second control surface, from which a control signal is derivable.
As will be appreciated, the second to ninth preferred aspects of the first development correspondingly apply to the second development, with the proviso that in the second development it is not necessary to provide or sense vibrational signals, but that instead touch signals are provided and sensed. However, the controller may provide vibrational feedback to the user, based for example on the sensed signals. Such feedback may be used to indicate to the user that a signal has been (or is being) received or that a function has been (or is being) carried out.
In the second development, the touch sensor may be common to both the first and second control surfaces. The touch sensor may conveniently be a capacitive sensor.
The preferred and/or optional features set out with respect to the first development may be applied either singly or in any combination to the second development.
In a third development of the present invention, the present inventors have realised that one or more deformable and/or displaceable control elements may be used in a similar manner to the control surface (s) of the first development, in order
to generate classifiable vibrational signals for use in generating corresponding control signals.
Accordingly, in a first aspect of the third development, the present invention provides a controller including at least one control element for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation of the control element, the controller further including processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, the control signal corresponding to the first characteristic class of vibrational signal.
In a second aspect of the third development, the present invention provides a method of operating a controller, the controller including at least one control element, at least one vibrational sensor and processing means, the method including the step of the user deforming or displacing the control element by causing a contact action on the control
element to take place, the vibrational sensor detecting a first vibrational signal caused by at least one of: the deformation of the control element; and a recovery from the deformation of the control element, wherein the processing means processes the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, the control signal corresponding to the first characteristic class of vibrational signal.
In a third aspect of the third development, the present invention provides a mobile electronic device including a controller according to the first aspect.
In a fourth aspect of the third development, the present invention provides a system including a controller having at least one control element for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element, the system providing control means for classifying the first vibrational signal into a first characteristic class of
vibrational signal and thereby outputting a control signal corresponding to the first characteristic class of vibrational signal .
In a fifth aspect of the third development, the present invention provides a use of a controller according to the first aspect or of a system according to the fourth aspect to control a device, the use including a user interacting with the controller to deform or displace the control element to cause a first vibrational signal.
In a sixth aspect of the third development, the present invention provides a method of operating a controller according to the first aspect or a system according to the fourth aspect, the method including the steps of receiving first vibrational data corresponding to a first vibration signal, classifying the first vibrational data into a first characteristic class of vibrational data, and outputting control data corresponding to the first characteristic class of vibrational data.
In a seventh aspect of the third development, the present invention provides an external cover layer for a controller, the cover being separate or separable from a body of the controller, the external cover layer having at least one control element, the control element being suitable for
providing a detectable first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element.
Preferably, the at least one vibrational sensor is included as part of the external cover layer, the vibrational sensor being for detecting said first vibrational signal.
In a ninth aspect of the third development, the present invention provides a kit of one or more external cover layers according to the eighth aspect and a controller according to the first aspect, or a mobile electronic device according to the third aspect.
The preferred and/or optional features set out with respect to the first development and/or second development may be applied either singly or in any combination to the third development.
Typically, the contact action which causes deformation or displacement of the control element is direct or indirect contact of the control element by the user. It is preferred that the control element is directly contacted by the user.
However, the contact may be indirect contact, for example via
one or more intermediate means, such as an outer wall of the controller or device. In either case, it is preferred that the contact between the user and the control element provides haptic feedback to the user. Preferably the haptic feedback takes the form of the user sensing by touch at least a component of the deformation and/or displacement of the control element. The control element may be pressed, stroked, twisted, scratched, pulled, rubbed, for example by the user' s hand(s) . However, it is not always necessary that the deformation takes place directly by the user's hands. For example, the user may hold the controller and cause the control element to be deformed by manipulation of the control element against a different surface, e.g. a part of the user other than the hand(s), or a wall, door, window, seat, floor, etc.
Preferably, the controller includes a plurality of control elements. In this case, it is preferred that the shape of the control elements differs between the control elements.
Preferably, the at least one control element includes, at least in part, one of a concave and a convex shape. The control element preferably has a first configuration before deformation or displacement and a second, different configuration during deformation or displacement. For example, in the case of a convex control element, preferably
the second configuration has, at least in part, a concave shape .
Various different suitable formats for the control element are envisaged. The control element may comprise a flexible wall member. The control element may have a three-dimensional shaped form. The control element may have a first configuration before deformation and a second, different configuration during deformation. Preferably both the first and second configurations are characteristic configurations. For example, the control element may be bistable, deformation of the control element from the first configuration allowing a second configuration to be reached, the second configuration being energetically relatively stable compared with other configurations, for the time that the user continues to apply a deformation force. An example of a bistable control element is a deformable dome. Such a dome typically has a first configuration in which the centre of the dome is upstanding. Deformation of the dome by pressing the centre of the dome causes the wall of the dome to deform locally. This is elastic deformation. With the application of sufficient pressure, the dome deforms to a partially inverted dome shape (the second configuration) . This occurs quickly once the dome is deformed beyond a threshold deformation, the second configuration of the dome being a stable configuration for as long as the pressure on the dome is maintained. Release of
the pressure on the dome allows the dome to spring back from the second configuration to the first configuration. The present inventors have noted that the deformation from the first configuration to the second configuration provides a characteristic vibrational signal. Similarly, release of the dome from the second configuration to the first configuration also provides a characteristic vibrational signal, which may be the same as or different to the vibrational signal caused by deformation from the first configuration to the second configuration.
Other deformable shapes can be used for the control elements. Such shapes may also have bistability (or, in general, multistability) . This is advantageous, because such multistable shapes more easily provide characteristic vibrational signals as they spring between first and second (and optionally third, fourth, etc.) configurations. Such characteristic vibrational signals may be classified with relative ease, thereby more surely determining the correct control signal corresponding to the vibrational signal. The characteristic vibrational signals provided by multistable shapes tend to have a low dependency on the speed of displacement and/or the pressure applied by the user.
Furthermore, multistable shapes are preferred because they can provide the user with suitable tactile feedback, indicating to
the user by touch that the control element has been suitably operated without need for the user to look at the controller. Particularly preferred here are control elements that click between the first and second configurations.
Suitable (e.g. multistable) control elements may be formed of a material having a relatively small thickness in at least one direction. For example, suitable multistable control elements may be formed by shaping a sheet material (e.g. by pressing, moulding, etc.). In this case, deformation of the control element from the first configuration to the second configuration may occur by locally elastically bending portions of the control element. In this case, the amount of elastic strain experienced at these portions may be significant. For example, the maximum elastic bending strain experienced during the deformation may be at least 0.01%, more preferably at least 0.02%, more preferably at least 0.05%, more preferably at least 0.1%, more preferably at least 0.15% and more preferably at least 0.2%. The maximum elastic bending strain may be about 0.5%. As the skilled person will appreciate, the amount of bending strain experienced locally by the control element will depend on the first and second configurations of the control element and certain embodiments of the invention may not experience bending strains within the ranges set out above.
Preferably, the amount of displacement applied to the control element during operation of the device requires that at least a portion of the control element is displaced by at least 0.1 mm, more preferably at least 0.2 mm, more preferably at least 0.5 mm, more preferably at least 0.8 mm, more preferably at least 1 mm, more preferably at least 1.5 mm, more preferably at least 2 mm. As the skilled person will appreciate, the amount of displacement experienced by the control element will depend on the first and second configurations of the control element (where the control element is multistable) and certain embodiments of the invention may not experience displacements within the ranges set out above.
It is not always necessary for the control element to be multistable. Other shapes for the control elements may provide suitable characteristic (and thus classifiable) vibrational signals. For example, a sliding control element may be provided which at or near a first limit of its travel provides the classifiable vibrational signal. Optionally, the sliding control element may provide a second classifiable vibrational signal at or near a second limit of its travel (e.g. in the reverse direction). The sliding control element need not necessarily spring back to an initial position after operation by the user. Instead, the user may choose when to move the sliding control element towards the first limit of travel and, independently, when to move the sliding control
element towards the second limit of travel. Again, it is preferred that suitable shapes provide the user with suitable feedback (e.g. tactile or haptic feedback) to indicate to the user that the control element has been suitably operated without need for the user to look at the controller.
The controller may have more than one control element. For example, two, three, four, five, six, seven, eight, nine, ten, eleven, twelve, thirteen, fourteen, fifteen, sixteen, seventeen, eighteen, nineteen or twenty or more control elements may be provided. Preferably, in operation, the control elements each provide a corresponding vibrational signal, different from and classifiable from the vibrational signals caused by operation of the other control element (s). The benefit of this is that the controller can then detect and classify the different vibrational signals in order to determine the appropriate control signal.
For example, it is considered that a mobile music player such as an MP3 player may have 10 or fewer control elements, e.g. 5 or 6. A mobile communications device such as a mobile telephone may require more control elements, typically more than 10, e.g. 12-16 control elements.
A significant benefit of using different control elements in this way is that, unlike electronic control buttons, each
element need not be electronically wired to the processor. Instead, the vibrational signals may be detected by the vibrational sensor via a (preferably solid) common vibration transmission path. Thus the structure of the controller is simplified compared with a controller having electronic control buttons .
The preferred embodiments of the invention are described below, with reference to the accompanying drawings, in which: Fig. 1 shows a plan view of a controller according to an embodiment of the invention.
Fig. 2 shows a rear side view of the controller of Fig. 1.
Fig. 3 shows a front side view of the controller of Fig. 1.
Fig. 4 shows a perspective view of the controller of Fig. 1. Fig. 5 shows a left side view of the controller of Fig. 1.
Fig. 6 shows a right side view of the controller of Fig. 1.
Fig. 7 shows a plan view of a controller according to another embodiment of the invention.
Fig. 8 shows a rear side view of the controller of Fig. 7. Fig. 9 shows a reverse plan view of the controller of Fig. 7.
Fig. 10 shows a left side view of the controller of Fig. 7.
Fig. 11 shows a right side view of the controller of Fig. 7.
Fig. 12 shows a schematic cross sectional view through a control surface for use in an embodiment of the invention. Figs. 13-16 show greyscale images of algorithmically-generated texture for use with embodiments of the invention.
Fig. 17 shows a spectrogram of vibrational signals recorded in an example of operation of an embodiment of the invention.
Figs. 18 to 21 show vibrational classification results derived from the vibrational signals of Fig. 17. Fig. 22 shows a representation of the probability of the input signal of Fig. 17 corresponding to a "next track" input signal .
Fig. 23 shows a representation of the probability of the input signal of Fig. 17 corresponding to a "previous track" input signal.
Fig. 24 shows the control of volume with time due to the input signal of Fig. 17.
Fig. 25 shows the control of track number with time due to the input signal of Fig. 17. Fig. 26 shows a representation of the input signal of Fig. 17.
Figs. 27-29 show views of an alternative controlling mechanism for an embodiment of the present invention.
Figs. 30-32 show the controlling mechanism of Figs. 27-29 when the dome is deformed elastically. Fig. 33 shows the time series and Fig. 34 shows the spectrogram for a first dome, on press and release.
Fig. 35 shows the time series and Fig. 36 shows the spectrogram for a second dome, on press and release.
Figs. 1 to 6 show a controller 10 according to one embodiment of the invention. Similar features are identified in these
drawings with the same reference numerals. The controller 10 has an overall configuration similar to a partially flattened egg shape, and this provides curved surfaces for location of the control surfaces.
Controller 10 has five differently-textured control surfaces 12, 14, 16, 18 and 20 visible in Fig. 1. Control surface 12 has no texture that is visible in Fig. 1, but the texture is shown partially in Fig. 4, consisting of a series of radially- arranged ridges 12a, centred on central feature 22. These ridges have a saw-tooth structure (preferably a smooth sawtooth structure) . The gap between the ridges varies uniformly as the user follows a ridge radially outwardly, due to the angle between the ridges. Thus, the user can better infer the position of his or her finger/thumb on this control surface
12. Control surface 12 is located in a lenticular depression at the upper surface of the controller 10.
Control surface 14 has a texture consisting of a series of rounded dimples 14a, arranged concentrically around control surface 12, but partially raised with respect to control surface 12.
Control surface 16 is located at one curved side surface of the controller 10. The texture of this control surface includes a regular array of pyramidal protrusions 16a. As
shown more clearly in Fig. 2, the array includes rows 16b of protrusions, the spacing between adjacent protrusions in each row varying from row to row, substantially systematically.
Control surface 18 is located at the front side surface of the controller 10. The texture consists of an array of rounded pimples 18a. The array is shown more clearly in Fig. 3. The array is substantially elliptical, concentrically arranged on the apex point 24 of the front surface of the controller.
Control surface 20 is located on the right side surface of the controller 10. It is shown more clearly in Fig. 6. The texture of control surface 20 comprises an interleaved array of raised lines 20a, 20b, 20c, of differing length. Lines 20a are relatively short, lines 20b are of medium length and lines 20c are relatively long. The lines are arranged so as to provide control surface 20 with three substantially parallel laterally-extending bands 20d, 20e, 20f of differing textures. Thus, the line-to-line spacing in band 20d is relatively small, the line-to-line spacing in band 20e is medium and the line-to-line spacing in band 20f is relatively large. Each band can therefore provide a different vibrational signal when in translational contact motion with the user-operated surface. The result is the provision of step-wise variation of the control surface, defining these three adjacent bands of
line texture. These textures can, for example, be used for zoom and position control respectively.
The controller 10 has a casing formed in two parts, joined around an equatorial line 26. Various apertures are formed in the casing, for an on-off switch 28, a data or power port 30, and other apertures 32, 34 as required.
A controller according to Figs. 1-6 was designed and manufactured in order to provide a prototype device for testing. The prototype was designed in Solidworks™ (a 3D computer aided design software package) and manufactured using SLA (stereolithography) resin 3D-printing technology. The precision of the printing was 0.1 mm (a tolerance of 'medium' for DIN ISO 2768).
The body case of the controller 10 encloses an internal space (not shown) to house various components (not shown) . This is discussed in more detail below.
The main aim of the case texture design is to provide a rich set of textures on the different control surfaces which can be easily recognised and accessed in a range of conditions, at least by touch, and which fit appropriately with the form of the controller device. The textures used can be varied to provide different audio and vibration responses, and to invite different styles of interaction (rubbing back and forth, stroking, scratching, picking with a fingernail, etc.). The vibrations generated by the user acting on the textures should be as easy to classify as possible.
As discussed above with reference to Figs. 1-6, the texture is composed of individual elements such as lines, dots, dimples or other geometric forms. The elements of the texture can be designed so that stroking the texture in different directions will give significantly different sounds. The material used for the case also has a significant effect. Different texture types, spacings and texture gradients offer different types of interface control.
The textures of control surfaces 12-20 provide different constraints, and therefore encourage different types of stroking action, allowing the generation of very different vibrational signals for later classification.
There will now be described an alternative embodiment of the controller, having a different array of control surfaces, with reference to Figs. 7 to 11.
In Figs. 7 to 11, similar features are identified in these drawings with the same reference numerals. The controller 50 has an overall shape that provides both curved and
substantially flat surfaces for location of the control surfaces.
Control surfaces 52, 54, 56, 58, 60, 62 are provided for the controller 50.
Similar to the control surface 12 of Fig. 1, the texture of control surface 62 is not shown in Fig. 7. However, as shown in Figs. 8 and 10, it consists of a partial radial array of saw-tooth ridges 62a (typically smooth saw-tooth ridges).
These are asymmetric, so that the vibration signals generated are different depending on the direction of scratching or rubbing.
Control surface 52 is an array of rounded pimples 52a spaced regularly but having different sizes. Control surface 52 is formed substantially in a saddle shape.
Control surface 54 includes an array of island texture, including tear-drop shaped islands 54a, arrayed in a substantially radial pattern. The tear-drop shaped islands are asymmetrical. The height of the islands above the remainder of the control surface is about 0.5 mm.
Control surface 58 also includes an array of island texture in the form of hexagonal pyramidal islands 58a, regularly arrayed
in a lattice formation. The hexagonal islands 58a are substantially symmetrical and have similar dimensions to the tear-drop islands 54a.
Control surface 56 includes an array of ridges 56a. The spacing between the ridges is larger at one end of control surface 56 than at the other. This allows these different parts of the control surface to generate different vibrational signals in operation.
Fig. 12 shows a partial enlarged sectional view through ridges 72 of a control surface 70. The ridges are asymmetrical, so that scratching or stroking the ridges in one direction provides a different vibrational response from scratching or stroking in the opposite direction.
Figs. 13-16 show greyscale images of algorithmically-generated texture for use with embodiments of the invention. In at least Figs. 14-16, there are formed ridges having a relatively coarse spacing. Overlaid over this texture is a finer texture of ridges (Fig. 14), raised pimples (Fig. 15) or dimples (Fig. 16) . The coarse texture allows the user to orient themselves on the device, and provides some component of the vibration signal. The finer texture provides the remainder of the vibrational signal, allowing a richer signal to be generated, thus potentially conveying more information.
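The overlay of coarse and fine texture described above can be generated algorithmically. The following is a minimal Python sketch of producing such a height map; the sinusoidal ridge and pimple forms, dimensions and amplitudes are illustrative assumptions, not the actual textures shown in Figs. 13-16.

```python
# Hedged sketch: a coarse ridge pattern with a finer pimple texture superimposed.
# All dimensions and amplitudes are illustrative assumptions.
import numpy as np

def texture_height_map(size=256, coarse_period=32, fine_period=8,
                       coarse_amp=1.0, fine_amp=0.25):
    y, x = np.mgrid[0:size, 0:size]
    # Coarse ridges: let the user orient themselves and contribute part of the signal.
    coarse = coarse_amp * 0.5 * (1 + np.sin(2 * np.pi * x / coarse_period))
    # Fine pimples: enrich the vibration signal with higher-frequency detail.
    fine = fine_amp * (np.sin(2 * np.pi * x / fine_period)
                       * np.sin(2 * np.pi * y / fine_period)) ** 2
    return coarse + fine
```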
Inside the outer shell of each controller 10 and 50 of Figs. 1-6 and Figs. 7-11 respectively is located a Bluetooth SHAKE (Sensing Hardware Accessory for Kinesthetic Expression) inertial sensor pack for sensing, as described in Williamson et al. The SHAKE model SK6 is a small form-factor wireless sensor pack with integrated rechargeable battery, approximately the same size as a matchbox. It features a tri-axis accelerometer, a tri-axis angular rate sensor, a tri-axis magnetometer, dual-channel analog inputs, dual-channel capacitive sensing and an internal vibrating motor. Communications are over a Bluetooth serial port profile. The SHAKE includes a powerful DSP engine, allowing real-time linear-phase sample rate conversion.
The vibrations caused by a user interacting with the shell are captured with a low-cost film-style piezo contact microphone (the PZ-01, obtained from Images SI Inc., 109 Woods of Arden Road, Staten Island NY 10312, USA, see: http://www.imagesco.com/catalog/sensors/film.html#pz-08 accessed 22 November 2007) which is attached to the inside of the body exterior with a thin (1 mm) malleable layer of Bingham plastics material (Bluetack™ in this case) covering 50% of the surface of the microphone. This provides significantly better signal quality than with a rigid mounting (there is an improvement of almost 12 dB in the sensitivity and a much increased high-frequency response). It also offers excellent robustness to interference from air-borne sound. Even in very noisy environments, vibrations from physical contact with the shell are of much greater amplitude than those caused by external noise. A custom expansion module was designed for the SHAKE that includes a high-impedance microphone data acquisition circuit and a vibration driver suitable for driving a linear vibration actuator. Since the purpose of the contact microphone is to sense the vibrations of the enclosure that surrounds the SHAKE, we limited the bandwidth to 2 kHz, as there is little useful information above this frequency and it reduces the load of further processing stages. Once the audio signal has been acquired by the custom expansion module, it is digitized and passed to the SHAKE microcontroller, where it is filtered, re-sampled, μ-law encoded and packaged to be sent to the host device (in this case a laptop computer running music playing software) over the Bluetooth radio using the serial port profile. The effective resolution of the microphone signal once received by the host is 13 bits and the -3 dB bandwidth is 1.5 kHz.
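The acquisition chain thus band-limits, resamples and companded-encodes the microphone signal before transmission. The Python sketch below illustrates that style of processing; the sample rates, filter order and 8-bit μ-law companding are assumptions for illustration, not the SHAKE firmware's actual implementation.

```python
# Hedged sketch of acquisition-side processing: band-limit to ~2 kHz, resample,
# then mu-law compress to 8-bit codes for transmission. Rates are assumptions.
import numpy as np
from scipy import signal

def prepare_for_transmission(x, fs_in=8000, fs_out=4000, cutoff=2000.0, mu=255):
    # Low-pass filter: little useful information above ~2 kHz.
    sos = signal.butter(4, cutoff, btype="low", fs=fs_in, output="sos")
    x = signal.sosfilt(sos, x)
    # Resample to the reduced rate used on the radio link.
    x = signal.resample_poly(x, fs_out, fs_in)
    # mu-law companding (as in ITU-T G.711), then quantise to 8-bit codes.
    x = np.clip(x / (np.max(np.abs(x)) + 1e-12), -1.0, 1.0)
    compressed = np.sign(x) * np.log1p(mu * np.abs(x)) / np.log1p(mu)
    return np.round((compressed + 1.0) * 127.5).astype(np.uint8)
```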
There will now be described the process for classification of the vibrational signals.
The sensed vibrations are classified in real-time, with signals from rubbing different areas of the device assigned to
discrete classes. We used a two-stage classification process, with low-level instantaneous classification and higher-level classifiers which aggregate the evidence from the first stage over time. This structure is well suited to real-time audio and vibrotactile feedback which can be a function of instantaneous classifications.
Before classification, the incoming audio is windowed and transformed into a suitable feature space. The signal is windowed with a Hamming window, 512 samples long
(corresponding to 1/8 second of data), with 7N/8 overlap. The classification stream therefore has a rate of 64 classifications a second. The Fourier transform of the windowed signal is taken, and the phase component discarded, leaving only the magnitude spectrum. The spectrum is then rebinned so that bins are four times their original size. These features are sufficient to separately classify the vibrational signals.
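A minimal Python sketch of this feature extraction follows. The 4096 Hz sample rate is an assumption consistent with "512 samples corresponding to 1/8 second"; the normalisation step at the end is also an assumption.

```python
# Hedged sketch of the feature extraction: Hamming-windowed 512-sample frames
# with 7N/8 overlap, magnitude spectrum, then re-binning into bins 4x wider.
import numpy as np

FRAME = 512
HOP = FRAME // 8   # 7N/8 overlap -> hop of N/8 -> 64 classifications/s at 4096 Hz

def feature_frames(x):
    """Yield one feature vector per analysis frame of the vibration signal x."""
    window = np.hamming(FRAME)
    for start in range(0, len(x) - FRAME + 1, HOP):
        frame = x[start:start + FRAME] * window
        mag = np.abs(np.fft.rfft(frame))              # discard phase, keep magnitude
        mag = mag[:256].reshape(-1, 4).sum(axis=1)    # re-bin: 4x wider bins
        yield mag / (np.linalg.norm(mag) + 1e-12)     # simple normalisation (assumption)
```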
In the low-level instantaneous classification, the feature vectors are classified by a multi-layer perceptron with 64 hidden units. The low computational and memory requirements of such a model produce very fast classification performance, suitable for implementation on mobile devices.
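As a hedged illustration only, such a classifier could be built with an off-the-shelf multi-layer perceptron; scikit-learn and the placeholder class names below are assumptions, not the implementation used in the prototype.

```python
# Sketch: single-hidden-layer perceptron (64 hidden units) over the 64-bin features.
from sklearn.neural_network import MLPClassifier

CLASSES = ["front_circular", "side_dimples", "tip_fingernail",
           "other_surface", "misc_noise"]   # placeholder names, not from the source

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)
# X_train: (n_frames, 64) feature vectors; y_train: class index per frame.
# clf.fit(X_train, y_train)
# Instantaneous class stream, one label per 1/64 s frame:
# labels = clf.predict(X_test)
```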
Five different classes are trained and stored in a store in the SHAKE. These classes are: scratching circular front clockwise, scratching dimples on right side, scratching tip with fingernail and a miscellaneous noise class. Each class is trained on 120 seconds of input data, with a range of speeds of motion, and a variety of grip postures and pressures. The way the device is held may affect the body resonances of the exterior shell, and thus the vibrational signals picked up by the vibrational sensor. All data in these examples is captured with the shell held in one hand, while being rubbed with the finger of the other hand. In these examples, the surface is stimulated with the back of the fingernail. The miscellaneous noise class includes recordings of the device being manipulated in the hands, being placed in a pocket, picked up and replaced on a table and other background disturbances. We also tested sensitivity to loud noises near the device, but these had negligible effect.
The classifier was trained on 26880 examples, and tested on 11520 unseen pairs, and identifies the different regions of the device with 75% accuracy for these five classes, based on 1/64th of a second of data. Although this seems relatively low, the high rate of classifications (64/s) means that simple integrators can aggregate evidence from the stream of instantaneous classifications into useful control signals.
In the higher-level classification, the output stream of the low-level instantaneous classification is used as input to a simple dynamic system. This smooths out the fluctuations in the classifier. The dynamic system can support discrete events and continuous values. Complex recurrent classifiers can be used if desired. Here, for discrete events, the system functions as a leaky integrator, which triggers an event once the integrated value crosses a predetermined threshold. After this threshold is crossed, the integrator is inhibited for a short period.
x_{c,i} = x_{c,i-1} + f if the i-th instantaneous classification is class c, and x_{c,i} = k * x_{c,i-1} otherwise, where 0 < k < 1 governs the decay of the integrator and f gives the increase per classification.
Continuous outputs (such as the volume control in the following discussion) are directly integrated and then clipped to the appropriate range.
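A sketch of this higher-level aggregation is given below: a per-class leaky integrator that fires a discrete event on crossing a threshold and is then briefly inhibited, plus direct integration with clipping for continuous controls. The constants are illustrative assumptions, not values from the source.

```python
# Hedged sketch of the higher-level aggregation stage.
class LeakyIntegrator:
    def __init__(self, k=0.95, f=0.08, threshold=1.0, refractory=32):
        self.k, self.f, self.threshold, self.refractory = k, f, threshold, refractory
        self.x = 0.0
        self.inhibit = 0

    def update(self, active):
        """Feed one instantaneous classification; return True when an event fires."""
        if self.inhibit > 0:
            self.inhibit -= 1
            self.x = 0.0
            return False
        self.x = self.x + self.f if active else self.k * self.x
        if self.x >= self.threshold:
            self.x = 0.0
            self.inhibit = self.refractory   # suppress re-triggering briefly
            return True
        return False

def update_volume(volume, active_up, active_down, step=0.01):
    """Continuous control: integrate classifications directly, then clip to range."""
    volume += step * (active_up - active_down)
    return min(max(volume, 0.0), 1.0)
```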
Suitable interaction techniques will now be described.
The style of interaction with the controller is typically one where the device is held in one hand, and can either be activated by thumb and fingers of that hand, or in a bimanual
fashion using both hands. The user scratches or rubs the device along its various control surfaces and this generates changes in the interaction.
Since the control surfaces may have different textures, there can be a mapping between these and equivalent key-presses. While possible, and in some cases useful, this is not the primary interaction mechanism envisaged. Stroking motions feel quite different to button-pushes, and are more appropriate for linking to gradual changes in values, such as volume control, zooming, browsing, etc. They are also useful for pushing, pulling and probing actions, and because of the drag in the texture, are a good fit to stretching actions (e.g. zooming) .
The intention of using this style of interaction is that the user can navigate through a high-dimensional state space, generating incremental changes in state, being pulled or pushed by their stroking actions. The fact that there may be many different textures allows control of multiple degrees of freedom in this manner. In many cases it will be of utility to map properties of the variable controlled to the type of texture. This can relate to the perceived nature of the texture, rough, smooth, spiky, compared to the function it controls, and also to the properties of the spacing of elements (e.g. a log-scale on separation for zooming tasks) .
The structure allows both discrete increments, when the user 'picks' at a single textural feature of a control surface, and continuous ones, where the user brushes through several such features. Depending on the parameterisation of the classification dynamics, partial completion of a stroke can give initial preview information about the consequences of continuing that action. When the user then continues the stroke, the threshold is reached, and the associated action is performed.
Preferred features of augmented feedback will now be described.
While the proprioceptive feedback inherent in the texture is a key benefit of the technique, it can be important to augment this with software-controlled audio and vibrotactile feedback. The controller may therefore have an in-built pager motor in the SHAKE module, and an additional VBW32 actuator [http://tactaid.com accessed 21 November 2007] for higher-frequency components. The augmentation of the raw texture with application-specific sound and vibration makes this more feasible, and thus we partition the classification component into multiple levels, so that we can provide instantaneous augmented feedback. The augmentation permits the component textures of a specific device to be made to appear as a range of different media, which invite different styles of interaction, at different rates and rhythms. The user can potentially learn the affordances of the controller just by manipulating it, and feeling the changing responses to stroking actions, where each mode of the system might be associated with subtle changes in the response behaviour of the system.
An example of a media player (in this case a music player) being controlled by the controller will now be described.
The controller was implemented as a user interface for a music player, which is controlled by scratch-based interaction with appropriate mappings from control surfaces to controls. The use case scenario is a user walking, listening to their music player, and controlling the volume and track choice while the controller is in their jacket pocket. The major actions used are start/stop (controlled by tapping) , volume adjustment and track change. Each of the classified outputs is fed to an integrator. The output of this integrator is either used directly (for volume control), or is thresholded to activate events (for track changes) . This results in reliable control, even though the underlying classification may have regular glitches. The textures are easily navigated by the user by touch alone, and the system was tested with five different
users, who were able to use it without problems, despite the system being calibrated for a single user.
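Purely as an illustration of the mapping just described, the sketch below consumes the instantaneous class labels and drives a volume value and track-change events; the label names, constants and the commented player calls are hypothetical, not the prototype's code.

```python
# Hedged sketch of the use-case mapping: volume is integrated directly and clipped;
# track changes fire when a per-class leaky accumulator crosses a threshold.
volume = 0.5
acc = {"track_forward": 0.0, "track_back": 0.0}

def on_frame(label, step=0.01, f=0.08, k=0.95, threshold=1.0):
    """Consume one classification (64 per second) and update the player state."""
    global volume
    if label == "volume_up":
        volume = min(volume + step, 1.0)   # direct integration, then clip
    elif label == "volume_down":
        volume = max(volume - step, 0.0)
    for cls in acc:                         # leaky accumulation per event class
        acc[cls] = acc[cls] + f if label == cls else k * acc[cls]
        if acc[cls] >= threshold:
            acc[cls] = 0.0
            # player.next_track() / player.previous_track()  -- hypothetical calls
```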
Figs. 17-26 show data for a session in which the user flicks forward two tracks of music, lowers then raises the volume, then flicks back two tracks. Fig. 17 shows a photograph of a spectrogram of the vibrational signals picked up by the piezo microphone. Fig. 26 shows the corresponding amplitude waveform of the signal. Figs. 18 to 21 show classification events for four respective classifications. Figs 22 and 23 show the integrated values from the classification events, which approximate to:
P(c_i | x(t)). Figs. 24 and 25 show the changes in controlled variables (volume and track number).
The controller may be fitted with capacitive sensors, in order to sense touch. Such sensors may be separate from or coincident with the control surfaces of the controller.
The example described above demonstrated that it is possible to robustly classify stroking movements on the control surfaces of the controller, using vibration sensor information alone. The tactile feedback from the physical case can be augmented with context-dependent audio and vibration feedback. The texture provides immediate feedback to the user about the
likely consequences of their actions, and they can be used in an eyes-free context, such as in the user's pocket.
The simplicity of the case technology provides the potential for user-driven design. Thus, it is possible to create 'skins' for mobile devices that provide more than visual aesthetics - they allow designs customized for specific families of applications based on the controller described herein.
The controller also provides a research tool. Operation of the inertial sensors of the SHAKE allows the exploration of combinations of stroking movements with gross motor activity, such as shaking or twisting the device. Use of magnetometers for bearing allows the controller to be used for pointing at objects in mobile spatial interaction settings, where the rubbing can then be used to tease out properties of the content being pointed at.
The controller also provides the possibility of initiating a link between a controller and a corresponding device. For example, consider the controller 10 of Figs. 1-6 and the controller 50 of Figs. 7-11. It is possible to rub a control surface of controller 10 against a control surface of controller 50. The vibrational signal generated in each controller will be similar, but will strongly depend on the
control surface used for the rubbing interaction. Thus, it is possible to encode information in the relevant control surfaces, this information only becoming available when the two control surfaces are brought together and rubbed together as described. Such information may be used, for example, in place of Bluetooth pairing security information (typically such security information is inputted manually into each device to be paired, which is cumbersome) . Thus, rubbing the two controllers together allows them to be paired in a way that is intuitive and simple, yet secure.
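Purely speculatively (this is not described in the source), each device could derive a pairing token by quantising the spectrum of the vibration it records during the rubbing interaction and hashing the result; a practical scheme would need an error-tolerant key-agreement step, since the two recordings will differ slightly. The sketch below illustrates the idea only.

```python
# Speculative sketch: derive a pairing token from a rubbing-interaction recording.
import hashlib
import numpy as np

def pairing_token(vibration, n_bins=32):
    mag = np.abs(np.fft.rfft(vibration))
    # Average consecutive bins into n_bins coarse bands for robustness.
    mag = mag[:n_bins * (len(mag) // n_bins)].reshape(n_bins, -1).mean(axis=1)
    bits = (mag > np.median(mag)).astype(np.uint8)   # coarse binary quantisation
    return hashlib.sha256(bits.tobytes()).hexdigest()
```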
Figs. 27-29 show views of an alternative controlling mechanism for an embodiment of the present invention. In Fig. 27, a sheet of material 100 (e.g. a metallic sheet or a plastics sheet) has a deformable dome 102 formed in it. The deformable dome is itself a permanently plastically deformed area of the sheet 100. Fig. 28 shows a plan view of the sheet 100 and dome 102. Fig. 29 shows a cross-sectional view along line X-X in Fig. 28. In Figs. 27-29, the dome is not yet deformed elastically.
The dome is a control element, deformation of the dome providing classifiable sounds for providing a corresponding control signal.
Figs. 30-32 show the sheet 100 and dome 102 when the dome is deformed elastically, e.g. by being pressed by a digit of a user. The central part 104 of the dome 102 is depressed downwardly into and possibly through the plane of the sheet 100. This is accommodated by elastic bending deformation of the side walls 106, 108 of the dome 102. Typically, the elastic bending deformation of these side walls will be at least 0.5% and may be significantly more.
The dome is prevented from increasing its outer circumference (i.e. "spreading out") due to being constrained by the remainder of sheet 100. During depression of the central portion 104 of the dome, therefore, there comes a point at which it is more energetically favourable for the centre of the dome to be depressed still further, due to the constraints of the sheet. This is accommodated, as shown in Fig. 32, by the dome taking up an M-shaped configuration in cross section. Thus, the dome is bistable. Energetically, its most favoured configuration is as shown in Figs. 27-29. However, when depressed past a certain point, it moves to the second most favoured configuration as shown in Figs. 30-32.
As the dome moves to the configuration shown in Fig. 32, it emits a click sound. This sound is characteristic of the dome. Also, when the dome is released, it emits another, different, characteristic click sound. These sounds can be
classified in a similar manner to the discussion above for other vibrational sounds.
The present inventors have confirmed that slightly different shaped or sized domes can provide different, classifiable sounds on press and release.
A pair of flexible metallic domes of slightly different size were mounted on a firm surface. The domes had the same overall configuration, but the diameter of one was about 1.5 times the diameter of the other. A piezo microphone was also attached to the firm surface. On press and release of the domes, the clicks from the domes were recorded with minimal noise from the piezo microphone, which essentially only picks up vibrations in the surface and is insensitive to airborne sound.
Fig. 33 shows the time series and Fig. 34 shows the spectrogram for the first dome, on press and release. Fig. 35 shows the time series and Fig. 36 shows the spectrogram for the second dome, on press and release. The dome response is different for the different domes. These differences are significant enough to allow each sound to be classifiable with respect to the others. The signals can be classified by transforming them to the frequency domain via FFT and then running them through a simple neural network. This can be
trained with a number of example clicks from each of the flexible domes. In order to minimize the training required, the click events can be automatically detected via simple thresholding. Each event can then be aligned so that there are no offsets in the timing of the signals.
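A minimal Python sketch of this click handling follows: onset detection by amplitude thresholding (so that events are aligned with no timing offsets), followed by a magnitude-spectrum feature for a small classifier. The threshold, snippet length and dead time are assumptions for illustration.

```python
# Hedged sketch: detect press/release clicks and produce features for classification.
import numpy as np

def detect_clicks(x, threshold=0.2, length=256, dead_time=512):
    """Return fixed-length snippets starting at each threshold crossing."""
    events, i = [], 0
    while i < len(x) - length:
        if abs(x[i]) > threshold:
            events.append(x[i:i + length])   # aligned to onset, no timing offset
            i += dead_time                    # skip the ringing of the same click
        else:
            i += 1
    return events

def click_features(snippet):
    """Magnitude spectrum of one aligned click, for a small neural-network classifier."""
    return np.abs(np.fft.rfft(snippet * np.hamming(len(snippet))))
```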
As will be clear, alternative embodiments are envisaged in which the control element is other than a dome.
The above embodiments have been described by way of example only. On reading this disclosure, modifications of these embodiments, further embodiments and modifications thereof will be apparent to the skilled person and as such are within the scope of the present invention.
Claims
1. A controller including at least one control element for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element, the controller further including processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, the control signal corresponding to the first characteristic class of vibrational signal.
2. A controller according to claim 1 wherein the contact action which causes deformation or displacement of the control element is direct or indirect contact of the control element by the user.
3. A controller according to claim 1 or claim 2, the controller being adapted for interaction with the user with one or more digits of one or both hands of the user.
4. A controller according to any one of claims 1 to 3 wherein the controller includes a plurality of control elements .
5. A controller according to claim 4 wherein the shape of the control elements differs between the control elements.
6. A controller according to any one of claims 1 to 5 wherein the at least one control element includes, at least in part, one of a concave and a convex shape.
7. A controller according to any one of claims 1 to 6 wherein, in use, the contact action is at least one of pressing, stroking, twisting, scratching, pulling, rubbing.
8. A controller according to any one of claims 1 to 7 wherein the control element has a first configuration before deformation or displacement and a second, different configuration during deformation or displacement.
9. A controller according to any one of claims 1 to 8 wherein the control element is multistable.
10. A controller according to any one of claims 1 to 9 wherein the control element is a deformable dome.
11. A controller according to any one of claims 1 to 10 wherein the controller has three or more control elements, in operation the control elements each providing a corresponding vibrational signal, different from and classifiable from the vibrational signals caused by operation of the other control elements.
12. A controller according to any one of claims 1 to 11 having a body, the at least one control element being integrally formed with a wall of the body.
13. A controller according to claim 12 wherein the body encloses a space, the vibrational sensor and/or the processing means being housed in said space.
14. A controller according to any one of claims 1 to 13 wherein the processing means is operable to receive an input signal from the vibrational sensor, the processing means further being operable to limit the bandwidth of the data corresponding to these signals for onward processing to a band at 10 kHz and below.
15. A controller according to any one of claims 1 to 14 wherein classification of the vibrational signals into the characteristic classes of vibrational signals takes place in a control means, operable to output a control signal based on the classification of the vibrational signal.
16. A controller according to claim 15 wherein the control means includes a store for storing training data.
17. A controller according to any one of claims 1 to 16 including active audio, visual or vibrational feedback means.
18. A controller according to any one of claims 1 to 17 including further sensing means, for sensing movement, touch, location or bearing of the controller.
19. A mobile electronic device including a controller according to any one of claims 1 to 18.
20. A system including a controller having at least one control element for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element, the system providing control means for classifying the first vibrational signal into a first characteristic class of vibrational signal and thereby outputting a control signal corresponding to the first characteristic class of vibrational signal .
21. Use of a controller according to any one of claims 1 to 18 or of a system according to claim 20 to control a device, the use including a user interacting with the controller to deform or displace the control element to cause a first vibrational signal.
22. A method of operating a controller according to any one of claims 1 to 18 or a system according to claim 20, the method including the steps of receiving first vibrational data corresponding to a first vibration signal, classifying the first vibrational data into a first characteristic class of vibrational data, and outputting control data corresponding to the first characteristic class of vibrational data.
23. A method of operating a controller, the controller including at least one control element, at least one vibrational sensor and processing means, the method including the step of the user deforming or displacing the control element by causing a contact action on the control element to take place, the vibrational sensor detecting a first vibrational signal caused by at least one of: the deformation or displacement of the control element; and a recovery from the deformation or displacement of the control element, wherein the processing means processes the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, the control signal corresponding to the first characteristic class of vibrational signal.
24. An external cover layer for a controller, the cover being separate or separable from a body of the controller, the external cover layer having at least one control element, the control element being suitable for providing a detectable first vibrational signal caused by at least one of: a deformation or displacement of the control element due to a contact action caused by a user; and a recovery from such a deformation or displacement of the control element.
25. An external cover layer according to claim 24 wherein the at least one vibrational sensor is included as part of the external cover layer, the vibrational sensor being for detecting said first vibrational signal.
26. A kit of one or more external cover layers according to claim 24 or claim 25 and a controller according to any one of claims 1 to 18.
27. A controller including at least one control surface for operation by a user, the controller including at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface, at least one of the control surface and the user-operated surface being a textured surface, and processing means for processing the first vibrational signal to enable classification into a first characteristic class of vibrational signal from which a control signal is derivable, corresponding to the first characteristic class of vibrational signal.
28. A controller according to claim 27, the user-operated surface being a surface of the user' s hand, the controller being adapted for interaction with the user with one or more digits of one or both hands of the user.
29. A controller according to claim 27 or claim 28 wherein the controller includes a plurality of control surfaces, each of the plurality of control surfaces being a textured surface.
30. A controller according to claim 29 wherein the texture of the control surfaces differs between the control surfaces.
31. A controller according to any one of claims 27 to 30 wherein the texture of the at least one control surface is a systematic variation in the height profile of the control surface from an average height of the control surface.
32. A controller according to any one of claims 27 to 31 wherein the texture of at least one of the control surfaces is line-type texture.
33. A controller according to any one of claims 27 to 32 wherein the texture of at least one of the control surfaces is island-type texture.
34. A controller according to any one of claims 27 to 33 wherein the texture is substantially asymmetrical when viewed in cross section.
35. A controller according to any one of claims 27 to 34 wherein the spacing of the texture varies across the control surface.
36. A controller according to claim 35 wherein the spacing of the texture varies in a gradual variation.
37. A controller according to claim 35 wherein the spacing of the texture varies in a step-wise variation.
38. A controller according to any one of claims 27 to 37 wherein the at least one control surface has, at least partially, one of a concave and a convex shape.
39. A controller according to any one of claims 27 to 38 having a body, the at least one control surface being integrally formed with the body wall.
40. A controller according to claim 39 wherein the body encloses a space, the vibrational sensor and/or the processing means being housed in said space.
41. A controller according to any one of claims 27 to 40 wherein the processing means is operable to receive an input signal from the vibrational sensor, the processing means further being operable to limit the bandwidth of the data corresponding to these signals for onward processing to a band at 10 kHz and below.
42. A controller according to any one of claims 27 to 41 wherein classification of the vibrational signals into the characteristic classes of vibrational signals takes place in a control means, operable to output a control signal based on the classification of the vibrational signal.
43. A controller according to claim 42 wherein the control means includes a store for storing training data.
44. A controller according to any one of claims 27 to 43 including active audio, visual or vibrational feedback means.
45. A controller according to any one of claims 27 to 44 including further sensing means, for sensing movement, touch, location or bearing of the controller.
46. A mobile electronic device including a controller according to any one of claims 27 to 45.
47. A system including a controller having at least one control surface for operation by a user, the control surface being a textured surface, the controller including at least one vibrational sensor for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface and the system providing control means for classifying the first vibrational signal into a first characteristic class of vibrational signal and thereby outputting a control signal corresponding to the first characteristic class of vibrational signal.
48. Use of a controller according to any one of claims 27 to 45 or a system according to claim 47 to control a device, the use including a user interacting with the controller to provide a translational contact motion between the user- operated surface and the control surface.
49. A method of operation of a controller according to any one of claims 27 to 45 or a system according to claim 47, the method including the steps of receiving first vibrational data corresponding to a first vibration signal, classifying the first vibrational data into a first characteristic class of vibrational data, and outputting control data corresponding to the first characteristic class of vibrational data.
50. An external cover layer for a controller, the cover being separate or separable from a body of the controller, the external cover layer having a control surface for operation by a user, the control surface being a textured surface.
51. An external cover layer according to claim 50 wherein the at least one vibrational sensor is included as part of the external cover layer, the vibrational sensor being for detecting a first vibrational signal corresponding to a translational contact motion between the control surface and a user-operated surface.
52. A kit of one or more external cover layers according to claim 50 or claim 51 and a controller according to any one of claims 27 to 45.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB0724005.4 | 2007-12-07 | ||
| GBGB0724005.4A GB0724005D0 (en) | 2007-12-07 | 2007-12-07 | Controller |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2009071919A1 true WO2009071919A1 (en) | 2009-06-11 |
Family
ID=38983152
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/GB2008/004044 Ceased WO2009071919A1 (en) | 2007-12-07 | 2008-12-08 | Controller |
Country Status (2)
| Country | Link |
|---|---|
| GB (1) | GB0724005D0 (en) |
| WO (1) | WO2009071919A1 (en) |
Cited By (25)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20110003550A1 (en) * | 2009-07-03 | 2011-01-06 | Sony Ericsson Mobile Communications Ab | Tactile input for accessories |
| WO2011144804A1 (en) * | 2010-05-20 | 2011-11-24 | Nokia Corporation | An apparatus for a user interface and associated methods |
| DE102011080518A1 (en) * | 2011-08-05 | 2013-02-07 | Sennheiser Electronic Gmbh & Co. Kg | Handset and method for controlling a handset |
| EP2661105A2 (en) * | 2012-05-03 | 2013-11-06 | DSP Group Inc. | A system and apparatus for controlling a device with a bone conduction transducer |
| US8773946B2 (en) | 2010-12-30 | 2014-07-08 | Honeywell International Inc. | Portable housings for generation of building maps |
| EP2393305A3 (en) * | 2010-06-01 | 2014-09-17 | Sony Corporation | Sound signal processing apparatus, microphone apparatus, sound signal processing method, and program |
| US8990049B2 (en) | 2010-05-03 | 2015-03-24 | Honeywell International Inc. | Building structure discovery and display from various data artifacts at scene |
| US8995678B2 (en) | 2010-04-30 | 2015-03-31 | Honeywell International Inc. | Tactile-based guidance system |
| WO2016048771A1 (en) | 2014-09-22 | 2016-03-31 | Qeexo Co. | Method and apparatus for improving accuracy of touch screen event analysis by use of edge classification |
| US9342928B2 (en) | 2011-06-29 | 2016-05-17 | Honeywell International Inc. | Systems and methods for presenting building information |
| EP3382507A1 (en) * | 2017-03-31 | 2018-10-03 | Immersion Corporation | Multi-stable haptic feedback systems |
| US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
| US10366583B2 (en) | 2015-08-25 | 2019-07-30 | Immersion Corporation | Bistable haptic feedback generator |
| US10599251B2 (en) | 2014-09-11 | 2020-03-24 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
| US10642407B2 (en) | 2011-10-18 | 2020-05-05 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
| US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
| US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
| US10949029B2 (en) | 2013-03-25 | 2021-03-16 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers |
| US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
| US11029785B2 (en) | 2014-09-24 | 2021-06-08 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
| US11175698B2 (en) | 2013-03-19 | 2021-11-16 | Qeexo, Co. | Methods and systems for processing touch inputs based on touch type and touch intensity |
| US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
| US11262864B2 (en) | 2013-03-25 | 2022-03-01 | Qeexo, Co. | Method and apparatus for classifying finger touch events |
| US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
| US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
- 2007-12-07: GB GBGB0724005.4A patent/GB0724005D0/en, not_active Ceased
- 2008-12-08: WO PCT/GB2008/004044 patent/WO2009071919A1/en, not_active Ceased
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20060071912A1 (en) * | 2004-10-01 | 2006-04-06 | Hill Nicholas P R | Vibration sensing touch input device |
Cited By (34)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2011001229A1 (en) * | 2009-07-03 | 2011-01-06 | Sony Ericsson Mobile Communications Ab | Tactile input for accessories |
| US20110003550A1 (en) * | 2009-07-03 | 2011-01-06 | Sony Ericsson Mobile Communications Ab | Tactile input for accessories |
| US8995678B2 (en) | 2010-04-30 | 2015-03-31 | Honeywell International Inc. | Tactile-based guidance system |
| US8990049B2 (en) | 2010-05-03 | 2015-03-24 | Honeywell International Inc. | Building structure discovery and display from various data artifacts at scene |
| US9367150B2 (en) | 2010-05-20 | 2016-06-14 | Nokia Technologies Oy | Apparatus and associated methods |
| WO2011144804A1 (en) * | 2010-05-20 | 2011-11-24 | Nokia Corporation | An apparatus for a user interface and associated methods |
| EP2393305A3 (en) * | 2010-06-01 | 2014-09-17 | Sony Corporation | Sound signal processing apparatus, microphone apparatus, sound signal processing method, and program |
| US9485569B2 (en) | 2010-06-01 | 2016-11-01 | Sony Corporation | Sound signal processing apparatus, microphone apparatus, sound signal processing method, and program |
| US8773946B2 (en) | 2010-12-30 | 2014-07-08 | Honeywell International Inc. | Portable housings for generation of building maps |
| US9342928B2 (en) | 2011-06-29 | 2016-05-17 | Honeywell International Inc. | Systems and methods for presenting building information |
| US10445933B2 (en) | 2011-06-29 | 2019-10-15 | Honeywell International Inc. | Systems and methods for presenting building information |
| US10854013B2 (en) | 2011-06-29 | 2020-12-01 | Honeywell International Inc. | Systems and methods for presenting building information |
| WO2013020792A1 (en) | 2011-08-05 | 2013-02-14 | Sennheiser Electronic Gmbh & Co. Kg | Earpiece and method for controlling an earpiece |
| DE102011080518A1 (en) * | 2011-08-05 | 2013-02-07 | Sennheiser Electronic Gmbh & Co. Kg | Handset and method for controlling a handset |
| US10642407B2 (en) | 2011-10-18 | 2020-05-05 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch sensitive surface |
| EP2661105A2 (en) * | 2012-05-03 | 2013-11-06 | DSP Group Inc. | A system and apparatus for controlling a device with a bone conduction transducer |
| US11175698B2 (en) | 2013-03-19 | 2021-11-16 | Qeexo, Co. | Methods and systems for processing touch inputs based on touch type and touch intensity |
| US10949029B2 (en) | 2013-03-25 | 2021-03-16 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers |
| US11262864B2 (en) | 2013-03-25 | 2022-03-01 | Qeexo, Co. | Method and apparatus for classifying finger touch events |
| US10599251B2 (en) | 2014-09-11 | 2020-03-24 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
| US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
| WO2016048771A1 (en) | 2014-09-22 | 2016-03-31 | Qeexo Co. | Method and apparatus for improving accuracy of touch screen event analysis by use of edge classification |
| EP3198384A4 (en) * | 2014-09-22 | 2018-04-25 | Qeexo, Co. | Method and apparatus for improving accuracy of touch screen event analysis by use of edge classification |
| US11029785B2 (en) | 2014-09-24 | 2021-06-08 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
| US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
| US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
| US10366583B2 (en) | 2015-08-25 | 2019-07-30 | Immersion Corporation | Bistable haptic feedback generator |
| EP3382507A1 (en) * | 2017-03-31 | 2018-10-03 | Immersion Corporation | Multi-stable haptic feedback systems |
| US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
| US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
| US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
| US11543922B2 (en) | 2019-06-28 | 2023-01-03 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
| US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
| US12163923B2 (en) | 2020-01-29 | 2024-12-10 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
Also Published As
| Publication number | Publication date |
|---|---|
| GB0724005D0 (en) | 2008-01-16 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2009071919A1 (en) | Controller | |
| Deyle et al. | Hambone: A bio-acoustic gesture interface | |
| JP6431126B2 (en) | An interactive model for shared feedback on mobile devices | |
| US8866788B1 (en) | Interactivity model for shared feedback on mobile devices | |
| EP2846219B1 (en) | Haptic conversion system using frequency shifting | |
| EP2778847B1 (en) | Contactor-based haptic feedback generation | |
| Kikuchi et al. | EarTouch: turning the ear into an input surface | |
| US20110267294A1 (en) | Apparatus and method for providing tactile feedback for user | |
| US20120223880A1 (en) | Method and apparatus for producing a dynamic haptic effect | |
| CN102549531B (en) | Processor interface | |
| WO2011135171A1 (en) | Apparatus and method for providing tactile feedback for user | |
| CN109478089A (en) | Multi-modal haptic effect | |
| Strachan et al. | BodySpace: inferring body pose for natural control of a music player | |
| Oh et al. | VibEye: Vibration-mediated object recognition for tangible interactive applications | |
| Guerreiro et al. | Mnemonical body shortcuts: improving mobile interaction | |
| Murray-Smith et al. | Rub the stane | |
| Kim et al. | A gestural input through finger writing on a textured pad | |
| US12416980B2 (en) | Hand customizable human input device | |
| Panduranga et al. | Sensors for virtual musical environment: A short survey |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08858134; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 08858134; Country of ref document: EP; Kind code of ref document: A1 |