GB2572529A - Data over audio - Google Patents
Data over audio
- Publication number
- GB2572529A (application GB1801765.7A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- audio signal
- computing device
- embedded audio
- gaming
- control signals
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q20/00—Payment architectures, schemes or protocols
- G06Q20/30—Payment architectures, schemes or protocols characterised by the use of specific devices or networks
- G06Q20/32—Payment architectures, schemes or protocols characterised by the use of specific devices or networks using wireless devices
- G06Q20/327—Short range or proximity payments by means of M-devices
- G06Q20/3272—Short range or proximity payments by means of M-devices using an audio code
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
- G06F3/165—Management of the audio stream, e.g. setting of volume, audio stream path
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F17/00—Coin-freed apparatus for hiring articles; Coin-freed facilities or services
- G07F17/32—Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F17/00—Coin-freed apparatus for hiring articles; Coin-freed facilities or services
- G07F17/32—Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
- G07F17/3202—Hardware aspects of a gaming system, e.g. components, construction, architecture thereof
- G07F17/3223—Architectural aspects of a gaming system, e.g. internal configuration, master/slave, wireless communication
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F17/00—Coin-freed apparatus for hiring articles; Coin-freed facilities or services
- G07F17/32—Coin-freed apparatus for hiring articles; Coin-freed facilities or services for games, toys, sports, or amusements
- G07F17/3225—Data transfer within a gaming system, e.g. data sent between gaming machines and users
- G07F17/3227—Configuring a gaming machine, e.g. downloading personal settings, selecting working parameters
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C23/00—Non-electrical signal transmission systems, e.g. optical systems
- G08C23/02—Non-electrical signal transmission systems, e.g. optical systems using infrasonic, sonic or ultrasonic waves
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C25/00—Arrangements for preventing or correcting errors; Monitoring arrangements
- G08C25/02—Arrangements for preventing or correcting errors; Monitoring arrangements by signalling back receiving station to transmitting station
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Audiology, Speech & Language Pathology (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Business, Economics & Management (AREA)
- Multimedia (AREA)
- Accounting & Taxation (AREA)
- Strategic Management (AREA)
- General Business, Economics & Management (AREA)
- Circuit For Audible Band Transducer (AREA)
Abstract
Methods and apparatus are disclosed related to modifying execution of a computing device or generating output of a gaming device by providing data over audio using audio signals. A computing device 300 can receive content that includes an embedded audio signal at a sound detector 340. The computing device can determine one or more control signals or operations to be performed from the embedded audio signal and modify execution of one or more software applications 310 of the computing device based on the one or more control signals. The computing device may be a gaming device (FIG. 7C) that can emit sounds including an embedded audio signal for reception by another nearby gaming device, which upon reception of the signal, performs control actions related to a winning wager on the other gaming device, thereby drawing more attention to the winning wager.
Description
DATA OVER AUDIO
BACKGROUND

[0001] Data stored as binary data in a computer memory can be communicated from a source device to a destination device using a variety of techniques. One technique is to have the source device divide the binary data into packets and send the packets from the source device to the destination device, perhaps via one or more intermediate devices. Upon reception of the packets, the destination device can reassemble and extract the communicated data from the packets.
[0002] Another technique to communicate the stored binary data from a source device to a destination device involves the source device converting and/or modulating the binary data to some other medium, such as sound or light, and sending the other medium from the source device to the destination device, perhaps via one or more intermediate devices. Upon reception of the other medium, the destination device can convert and/or demodulate the other medium to a binary format to retrieve the communicated data.
OVERVIEW

[0003] Example embodiments are described herein. In a first aspect, a method is provided that includes: (I) receiving content with an embedded audio signal at a sound detector of a computing device; (II) determining one or more control signals from the embedded audio signal at the computing device; and (III) modifying execution of one or more software applications of the computing device based on the one or more control signals.
[0004] In a second aspect, a computing device is provided that includes: (I) means for receiving content with an embedded audio signal; (II) means for determining one or more control signals from the embedded audio signal; and (III) means for modifying execution of one or more software applications of the computing device based on the one or more control signals.
[0005] In a third aspect, a computer-readable medium is provided. The computer-readable medium is configured to store instructions that, when executed by one or more processors of a computing device, cause the computing device to carry out functions. The functions include: (I) receiving content with an embedded audio signal at a sound detector; (II) determining one or more control signals from the embedded audio signal; and (III) modifying execution of one or more software applications based on the one or more control signals.
[0006] In a fourth aspect, a computing device is provided that includes: (I) one or more processors; and (II) data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the computing device to carry out functions. The functions include: (A) receiving content with an embedded audio signal at a sound detector of a computing device; (B) determining one or more control signals from the embedded audio signal at the computing device; and (C) modifying execution of one or more software applications of the computing device based on the one or more control signals.
[0007] In a fifth aspect, a method is provided that includes: (I) receiving content with an embedded audio signal at a sound detector of a gaming device; (II) determining one or more control signals from the embedded audio signal at the gaming device; and (III) generating an output of the gaming device that is based on the one or more control signals.
[0008] In a sixth aspect, a gaming device is provided that includes: (I) means for receiving content with an embedded audio signal; (II) means for determining one or more control signals from the embedded audio signal; and (III) means for generating an output that is based on the one or more control signals.
[0009] In a seventh aspect, a computer-readable medium is provided. The computer-readable medium is configured to store instructions that, when executed by one or more processors of a gaming device, cause the gaming device to carry out functions. The functions include (I) receiving content with an embedded audio signal; (II) determining one or more control signals from the embedded audio signal; and (III) generating an output that is based on the one or more control signals.
[0010] In an eighth aspect, a gaming device is provided that includes: (I) one or more processors; and (II) data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the gaming device to carry out functions. The functions include: (A) receiving content with an embedded audio signal at a sound detector; (B) determining one or more control signals from the embedded audio signal; and (C) generating an output that is based on the one or more control signals.
[0011] In a ninth aspect, a method is provided that includes: (I) receiving, at a computing device, content including an embedded audio signal; (II) determining one or more operations to be performed by the computing device, the one or more operations based on the embedded audio signal; and (III) modifying execution of the computing device by performing the one or more operations using the computing device.
[0012] In a tenth aspect, a computing device is provided that includes: (I) means for receiving content including an embedded audio signal; (II) means for determining one or more operations to be performed, the one or more operations based on the embedded audio signal; and (III) means for modifying execution of the computing device by performing the one or more operations.
[0013] In an eleventh aspect, a computer-readable medium is provided. The computer-readable medium is configured to store instructions that, when executed by one or more processors of a computing device, cause the computing device to carry out functions. The functions include: (I) receiving content including an embedded audio signal; (II) determining one or more operations to be performed, the one or more operations based on the embedded audio signal; and (III) modifying execution of the computing device by performing the one or more operations.
[0014] In a twelfth aspect, a computing device is provided that includes: (I) one or more processors; and (II) data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the computing device to carry out functions. The functions include: (A) receiving content including an embedded audio signal; (B) determining one or more operations to be performed, the one or more operations based on the embedded audio signal; and (C) modifying execution of the computing device by performing the one or more operations.
BRIEF DESCRIPTION OF THE DRAWINGS

[0015] Example embodiments are described herein with reference to the drawings, in which:
[0016] FIG. 1 is a flow chart illustrating a set of functions related to using embedded audio signals, in accordance with example embodiments;
[0017] FIG. 2 is a flow chart illustrating a set of functions related to retransmitting embedded audio signals, in accordance with example embodiments;
[0018] FIG. 3 is a block diagram of a computing device, in accordance with example embodiments;
[0019] FIG. 4 illustrates an environment, in accordance with example embodiments;
[0020] FIG. 5 illustrates another environment, in accordance with example embodiments;
[0021] FIGS. 6A and 6B illustrate a scenario taking place in the environment of FIG. 4, in accordance with example embodiments;
[0022] FIGS. 7A, 7B, and 7C illustrate a scenario taking place in the environment of FIG. 5, in accordance with example embodiments;
[0023] FIG. 8 is a flow chart of functions to carry out a method, in accordance with example embodiments;
[0024] FIG. 9 is another flow chart of functions to carry out a method, in accordance with example embodiments; and

[0025] FIG. 10 is yet another flow chart of functions to carry out a method, in accordance with example embodiments.
DETAILED DESCRIPTION

I. INTRODUCTION

[0026] Herein are disclosed techniques for communicating particular data within a data stream of content, such as a binary data stream and/or an audio data stream. The particular data can be encoded into control signals that in turn can be combined into a particular audio signal. The particular audio signal can be embedded into the data stream of content. A platform, such as a gaming device, smartphone, internet-of-things device, and/or other computing device, can include software and/or hardware to deliver content that includes embedded audio signals through a loudspeaker or other sound production device. The platform can also include a sound detector (e.g., a microphone) for listening for the embedded audio signals and acting on them.
[0027] In response to reception of the data stream of content at a platform, such as a destination computing device, content in the data stream can be presented by the destination computing device; e.g., using a web browser, content player, displays, loudspeakers, and/or other software and/or hardware of the destination computing device. When the content is presented, a sound detector of the destination computing device can receive the embedded audio signal, obtain the control signals from the received embedded audio signal, and take one or more control-signal-related actions. In some examples, embedded audio signals can provide about 100 bits of data per second to the destination computing device.
[0028] The one or more control-signal-related actions can include initiating or launching execution of a software application, modifying execution of an already-executing software application (e.g., by providing data decoded from the control signals to the software application), and/or generating one or more audible and/or visible outputs. In some examples, the one or more control-signal-related actions can be related to gaming, such as sports betting, games of chance including slot machine play, and/or other gaming-related activities. In one gaming-related example, content related to gaming, such as information about sports betting, a game of chance, or other gaming information, can be presented to (and perhaps using) a destination computing device, where the content includes an embedded audio signal.
[0029] In some examples, the destination computing device can receive the embedded audio signal, determine control signals based on the embedded audio signal, and cause one or more gaming-related actions to occur. The gaming-related actions can include, but are not limited to, initiating or launching execution of a gaming-related software application (e.g., a betting application, a browser directed to a poker website, an application for playing roulette or another game of chance), providing data to a gaming-related software application (e.g., odds and/or sporting-event-related data provided to the betting application, data about upcoming tournaments or other events, data related to free and/or reduced priced games, bets, and/or credits, vouchers for casino meals and/or other “comps”), causing gaming-related displays to be generated, and/or presenting (additional) gaming-related content.
[0030] Some gaming-related examples can involve a computing device acting as a gaming device; e.g., a slot machine, video poker machine, etc. In these examples, the one or more control-signal-related actions can be related to gaming and involve actions taken by the gaming device. For example, suppose gaming device GD1 has a winning wager of a relatively-large amount. In response to the relatively-large winning wager, gaming device GD1 can flash lights and emit sounds (e.g., ring bells, play pre-recorded audio) that include an embedded audio signal that is received at a (nearby) gaming device GD2. Upon reception of the embedded audio signal, gaming device GD2 can perform control-signal-related actions involving outputting audible and/or visible signals related to the relatively-large winning wager at gaming device GD1. In this way, multiple gaming devices (e.g., at least GD1 and GD2) can output audible and/or visible signals related to the relatively-large winning wager, thereby drawing more attention to the winning wager.
[0031] In other examples, the one or more control-signal-related actions can be unrelated to gaming. For example, suppose an embedded audio signal is embedded in content related to a travel destination. In this example, the one or more control-signal-related actions could include initiating or launching execution of a non-gaming-related software application (e.g., a web browser or other application for making hotel, flight, car rental, and/or other reservations related to the travel destination), providing data to a non-gaming-related software application (e.g., data related to the travel destination), causing non-gaming-related displays (e.g., displays related to the travel destination) to be generated, and/or presenting (additional) non-gaming-related content (e.g., content related to the travel destination, such as content about nearby locations or tourist activities at the travel destination).
[0032] As another example, suppose an embedded audio signal is embedded in content related to a particular product. In this example, the one or more control-signal-related actions can include initiating or launching execution of a non-gaming-related software application (e.g., a web browser or other application enabling purchase of the particular product), providing data to a non-gaming-related software application (e.g., data related to the particular product), causing non-gaming-related displays (e.g., displays related to the particular product) to be generated, and/or presenting (additional) non-gaming-related content (e.g., content related to the particular product, such as content about using and/or installing the particular product).
[0033] In some scenarios, the one or more control-signal-related actions can be emergency-related actions. For example, an announcement related to a severe weather event, such as a hurricane, tornado, typhoon, or blizzard, can be broadcast using a public address system, with one or more audio signals embedded in the announcement. A computing device can use a microphone or other sound reception device to receive the embedded audio signals of the announcement. In accordance with control signals included with the embedded audio signals of the announcement, the computing device can then perform one or more emergency-related actions related to the severe weather event. For example, the computing device can generate a display of safety information and/or directions to safety or other personnel, initiate or launch execution of a weather-related application, and/or provide data to the weather-related application about a location of the announcement. Other emergency-related actions can be caused by embedded audio signals as well.
[0034] Embedded audio signals can be provided as one or more audible audio signals using one or more audible frequencies (e.g., one or more frequencies in an audible frequency range, such as a 20-20,000 Hz range) and/or one or more inaudible audio signals using one or more inaudible frequencies (e.g., one or more frequencies outside of the audible frequency range). For example, inaudible audio signals can be recorded in ultrasonic frequencies (e.g., frequencies greater than 20,000 Hz).
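As a purely illustrative sketch (not part of the disclosed embodiments), a near-ultrasonic binary frequency-shift-keyed signal at roughly 100 bits per second could be generated as follows; the sample rate, symbol duration, and tone frequencies are hypothetical choices:

```python
import numpy as np

SAMPLE_RATE = 48_000      # Hz; comfortably above twice the highest tone frequency
SYMBOL_DURATION = 0.01    # 10 ms per bit, i.e. roughly 100 bits per second
FREQ_ZERO = 18_500.0      # hypothetical near-ultrasonic tone for a 0 bit
FREQ_ONE = 19_500.0       # hypothetical near-ultrasonic tone for a 1 bit

def encode_bits(bits):
    """Map a bit sequence to concatenated sine-tone bursts (binary FSK)."""
    t = np.arange(int(SAMPLE_RATE * SYMBOL_DURATION)) / SAMPLE_RATE
    tone0 = np.sin(2 * np.pi * FREQ_ZERO * t)
    tone1 = np.sin(2 * np.pi * FREQ_ONE * t)
    return np.concatenate([tone1 if b else tone0 for b in bits])

signal = encode_bits([1, 0, 1, 1])   # four 10 ms symbols -> 0.04 s of audio
```

A matching decoder would compare per-symbol energy at the two tone frequencies; a deployable system would additionally need framing, synchronization, and the error-prevention/correction arrangements noted in the G08C25/00 classification above.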
[0035] Data about the embedded audio signals, such as counts and/or locations of computing devices that received the embedded audio signals, duration of received content, and/or counts and/or durations of embedded audio signals, can be determined. The data about the embedded audio signals can then be logged to a backend system for later visualization and/or auditing.
[0036] In some scenarios, embedded audio signals can provide information about presented content. For example, suppose that content C includes inaudible embedded audio signals, perhaps inserted at pre-determined intervals and/or throughout the content. Then, when the content C is presented by a platform, such as a computing device, the computing device can receive the embedded audio signals in content C and track counts and/or durations of the received embedded audio signals. Based on the received embedded audio signals, the computing device (or another platform) can extrapolate what portions of content C were presented and for what duration. Such extrapolated information can be used to provide and/or suggest later-provided content. Further, determining a number of platforms that provide such extrapolated information could be used to infer (or otherwise determine) how many users had been exposed to specific content, such as content C, related to the extrapolated information.
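The extrapolation just described can be sketched as a simple tracker; the per-interval marker scheme and the deduplication by marker index are assumptions made for illustration:

```python
class PresentationTracker:
    """Infer how much of a content item was presented, assuming
    (hypothetically) one distinct marker is embedded in every
    interval_s seconds of the content."""

    def __init__(self, interval_s=5.0):
        self.interval_s = interval_s
        self.markers = set()              # distinct marker indices heard so far

    def record(self, marker_index):
        self.markers.add(marker_index)

    def presented_duration(self):
        # Each distinct marker implies roughly one interval of presentation.
        return len(self.markers) * self.interval_s

tracker = PresentationTracker(interval_s=5.0)
for idx in (0, 1, 2, 2, 3):               # marker 2 heard twice, e.g. echoed
    tracker.record(idx)
print(tracker.presented_duration())       # -> 20.0
```

The resulting counts and durations are exactly the kind of data that could be logged to a backend system for visualization and/or auditing.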
[0037] The systems and methods described herein can provide a solution to the problem of remote and/or delayed control of a computing device. The solution can include the computing device receiving an embedded audio signal, determining control signals from the embedded audio signal, and performing one or more controls in accord with the embedded audio signal. Such approaches provide the advantage of not requiring specific instruction from a user, such as would be expected from a graphical user interface (GUI) or voice-based interface. Also, these embedded audio signals can be included or embedded in content, and the remote and/or delayed control can be activated by presenting the content, thereby causing the embedded audio signal to be emitted (and thus received) by the computing device during content presentation.
[0038] Embedded audio signals, with or without related content, can be provided via a data network, such as embedded in a content stream provided via the Internet and/or one or more other data networks. Alternatively, embedded audio signals could be provided via means other than a data network. For example, the embedded audio signals could be broadcast over a loudspeaker for reception by the computing device, or the embedded audio signals could be included in content previously stored on the computing device and received by the computing device upon playback of the previously-stored content.
[0039] In some examples, the embedded audio signal can be provided with sufficient audio volume to be detected even in a noisy environment (e.g., a crowded location, an entertainment and/or sporting venue, a casino) where ambient sound can be relatively loud (e.g., 85 decibels or more), thus enabling remote and/or delayed control of computing devices in a variety of environments. Beneficially, sound technology is widespread and robust, enabling a wide variety of devices to reliably transmit and/or receive embedded audio signals. Further, in some examples, audio signals can be embedded in data transmitted by a data network and/or in audio signals transmitted through a sound interface, such as air or water. Other examples of providing and receiving embedded audio signals for remote and/or delayed control of a computing device are possible as well.
II. EXAMPLE DATA OVER AUDIO METHODS

[0040] FIG. 1 is a flow chart depicting a set of functions 100 that can be carried out in accordance with example embodiments. For example, software of a computing device, such as computing device 300 described below, can be executed by the computing device to carry out some or all of the set of functions 100. The set of functions 100 can be performed to receive an embedded audio signal, decode the embedded audio signal to obtain one or more control signals, and modify behavior of the computing device based on the one or more control signals. The set of functions is shown within blocks 110 through 170. A description of those blocks now follows.
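The control flow of blocks 110 through 170 can be sketched as a polling loop; every callable below is a hypothetical hook standing in for the corresponding block-level procedure described in the following paragraphs:

```python
def run_content_loop(content_available, receive, contains_signal,
                     decode, apply_controls, max_idle_checks=3):
    """Illustrative sketch of blocks 110-170: poll for content, decode any
    embedded audio signal into control signals, and apply them."""
    idle = 0
    while idle < max_idle_checks:          # block 170: bounded retries, then exit
        if not content_available():        # block 120: is content available?
            idle += 1
            continue
        idle = 0
        content = receive()                # block 130: receive the content
        if contains_signal(content):       # block 140: embedded signal present?
            controls = decode(content)     # block 150: obtain control signals
            apply_controls(controls)       # block 160: modify execution

events = []
run_content_loop(
    content_available=iter([True, False, False, False]).__next__,
    receive=lambda: "content with embedded audio",
    contains_signal=lambda c: True,
    decode=lambda c: ["launch_app"],
    apply_controls=events.append,
)
```

The bounded-retry loop mirrors the option, noted for block 170, of returning to block 120 a predetermined number of times before exiting.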
[0041] At block 110, the computing device can start operation in a state where the computing device is ready to receive content. For example, the computing device can be fully initialized or restarted and/or a software application used to receive and/or present content can be initiated.
[0042] At block 120, the computing device can determine whether content is available. If the computing device determines that content is available, the computing device can proceed to block 130. Otherwise, the computing device can determine that content is unavailable and proceed to block 170.
[0043] At block 130, the computing device can receive content, such as information in the form of one or more web pages, audio signals, audio files, video signals, video files, text, music, and/or binary data. For example, the computing device can receive content via a data network (e.g., the Internet) in the form of one or more web pages, audio and/or video files, and/or text files. As another example, the computing device can detect and/or receive audio content over a sound interface, which can be a medium capable of carrying sound, such as air and/or water, using a sound detector, such as a microphone. Audio content provided over the sound interface need not be provided by a data network. For example, a sound detector of the computing device can detect and/or receive spoken or other audio content solely over the sound interface. Other examples of content, and of detecting and/or receiving content, are possible as well.
[0044] At block 140, the computing device can determine whether the received content contains one or more embedded audio signals.
[0045] For example, the content received at block 130 can be presented by the computing device. Presentation of the content can include displaying and/or audibly playing the content as text, imagery, video content, and/or audio content. For example, the computing device can present the content using one or more displays, loudspeakers, and/or other devices configured for presenting content (examples of such devices are discussed below in the context of user interface module 304 of computing device 300). In some examples, presentation of the content includes presenting an audio signal embedded in or otherwise included with the content. The presented and embedded audio signal can then be detected by a sound detector, such as a microphone, of the computing device. Then, if an embedded audio signal is received by the sound detector during presentation of the content, the computing device can determine that the received content contains an embedded audio signal. Otherwise, if an embedded audio signal is not received by the sound detector during presentation of the content, the computing device can determine that the received content does not contain an embedded audio signal. Other techniques for determining whether the received content contains one or more embedded audio signals are possible as well.
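One way a sound detector might decide that presented content contains an embedded audio signal is to test for spectral energy near a known carrier frequency; the carrier, bandwidth, and threshold below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def detect_embedded_tone(frame, sample_rate, carrier_hz,
                         bandwidth_hz=200.0, threshold=0.1):
    """Return True if a microphone frame carries significant energy near
    carrier_hz relative to the frame's total spectral energy."""
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    band = (freqs > carrier_hz - bandwidth_hz) & (freqs < carrier_hz + bandwidth_hz)
    return spectrum[band].sum() / (spectrum.sum() + 1e-12) > threshold

sr = 48_000
t = np.arange(sr // 10) / sr                    # one 100 ms microphone frame
rng = np.random.default_rng(0)
noise = 0.05 * rng.standard_normal(len(t))      # simulated ambient noise
tone = np.sin(2 * np.pi * 19_000 * t)           # simulated embedded carrier
```

A production detector would calibrate the threshold against ambient noise (recall that environments such as casinos can exceed 85 decibels) rather than use a fixed ratio.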
[0046] Embedded audio signals can be provided as one or more audible audio signals using one or more audible frequencies (e.g., one or more frequencies in an audible frequency range, such as a 20-20,000 Hz range) and/or one or more inaudible audio signals using one or more inaudible frequencies (e.g., one or more frequencies outside of the audible frequency range). In some examples, embedded audio signals can provide about 100 bits of data per second to a receiving computing device. In other examples, the embedded audio signal can be provided with sufficient audio volume to be detected even in a noisy environment where ambient sound can be relatively loud (e.g., 85 decibels or more).

[0047] If the computing device determines that received content does contain one or more embedded audio signals, then the computing device can proceed to block 150. Otherwise, the computing device determines that received content does not contain one or more embedded audio signals, and the computing device can proceed to block 120. In some examples, when the computing device determines that the received content does not contain one or more embedded audio signals, the computing device can proceed to block 130 rather than block 120, such as when the computing device is to continue receiving content previously determined to be available.
[0048] At block 150, the computing device can obtain one or more control signals from the one or more embedded audio signals. In some examples, the one or more embedded audio signals can be used as the one or more control signals. In other examples, the one or more embedded audio signals can encode the one or more control signals; then, at block 150, the computing device can decode the embedded audio signals to obtain the one or more control signals. For example, the one or more embedded audio signals and/or the one or more control signals can be in compressed and/or encrypted form; then, the computing device can decompress and/or decrypt the one or more embedded audio signals and/or the one or more control signals to obtain the one or more control signals. As another example, the one or more embedded audio signals and/or one or more control signals can be modulated, such as by use of quadrature amplitude modulation and/or one or more other modulation techniques, for insertion into content, for encoding, and/or for decoding. In some examples, the one or more embedded audio signals and/or one or more control signals can be serialized and/or de-serialized; e.g., using JavaScript Object Notation (JSON) and/or one or more other serialization/deserialization techniques.
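As a concrete but purely illustrative sketch of block 150, suppose the demodulated payload is zlib-compressed JSON naming the control signals; real payloads might instead be encrypted or use another serialization, and the field names here are invented for the example:

```python
import json
import zlib

def decode_control_signals(demodulated_bytes):
    """Sketch of block 150: decompress the demodulated payload and
    de-serialize it from JSON into control-signal records."""
    return json.loads(zlib.decompress(demodulated_bytes).decode("utf-8"))

# Round trip: what an encoder embedding the signal might have produced.
payload = zlib.compress(json.dumps(
    {"controls": [{"action": "launch_app", "app": "betting"}]}).encode("utf-8"))
signals = decode_control_signals(payload)
print(signals["controls"][0]["action"])   # -> launch_app
```

Demodulation itself (e.g., the quadrature amplitude demodulation mentioned above) would happen before this step, turning received audio samples into `demodulated_bytes`.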
[0049] At block 160, the computing device can modify operation and/or execution of the computing device based on the one or more control signals. For example, the computing device can utilize set of functions 200, discussed below in the context of FIG. 2, to modify operation and/or execution of the computing device based on the one or more control signals.
[0050] Upon completion of the procedures of block 160, the computing device can proceed to block 120. In some examples, the computing device can proceed to block 130 rather than block 120, such as when the computing device is to continue receiving content previously determined to be available.
[0051] At block 170, the computing device can exit, thereby ceasing performance of set of functions 100. In some examples, at block 170, the computing device can return to block 120, perhaps after waiting some period of time. In some of these examples, the computing device can return to block 120 a predetermined number of times and/or for a predetermined amount of time before exiting.
[0052] FIG. 2 is a flow chart illustrating a set of functions 200 related to retransmitting embedded audio signals, in accordance with example embodiments. For example, software of a computing device, such as computing device 300 described below, can be executed by the computing device to carry out some or all of set of functions 200.
[0053] Set of functions 200 can be used by the computing device to determine whether to carry out one or more operations to be performed based on embedded audio signal EAS and/or one or more control signals CS of embedded audio signal EAS. The one or more operations can include, but are not limited to, one or more of: an operation of initiating execution of a first software application of the computing device; an operation of providing data to a second software application of the computing device, where the data is based on the embedded audio signal; an operation of generating a visual and/or audible output of the computing device; an operation to carry out one or more control-signal-related actions, and/or an operation of generating a ticket using a ticket printer of the computing device.
[0054] After determining to carry out the one or more operations, the computing device can carry out the one or more operations, thereby modifying operation and/or execution of the computing device based on embedded audio signal EAS and/or one or more control signals CS of embedded audio signal EAS. After carrying out the one or more operations to be performed based on embedded audio signal EAS and/or one or more control signals CS, the computing device can determine whether to perform an operation of transmitting at least part of embedded audio signal EAS based on a value of a broadcast counter BC. For example, if BC is greater than a threshold value of zero, then the computing device can determine to perform the operation of transmitting at least part of embedded audio signal EAS. If BC is less than or equal to the threshold value, the computing device can determine not to perform the operation of transmitting at least part of embedded audio signal EAS.
[0055] Set of functions 200 is shown within blocks 210 through 280. At block 210, the computing device can receive one or more embedded audio signals EAS and/or related control signals CS. Embedded audio signals and control signals are discussed in more detail at least above in the context of FIG. 1 and below in the context of at least FIGS. 6A, 7B, 7C, 8, 9, and 10. In some examples, the computing device can begin set of functions 200 after already having received one or more embedded audio signals EAS. In these examples, the procedures of block 210 may be omitted.
[0056] At block 220, the computing device can determine whether it has recently (e.g., within a predetermined amount of time, such as but not limited to, within the last 15 minutes, 30 minutes, hour, two hours, 24 hours, or week) transmitted embedded audio signal EAS. For example, embedded audio signal EAS can include an identifier or other information that the computing device can use, along with records of previously transmitted embedded audio signals, to determine whether the computing device has recently transmitted embedded audio signal EAS.
[0057] If the computing device determines that the computing device has recently transmitted embedded audio signal EAS, the computing device can proceed to block 280. Otherwise, the computing device may determine that the computing device has not recently transmitted embedded audio signal EAS, and can proceed to block 230.
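The recency check at block 220 can be illustrated with a minimal sketch (the signal identifiers, the record store, and the window length below are assumptions for illustration; the description leaves them open):

```python
import time

RECENCY_WINDOW_S = 30 * 60  # e.g., 30 minutes; other windows are possible

# Hypothetical record store mapping an embedded-audio-signal identifier
# to the time (seconds since the epoch) it was last transmitted.
transmit_records = {}

def recently_transmitted(signal_id, now=None):
    """Block 220: True if signal_id was transmitted within the window."""
    now = time.time() if now is None else now
    last = transmit_records.get(signal_id)
    return last is not None and (now - last) <= RECENCY_WINDOW_S

def record_transmission(signal_id, now=None):
    """Update the records when a signal is (re)transmitted."""
    transmit_records[signal_id] = time.time() if now is None else now
```

For example, after `record_transmission("EAS-1", now=0)`, a check at `now=60` would return True (proceed to block 280), while a check an hour later would return False (proceed to block 230).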
[0058] At block 230, the computing device can modify its operation and/or execution based on the one or more control signals CS, while refraining from modifying its operation and/or execution to retransmit embedded audio signal EAS. For example, if the one or more control signals CS indicate that operations O1, O2, and O3 are to be performed, where O3 is to retransmit embedded audio signal EAS, then the computing device can modify its operation and/or execution to carry out operations O1 and O2 at block 230. The operation of retransmitting embedded audio signal EAS (or omitting retransmission of embedded audio signal EAS) can be carried out by performing the procedures of blocks 240, 250, 260, 270, and 280 described below.
[0059] For example, the computing device can interpret the one or more control signals as instructions to take one or more control-signal-related actions to thereby modify operation and/or execution of the computing device. The one or more control-signal-related actions can include the computing device: initiating or launching execution of one or more software applications, modifying execution of one or more already-executing software applications (e.g., by providing data decoded from the control signals to the software application), and/or generating one or more audible and/or visible outputs.
[0060] In some examples, the one or more control-signal-related actions can be related to gaming. In other examples, the one or more control-signal-related actions can be unrelated to gaming. For instance, the one or more control-signal-related actions can be emergency-related actions. Additional examples of the one or more control-signal-related actions are discussed herein.
[0061] At block 240, the computing device can determine whether embedded audio signal EAS and/or control signals CS include a broadcast counter BC. If the computing device determines that embedded audio signal EAS and/or control signals CS include broadcast counter BC, the computing device can proceed to block 250. Otherwise, the computing device may determine that embedded audio signal EAS and/or control signals CS do not include broadcast counter BC and the computing device can proceed to block 280.
[0062] At block 250, the computing device can determine whether broadcast counter BC is less than or equal to zero. If broadcast counter BC is less than zero (i.e., broadcast counter BC is negative), then embedded audio signal EAS can be considered to have been broadcast over a sound interface (e.g., using a public address system or similar device) and so embedded audio signal EAS should not be retransmitted. If broadcast counter BC equals zero, then the computing device can be considered to be a final destination for embedded audio signal EAS, and so embedded audio signal EAS should not be retransmitted. If the computing device determines that broadcast counter BC is less than or equal to zero, the computing device can proceed to block 280. Otherwise, the computing device may determine that broadcast counter BC is greater than zero and can proceed to block 260.
[0063] At block 260, the computing device can decrement broadcast counter BC by one. After decrementing broadcast counter BC, the computing device can generate an updated embedded audio signal EAS_U that includes the decremented broadcast counter BC as well as all control signals of embedded audio signal EAS other than broadcast counter BC.
[0064] At block 270, the computing device can transmit embedded audio signal EAS_U using a sound interface (e.g., using a loudspeaker or other device for transmitting sounds). In some examples, the computing device can update records of recently transmitted embedded audio signals to indicate a time that embedded audio signal EAS was transmitted as embedded audio signal EAS_U.
[0065] In some examples, at block 250, when the computing device determines that broadcast counter BC is negative, then the embedded audio signal EAS can be retransmitted at least once to ensure that the broadcast signal is received by as many devices as possible. In these examples, at block 250, the computing device can copy or rename embedded audio signal EAS as embedded audio signal EAS_U, and the computing device can proceed from block 250 to block 270 to transmit embedded audio signal EAS_U using a sound interface.
[0066] At block 280, the computing device can exit, thereby ceasing performance of set of functions 200.
[0067] In set of functions 200, the value of zero for broadcast counter BC is used as a threshold value for determining whether to retransmit embedded audio signal EAS. If broadcast counter BC is greater than the threshold value of zero, embedded audio signal EAS is retransmitted (after BC is decremented) as indicated at blocks 260 and 270. Otherwise, if the broadcast counter BC is less than or equal to the threshold value of zero, embedded audio signal EAS is not retransmitted, as the computing device will proceed from block 250 to block 280 without retransmitting embedded audio signal EAS. In other examples, other threshold values than zero for broadcast counter BC can be used.
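The decision logic of blocks 240 through 280 can be sketched as follows (the dict-based signal representation and field name are assumptions for illustration; the description does not fix an encoding):

```python
def handle_retransmission(eas, transmit):
    """Blocks 240-280: decide whether to retransmit embedded audio signal EAS.

    `eas` is a mapping of control signals; `transmit` is a callable that
    sends an updated signal EAS_U over the sound interface. Returns the
    transmitted signal, or None if no retransmission occurs.
    """
    bc = eas.get("broadcast_counter")    # block 240: is a counter present?
    if bc is None or bc <= 0:            # block 250: absent, broadcast (< 0),
        return None                      # or final destination (== 0): exit
    eas_u = dict(eas)                    # block 260: keep all other control
    eas_u["broadcast_counter"] = bc - 1  # signals, decrement the counter
    transmit(eas_u)                      # block 270: retransmit as EAS_U
    return eas_u                         # then proceed to block 280
```

A signal arriving with a counter of 2 would be retransmitted with a counter of 1; a signal with a counter of 0 or -1, or with no counter at all, would not be retransmitted.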
III. EXAMPLE ARCHITECTURE [0068] FIG. 3 is a block diagram of computing device 300, in accordance with example embodiments. In particular, computing device 300 can be configured to perform one or more functions related to: sets of functions 100, 200, environments 400, 500, computing devices 410, 412, 556, 560, 562, 564, voice-controlled speaker system 420, data networks 430, 570, server(s) 440, 550, 580, gaming devices 510, 512, 514, 516, 518, 520, 522, 524, 526, 528, 530, 532, 534, 536, 538, 540, 542, 544, 546, 548, a software application, scenarios 600, 700, and methods 800, 900, 1000.
[0069] Computing device 300 may include one or more mobile computing devices and/or one or more stationary computing devices. A mobile computing device is a computing device configured to be readily portable by a person and can include portable components, such as mobile hardware, mobile (and perhaps rechargeable) power supplies, and mobile software. For example, a mobile computing device could be a smartphone, a handheld computing device, a tablet computing device, or a notebook computing device. A stationary computing device is a computing device that is not a mobile computing device, such as a desktop computing device, a rack-mounted server, or a gaming device installed in a casino or other location.
[0070] Computing device 300 can include one or more processors 302, a user interface module 304, data storage 306, and network communications interface module 308, all of which can be linked together via a system bus, network, or other connection mechanism.
[0071] Processor(s) 302 can include one or more general purpose processors and/or one or more special purpose processors. For example, processor(s) 302 can comprise one or more general purpose processors (e.g., Intel® single core microprocessors or Intel® multicore microprocessors) and/or one or more special purpose processors (e.g., application-specific integrated circuits (ASICs), graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and/or digital signal processors (DSPs)). Processor(s) 302 can be configured to execute computer-readable program instructions 310 that are stored in data storage 306, software applications, and/or other instructions as described herein.
[0072] User interface module 304 can be operable to send data to and/or receive data from external user input/output devices. For example, user interface module 304 can be configured to send and/or receive data to and/or from user input devices such as a keyboard, a keypad, a touch screen, a computer mouse, a track ball, a joystick, a camera, a voice recognition module, a microphone and/or other sound detector, and/or other similar devices. User interface module 304 can also be configured to provide output to user display devices, such as one or more cathode ray tubes (CRT), liquid crystal displays (LCDs), light emitting diodes (LEDs), displays using digital light processing (DLP) technology, printers, light bulbs, and/or other similar devices. User interface module 304 can also include one or more devices configured to generate audible output(s), such as a speaker, speaker jack, audio output port, audio output device, earphones, a bell, a horn, a siren, and/or other similar devices. Further, in some examples, user interface module 304 can also include one or more devices configured to generate haptic output(s) such as forces, vibrations, and/or motions detectable at least by a user’s sense of touch. In some examples, computing device 300 can utilize user interface module 304 to present and/or receive content, such as audio content and/or video content.
[0073] Data storage 306 can include one or more computer-readable storage media that can be read, written, and/or otherwise accessed by at least one of processor(s) 302. The one or more computer-readable storage media can include volatile and/or non-volatile storage components, such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with at least one of processor(s) 302. In some examples, data storage 306 can be implemented using a single physical device (e.g., one optical, magnetic, organic or other memory or disc storage unit), while in other examples, data storage 306 can be implemented using two or more physical devices. In particular, data storage 306 can store computer-readable program instructions 310 and perhaps additional data. In some examples, data storage 306 can additionally include at least enough storage to perform at least part of the herein-described methods and techniques and/or at least part of the functionality of the herein-described computing devices and/or gaming devices.
[0074] Network communications interface module 308 can include one or more wireless interfaces 320 and/or one or more wireline interfaces 322 that are configurable to communicate via a network. Wireless interfaces 320, if present, can utilize an air interface, such as a Bluetooth®, Wi-Fi®, ZigBee®, and/or WiMAX™ interface to a data network, such as a wide area network (WAN), a local area network (LAN), one or more public data networks (e.g., the Internet), one or more private data networks, or any combination of public and private data networks. Wireline interfaces 322, if present, can comprise a wire, cable, fiber-optic link and/or similar physical connection(s) to a data network, such as a WAN, LAN, one or more public data networks, one or more private data networks, or any combination of such networks. In some examples, some or all of wireless interfaces 320 and/or wireline interfaces 322 can interface with and enable communication with one or more voice networks.
[0075] In some examples, network communications interface module 308 can be configured to provide reliable, secured, and/or authenticated communications. For example, each message can include information for ensuring reliable communications (i.e., guaranteed message delivery), perhaps as part of a message header and/or footer (e.g., packet/message sequencing information, encapsulation header(s) and/or footer(s), size/time information, and transmission verification information such as cyclic redundancy check (CRC) and/or parity check values). Communications can be made secure (e.g., be encoded or encrypted) and/or decrypted or decoded using one or more cryptographic protocols and/or algorithms, such as but not limited to, Data Encryption Standard (DES), Advanced Encryption Standard (AES), Rivest-Shamir-Adleman (RSA), Diffie-Hellman, and/or Digital Signature Algorithm (DSA). Other cryptographic protocols and/or algorithms can be used as well or in addition to those listed herein to encrypt and/or encode (and then decrypt and/or decode) communications.
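As one illustration of a transmission verification value, a CRC can be appended to a message payload and checked on receipt; below is a minimal sketch using CRC-32 from Python's standard library (the 4-byte big-endian trailer layout is an assumption for illustration, not part of this description):

```python
import zlib

def append_crc(payload: bytes) -> bytes:
    """Append a CRC-32 of the payload as a 4-byte big-endian trailer."""
    return payload + zlib.crc32(payload).to_bytes(4, "big")

def verify_crc(message: bytes) -> bool:
    """Recompute the CRC-32 of the payload and compare it to the trailer."""
    payload, trailer = message[:-4], message[-4:]
    return zlib.crc32(payload).to_bytes(4, "big") == trailer
```

On receipt, `verify_crc` returns False if any bit of the payload or trailer was corrupted in transit (subject to the usual CRC collision caveats), allowing the receiver to discard or re-request the message.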
[0076] Each computer-readable storage medium (or, more simply “readable medium”) described in this disclosure can include a non-transitory computer-readable medium that includes volatile and/or non-volatile storage components such as optical, magnetic, organic or other memory or disc storage, which can be integrated in whole or in part with a processor. Additionally or alternatively, each computer-readable medium described in this disclosure can include a transitory computer-readable medium. The transitory computer-readable medium can include, but is not limited to, a communications medium such as a digital or analogue communications medium (e.g., a fiber optic cable, a waveguide, a wired communication link, or a wireless communication line).
[0077] A network interface, such as network communication interface module 308 or any other network interface disclosed herein, can include an interface to one or more networks and/or communication channels. For example, the network interface can include one or more transmitters configured for transmitting data using the one or more networks and/or communication channels, one or more receivers configured for receiving data using the one or more networks and/or communication channels, and/or one or more transceivers configured to both transmit and receive data using the one or more networks and/or communication channels. In particular, the network interface can be used to enable communications between one or more computing devices used by players and one or more gaming servers used to play games of chance and/or provide player accounting services (e.g., player identification, settling of wagers, etc.).
[0078] The network interface can further include one or more receivers configured to receive data transmitted over the network or communication channel from another device within or on the network or communication channel. Any of the network interfaces disclosed herein can include circuitry, for example electronic circuitry, for converting data received from the network or communication channel to data that can be provided to a processor for processing the received data. For example, the circuitry of the network interfaces can include a modulator and/or demodulator (modem). Any of the network interfaces disclosed herein can include circuitry, for example electronic circuitry, for converting data received from another device, such as a processor or a computer-readable medium, to data in a form that can be transmitted over a network or communication channel.
[0079] In some examples, computing device 300 can include one or more payment-related devices 330. Payment-related device(s) 330 can be configured to obtain, accept, and/or dispense payments at computing device 300. Payments can take the form of coins, currency, printed payment tickets, credit cards, and/or electronic payments. Payment-related device(s) 330 include, but are not limited to, one or more: bill acceptors, coin acceptors, bill dispensers, coin dispensers, payment ticket printers, payment ticket readers, credit card readers, and electronic payment processing hardware and/or software. Other examples of payment-related device(s) 330 are possible as well.
[0080] In some examples, computing device 300 can include one or more player identity devices 332. Player identity device(s) 332 can be configured to obtain information that can identify a player. Player identity device(s) 332 include, but are not limited to, one or more: input devices configured to accept at least a player identifier, passphrase information, telephone numbers, and/or other data that can identify a player; card and/or ticket readers configured to read one or more cards and/or tickets that have player identification data that identifies a player; cameras and/or barcode scanners configured to read a barcode and/or a quick response (QR) code that has player identification data that identifies a player; biometric sensors configured to obtain facial and/or other biometric information that can identify a player; and/or software related to a Game To System (G2S) and/or a System To System (S2S) standard promulgated by the Gaming Standards Association that can be used to identify a player. Other examples of player identity device(s) 332 are possible as well.
[0081] In some examples, computing device 300 can include one or more sensors 340. Sensor(s) 340 can be configured to measure conditions in an environment for computing device 300 and provide data about that environment. For example, sensor(s) 340 can include one or more of: (i) an identification sensor to identify other objects and/or devices, such as but not limited to, a Radio Frequency Identification (RFID) reader, proximity sensor, one-dimensional barcode reader, two-dimensional barcode (e.g., Quick Response (QR) code) reader, and a laser tracker, where the identification sensor(s) can be configured to read identifiers, such as RFID tags, barcodes, QR codes, and/or other devices and/or objects configured to be read and provide at least identifying information; (ii) a location sensor to measure locations and/or movements of the computing device 300, such as but not limited to, a gyroscope, an accelerometer, a Doppler sensor, a Global Positioning System (GPS) device, a sonar sensor, a radar device, a laser displacement sensor, and a compass; (iii) an environmental sensor to obtain data indicative of an environment of computing device 300, such as but not limited to, an infrared sensor, an optical sensor, a light sensor, a camera, a biosensor, a capacitive sensor, a touch sensor, a temperature sensor, a wireless sensor, a radio sensor, a movement sensor, a microphone, a sound sensor, an ultrasound sensor, and/or a smoke sensor. Many other examples of sensor(s) 340 are possible as well.
IV. EXAMPLE SCENARIOS [0082] FIG. 4 illustrates environment 400, in accordance with example embodiments. Environment 400 includes location 402 and one or more servers 440 connected to data network 430. Data network 430 enables communication among computing devices inside location 402 (e.g., computing devices (CDs) 410, 412, voice-controlled speaker system (VCSS) 420) and computing devices outside location 402 (e.g., server(s) 440).
[0083] Audio communication, such as voice commands and/or queries, audio content, and embedded audio signals, between entities within location 402 is enabled by sound interface 432. In environment 400, sound interface 432 is air. FIG. 4 shows that sound interface 432 enables communication at location 402 between user 404, computing devices 410, 412, and voice-controlled speaker system 420.
[0084] In the example illustrated in FIG. 4, computing device 410 is in location 402 and is not being held by user 404. Computing device 410 can be a mobile computing device (e.g., a smartphone, tablet computer, or notebook computer) or a stationary computing device (e.g., a desktop computer). Computing device 412 is shown being close to and perhaps held by user 404. For example, computing device 412 could be a mobile computing device.
[0085] Voice-controlled speaker system 420 is a device that can communicate using sound interface 432. For example, voice-controlled speaker system 420 can include one or more loudspeakers and/or other sound emitting devices to emit sounds to be carried using sound interface 432 and can include one or more microphones and/or other sound detectors to detect and/or receive sounds carried to voice-controlled speaker system 420 via sound interface 432.
[0086] In operation, voice-controlled speaker system 420 can receive voice commands and/or queries (e.g., from user 404) and output audible responses to the commands and/or queries. In some examples, computing device 410 and/or computing device 412 can act as a voice-controlled speaker system. In such examples, computing device 410 and/or computing device 412 can receive voice commands and/or queries and output audible responses to the commands and/or queries.
[0087] In response to the commands and/or queries, voice-controlled speaker system 420 can either act upon the commands and/or queries locally (e.g., determine a response to a command or query without communicating outside of location 402) or can determine a response to a command or query based on data received in response to sending one or more messages related to the command or query to server(s) 440. In some examples where voice-controlled speaker system 420 acts on the commands and/or queries locally, voice-controlled speaker system 420 can retrieve and/or utilize computing resources, such as stored data, from computing device 410 and/or 412.
[0088] FIG. 5 illustrates environment 500, in accordance with example embodiments. Environment 500 includes location 502 and one or more servers 580 connected to data network 570. Data network 570 enables communication among computing devices inside location 502 (e.g., one or more of gaming devices (GDs) 510, 512, 514, 516, 518, 520, 522, 524, 526, 528, 530, 532, 534, 536, 538, 540, 542, 544, 546, 548, one or more servers 550, and computing devices 556, 560, 562, 564) and computing devices outside location 502 (e.g., one or more of server(s) 580).
[0089] Within location 502, each of gaming devices 510, 512, 514, 516, 518, 520, 522, 524, 526, 528, 530, 532, 534, 536, 538, 540, 542, 544, 546, 548 is connected to server(s) 550 and to sound interface 552. In environment 500, sound interface 552 is air. Also connected to sound interface 552 is public address system (PAS) 554. For the sake of clarity, connections between computing devices made through data network 570 and sound interface 552 are not shown in FIG. 5.
[0090] Gaming devices 510, 512, 514, 516, 518, 520, 522, 524, 526, 528, 530, 532, 534, 536, 538, 540, 542, 544, 546, 548 can be stationary computing devices used to play one or more games of chance and/or skill, such as slot machines, video poker games, computerized roulette machines, and/or computerized blackjack machines.
[0091] Computing device 556 can be a stationary computing device or a mobile computing device associated with and perhaps connected to public address system 554 and can be used for providing data and/or content that can be communicated as sounds via sound interface 552 throughout environment 500.
[0092] Public address system 554 includes sound producing equipment and loudspeakers that can play sounds carried via sound interface 552 throughout location 502. Public address system 554 is associated with computing device 556, which can be used for providing information and/or content that can be communicated as sound via sound interface 552 throughout environment 500.
[0093] FIG. 5 shows that gaming devices 510, 512, 514, 516, and 518 are in a region that is partially separated from the remainder of location 502 via walls. Public address system 554 has one or more loudspeakers in the partially separated region, thereby enabling sounds produced by public address system 554 to be heard or otherwise detected in the partially separated region.
[0094] Server(s) 550 can be one or more stationary and/or mobile computing devices used to communicate data and/or software between gaming devices 510, 512, 514, 516, 518, 520, 522, 524, 526, 528, 530, 532, 534, 536, 538, 540, 542, 544, 546, 548 and/or server(s) 580 via data network 570. Computing devices 560, 562, 564 are stationary and/or mobile computing devices that can be used by persons within location 502, such as smartphones, automatic teller machines (ATMs), and/or point of sale terminals.
[0095] FIGS. 6A and 6B illustrate scenario 600 taking place in environment 400, in accordance with example embodiments. In scenario 600, computing device 412 at location 402 of environment 400 requests and receives content related to a hotel “Hotell”. An embedded audio signal E1 in the content related to Hotell is emitted during playback of the content. Embedded audio signal E1 causes computing device 412 to initiate execution of a travel application T1, where T1 is initialized with data related to Hotell provided via E1. Subsequently, computing device 410 is activated (e.g., powered up) and plays gaming-related content for game G1. Playback of the gaming-related content causes embedded audio signal E2 to be emitted. Embedded audio signal E2 is received by both computing devices 410 and 412 and causes each of computing devices 410 and 412 to initiate execution of gaming application GA. Computing device 410 is then deactivated (e.g., powered down). Scenario 600 continues with user 404 speaking a voice query about “Today’s Football Odds”. The voice query is received by voice-controlled speaker system 420, which responds to the voice query by audibly presenting betting data that includes embedded audio signal E3. Embedded audio signal E3 is received by computing device 412, and E3 causes computing device 412 to update gaming application GA to display bet B1 with betting data obtained from E3. User 404 uses computing device 412 to make bet B1 using gaming application GA, and scenario 600 ends.
[0096] FIG. 6A shows that scenario 600 begins with computing device 412 identified as “CD412” sending GetContent message 610 via data network 430 to server(s) 440. The GetContent message 610 requests content related to a hotel “Hotell”. In response, server(s) 440 send Content message 612 via data network 430 to “CD412” (computing device 412) with the requested content related to “Hotell”.
[0097] After receiving Content message 612, computing device 412 begins playing the received Hotell content. In scenario 600, computing device 412 uses a web browser to play back the Hotell content. In other scenarios, the Hotell content can be provided by other software applications of computing device 412 (e.g., a travel application, such as travel application T1).
[0098] In scenario 600, the Hotell content includes embedded audio signal E1, where embedded audio signal E1 is an inaudible embedded audio signal. In other scenarios, embedded audio signal E1 can be an audible embedded audio signal or a combination of audible and inaudible audio signals.
[0099] Block 620 indicates that the Hotell content is played. While the Hotell content is played, embedded audio signal E1 is emitted, as indicated at block 622. Block 624 of FIG. 6A indicates that emitted embedded audio signal E1 is received by computing device 412. Computing device 412 utilizes the procedures of blocks 150 and 160 of set of functions 100 to obtain control signals from embedded audio signal E1 related to initiating execution of and providing data to a travel application T1 and subsequently modifying execution of computing device 412 based on these control signals to initiate execution of and provide data to travel application T1. In scenario 600, travel application T1 is identified using at least one control signal obtained from embedded audio signal E1. For example, the at least one control signal could indicate a name, path, Uniform Resource Identifier (URI), Uniform Resource Locator (URL), or other identifier that identifies travel application T1 as the application to be executed.
[0100] FIG. 6B at upper left shows display 660 of computing device 412 after computing device 412 has begun execution of and provided data to travel application T1. Display 660 is a display of “Travel App” T1 that shows data provided from control signals from embedded audio signal E1. In this example, the data includes data related to Hotell, including the text “We have great rates for Hotell at Location1. Check out our specials!” and a direction to travel application T1 to provide button 662 to “Reserve Room Now”. Display 660 also shows that travel application T1 has button 664 to close or deactivate travel application T1. In scenario 600, after display 660 is provided by computing device 412, user 404 selects button 664 to close or deactivate travel application T1 as indicated at block 626 of FIG. 6A.
[0101] Scenario 600 continues with user 404 activating computing device 410, as shown at block 630, and using computing device 410 to play gaming-related content for game G1, as shown at block 632. In scenario 600, gaming-related content for game G1 is provided by a web browser application. In other scenarios, gaming-related content for game G1 can be provided by other software applications of computing device 410 (e.g., a gaming application, such as gaming application GA).
[0102] In scenario 600, the gaming-related content for game G1 includes embedded audio signal E2. In this example, embedded audio signal E2 is an audible embedded audio signal. In other examples, embedded audio signal E2 can be an inaudible embedded audio signal or a combination of audible and inaudible audio signals.
[0103] Block 632 indicates that the gaming-related content for game G1 is played by computing device 410. While the gaming-related content for game G1 is played, embedded audio signal E2 is emitted, as also indicated at block 632. Embedded audio signal E2 is received by computing device 412, as indicated by message 634 including E2. In scenario 600, message 634 with embedded audio signal E2 is transmitted from computing device 410 to computing device 412 via sound interface 432.
[0104] Computing device 412 utilizes the procedures of blocks 150 and 160 of set of functions 100 to obtain control signals from embedded audio signal E2. Then, as indicated by block 636, computing device 412 modifies its execution based on the obtained control signals to initiate execution of gaming application GA. In scenario 600, gaming application GA is identified using at least one control signal obtained from embedded audio signal E2 using the techniques discussed above in the context of block 620 and embedded audio signal El.
[0105] FIG. 6B at upper right shows display 670 of computing device 412 after gaming application GA has been executed. Display 670 includes a display of “Gaming App” GA that welcomes a potential player to “Come play with us!” Display 670 also includes close button 672 to close or deactivate the gaming application GA.
[0106] In scenario 600, computing device 410 both plays and receives embedded audio signal E2. Block 638 of FIG. 6A indicates that computing device 410 utilizes the procedures of blocks
150 and 160 of set of functions 100 to obtain control signals from embedded audio signal E2 related to initiating execution of gaming application GA. Subsequently, computing device 410 modifies its execution based on these control signals to initiate execution of gaming application GA. In scenario 600, computing device 410 is deactivated after executing gaming application GA, as indicated at block 640.
[0107] Scenario 600 continues with user 404 speaking a voice query about “Today’s Football Odds”. This voice query, which is represented in FIG. 6A by message 650 from user 404 to voice-controlled speaker system 420, is carried via sound interface 432. Upon reception of the voice query, voice-controlled speaker system 420 responds to the voice query by audibly emitting betting data to user 404, as represented in FIG. 6A by message 652 from voice-controlled speaker system 420 to user 404.
[0108] In scenario 600, the audibly emitted betting data provided by voice-controlled speaker system 420 includes embedded audio signal E3. In this example, embedded audio signal E3 is an audible embedded audio signal. In other examples, embedded audio signal E3 can be an inaudible embedded audio signal or a combination of audible and inaudible audio signals.
[0109] Embedded audio signal E3 is received by computing device 412, as indicated in FIG. 6A by message 654. The message 654 is transmitted from voice-controlled speaker system 420 to computing device 412 via sound interface 432. Computing device 412 utilizes the procedures of blocks 150 and 160 of set of functions 100 to obtain control signals from embedded audio signal E3 and subsequently updates gaming application GA with data obtained from these control signals. In scenario 600, gaming application GA is identified using at least one control signal obtained from embedded audio signal E3 using the techniques discussed above in the context of block 620 and embedded audio signal E1. In addition, the betting-related data is provided in at least one control signal obtained from embedded audio signal E3.
[0110] Computing device 412 then updates gaming application GA with the betting-related data, where the betting-related data includes data about one or more wagers capable of being made using gaming application GA. Gaming application GA receives the betting-related data and updates its display based on the betting-related data, as indicated at block 656.
[0111] FIG. 6B at lower left shows display 680 of computing device 412 after gaming application GA has updated its display with betting-related data received via embedded audio message E3. In particular, display 680 shows that the betting-related data includes information about “Today’s Football Games,” specifically a match between “Team 1” and “Team 2” in which the odds on the “Home” team are “6/1”, the odds on a “Draw” are “7/2”, and the odds on the “Away” team are “6/11”. Display 680 also includes bet button 682 which, if selected, enables a player to bet on the match between Team 1 and Team 2, as well as previously-described close button 672.
[0112] Block 658 of FIG. 6A indicates that scenario 600 continues with user 404 selecting bet button 682 of display 680 provided by gaming application GA to place a bet B1. FIG. 6B at lower right shows display 690 of computing device 412 after gaming application GA has placed bet B1 in response to the selection of bet button 682. Display 690 indicates that application GA has been used to place bet B1 on “Team1 for today’s game”. Display 690 also includes another bet button 692 which, if selected, enables a player to place another bet, as well as previously-described close button 672.
[0113] After user 404 places bet B1 on Team 1 for today’s game, user 404 deactivates computing device 412 and scenario 600 ends.
[0114] FIGS. 7A, 7B, and 7C illustrate scenario 700 taking place in environment 500, in accordance with example embodiments. Scenario 700 begins with gaming device 510 awarding a 1,000 credit jackpot to a player and providing content by audibly and visually notifying the player about the jackpot. An embedded audio signal E10 is included with the content audibly notifying the player about the jackpot, and nearby gaming device 518 receives the content with embedded audio signal E10 via sound interface 552. Gaming device 518 then generates audible and visual notifications about the 1,000 credit jackpot won at gaming device 510. Scenario 700 continues with gaming device 520 awarding a 100,000 credit jackpot to a player and providing content by audibly and visually notifying the player about the jackpot. An embedded audio signal E11 is included with the content audibly notifying the player, and gaming devices 510, 518, 530, and 540 directly or indirectly receive embedded audio signal E11 via sound interface 552. Each of gaming devices 510, 518, 530, and 540 then generates audible and visual notifications about the 100,000 credit jackpot won at gaming device 520.
[0115] Scenario 700 proceeds with public address system 554 providing content by making an announcement of a gas leak at location 502. The announcement includes embedded audio signal E12. The announcement with embedded audio signal E12 is received by each gaming device at location 502, including gaming devices 510, 518, 520, 530, and 540, directly from public address system 554 via sound interface 552. Each of gaming devices 510, 518, 520, 530, and 540 generates respective audible and visual warnings of the gas leak, and if necessary, cashes out a player of the respective gaming device. After all audible and visual warnings of the gas leak have been provided, scenario 700 ends.
[0116] FIG. 7A illustrates, at block 710, that scenario 700 begins with gaming device 510 awarding a 1,000 credit jackpot to a player. Gaming device 510 also provides content audibly and visually notifying the player about the jackpot, and an embedded audio signal E10 is included with the content audibly notifying the player about the jackpot. In this example, embedded audio signal E10 is an inaudible embedded audio signal. In other examples, embedded audio signal E10 can be an audible embedded audio signal or a combination of audible and inaudible audio signals.
[0117] As gaming device 510 emits the audio notification about the 1,000 credit jackpot, gaming device 510 also emits embedded audio signal E10. The embedded audio signal E10 is received by nearby gaming device 518 via sound interface 552, as illustrated by message 712. Message 712 includes an identifier “E10” for embedded audio signal E10, a number of credits won “1000”, and a broadcast counter set to “0”. In scenario 700, the identifier, the number of credits won, and the broadcast counter are encoded as control signals in the embedded audio signal E10.
[0118] Block 714 indicates that gaming device 518 utilizes the procedures of blocks 150 and 160 of set of functions 100 to obtain control signals from embedded audio signal E10, including the identifier, the number of credits won, and the broadcast counter. Subsequently, gaming device 518 modifies its operation based on the control signals to generate audible and visual notifications to notify the player of gaming device 518 about the 1,000 credit jackpot won at gaming device 510.
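The control-signal payload described above (an identifier, a number of credits won, and a broadcast counter) can be illustrated with a minimal encoding sketch. The disclosure does not specify a concrete byte layout, so the fixed-width format, field sizes, and function names below are illustrative assumptions:

```python
import struct

# Assumed layout: 4-byte ASCII identifier (space-padded), unsigned 32-bit
# credits-won field, signed 8-bit broadcast counter (a signed field allows
# the "-1" counter used for non-retransmitted announcements).
PAYLOAD_FMT = ">4sIb"

def encode_control_signals(identifier: str, credits_won: int, broadcast_counter: int) -> bytes:
    """Pack the control signals into the byte payload that would be modulated
    into the embedded audio signal."""
    return struct.pack(PAYLOAD_FMT,
                       identifier.encode("ascii").ljust(4),
                       credits_won,
                       broadcast_counter)

def decode_control_signals(payload: bytes):
    """Unpack the byte payload recovered from a received embedded audio signal."""
    ident, credits, counter = struct.unpack(PAYLOAD_FMT, payload)
    return ident.rstrip(b" ").decode("ascii"), credits, counter
```

For example, message 712 would round-trip as `encode_control_signals("E10", 1000, 0)`, and the gas-leak announcement could carry a counter of -1 in the same signed field.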
[0119] FIG. 7C shows, at upper left, a depiction of gaming device 518 while generating audible and visual notifications about the 1,000 credit jackpot won at gaming device 510. The audible and visual notifications include bell rings and light flashes 784 emitted by bell/light 782 of gaming device 518. The visual notifications also include display 786 that states that “One of your nearby fellow players just won 1000 credits”. Display 786 also includes close button 788 to close or stop displaying the display 786.
[0120] Gaming device 518 utilizes the procedures of block 160 of set of functions 100 and set of functions 200 to determine whether to retransmit embedded audio signal E10. In this example, because the broadcast counter for embedded audio signal E10 received via message 712 is set to zero, gaming device 518 does not retransmit embedded audio signal E10, as indicated at least by blocks 250 and 280 of set of functions 200.
[0121] Scenario 700 continues with gaming device 520 awarding a 100,000 credit jackpot to a player and providing content by audibly and visually notifying the player about the jackpot, as indicated at block 720 of FIG. 7A. The content audibly notifying the player about the 100,000 credit jackpot emitted by gaming device 520 includes embedded audio signal E11. In this example, embedded audio signal E11 is an inaudible embedded audio signal. In other examples, embedded audio signal E11 can be an audible embedded audio signal or a combination of audible and inaudible audio signals.
[0122] Embedded audio signal E11 is received via sound interface 552 by nearby gaming devices 518 and 530 as illustrated by respective messages 722 and 724 of FIG. 7A. Each of messages 722 and 724 includes an identifier “E11” of the embedded audio signal, a number of credits won “100000”, and a broadcast counter set to “2”. In scenario 700, the identifier, the number of credits won, and the broadcast counter are encoded as control signals in embedded audio signal E11.
[0123] Each of gaming devices 518 and 530 utilizes the procedures of blocks 150 and 160 of set of functions 100 to obtain control signals from embedded audio signal E11, including the identifier, the number of credits won, and the broadcast counter, as indicated by respective blocks 726 and 728. Blocks 726 and 728 indicate that respective gaming devices 518 and 530 each modifies its operation based on the control signals to generate audible and visual notifications about the 100,000 credit jackpot won at gaming device 520.
[0124] FIG. 7C shows, at upper right, a depiction of gaming device 518 while generating audible and visual notifications about the 100,000 credit jackpot won at gaming device 520. The audible and visual notifications include bell rings and light flashes 790 emitted by bell/light 782 of gaming device 518. The visual notifications also include display 792 that states that “One of your nearby fellow players just won 100000 credits”. Display 792 includes close button 788 to close or stop displaying display 792.
[0125] Each of gaming devices 518 and 530 utilizes the procedures of block 160 of set of functions 100 and set of functions 200 to determine whether to retransmit embedded audio signal E11. In this example, the broadcast counter for embedded audio signal E11 received via messages 722 and 724 is set to two. As a result, each of gaming devices 518 and 530 retransmits embedded audio signal E11 as indicated at least by blocks 250, 260, and 270 of set of functions 200 after decrementing the respective broadcast counter of messages 722 and 724 from two to one.
[0126] FIG. 7A shows that gaming device 518 retransmits embedded audio signal E11 as messages 730 and 732 with a broadcast counter of “1” to respective gaming devices 520 and 510, as well as to other gaming devices not shown in FIG. 7A. Upon reception of message 730, gaming device 520 uses at least the procedures of block 160 of set of functions 100 and block 220 of set of functions 200 to determine that gaming device 520 has recently transmitted embedded audio signal E11. As a result, gaming device 520 discards or does not act upon or retransmit message 730, as indicated at block 734.
[0127] Upon reception of message 732, gaming device 510 uses at least the procedures of block 160 of set of functions 100 and set of functions 200 to determine that gaming device 510 has not recently transmitted embedded audio signal E11 and that the broadcast counter of message 732 is one. In response to the broadcast counter being greater than zero, gaming device 510 acts upon embedded audio signal E11 by modifying its operation based on the control signals of embedded audio signal E11 to generate audible and visual notifications about the 100,000 credit jackpot won at gaming device 520, as indicated at block 736.
[0128] Gaming device 510 also decrements the broadcast counter of message 732 from one to zero and retransmits embedded audio signal E11 with the broadcast counter equal to “0” as message 738 via sound interface 552 to at least gaming device 518. Upon reception of message 738, gaming device 518 uses at least the procedures of block 160 of set of functions 100 and block 220 of set of functions 200 to determine that gaming device 518 has recently transmitted embedded audio signal E11. Therefore, gaming device 518 discards or does not act upon or retransmit message 738 as indicated at block 740.
[0129] FIG. 7A also shows that gaming device 530 retransmits embedded audio signal E11 with a broadcast counter of “1” as messages 742 and 744 to respective gaming devices 520 and 540, as well as to other gaming devices not shown in FIG. 7A. Upon reception of message 742, gaming device 520 uses at least the procedures of block 160 of set of functions 100 and block 220 of set of functions 200 to determine that gaming device 520 has recently transmitted embedded audio signal E11. Therefore, gaming device 520 discards or does not act upon or retransmit message 742 as indicated at block 746.
[0130] Upon reception of message 744, gaming device 540 uses the procedures of at least block 160 of set of functions 100 and set of functions 200 to determine that gaming device 540 has not recently transmitted embedded audio signal E11 and that the broadcast counter of message 744 is one. Because the broadcast counter is greater than zero, gaming device 540 acts upon embedded audio signal E11 by modifying its operation based on the control signals of embedded audio signal E11 to generate audible and visual notifications to notify the player of gaming device 540 about the 100,000 credit jackpot won at gaming device 520, as indicated at block 748.
[0131] As shown in FIG. 7B, gaming device 540 also decrements the broadcast counter of message 744 from one to zero and retransmits embedded audio signal E11 with the broadcast counter equal to “0” as message 750 via sound interface 552 to at least gaming device 530. Upon reception of message 750, gaming device 530 uses the procedures of at least block 160 of set of functions 100 and block 220 of set of functions 200 to determine that gaming device 530 has recently transmitted embedded audio signal E11. Therefore, gaming device 530 discards or does not act upon or retransmit message 750, as indicated at block 752.
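The retransmission behavior walked through above, attributed to blocks 220, 250, 260, 270, and 280 of set of functions 200, amounts to a small decision routine: discard signals the device itself recently transmitted, act without retransmitting when the broadcast counter is at or below zero, and otherwise act, decrement the counter, and retransmit. The class, method, and return values below are a sketch under assumed names, not the disclosure's implementation:

```python
from dataclasses import dataclass, field

@dataclass
class RebroadcastPolicy:
    """Per-device retransmission decision for embedded audio signals."""
    recently_sent: set = field(default_factory=set)  # identifiers this device transmitted recently

    def handle(self, identifier: str, broadcast_counter: int):
        # Block 220: discard a signal this device itself transmitted recently.
        if identifier in self.recently_sent:
            return ("discard", None)
        # Block 250/280: act on the signal but do not retransmit when the
        # counter is zero (or negative, as with the "-1" gas-leak announcement).
        if broadcast_counter <= 0:
            return ("act_only", None)
        # Blocks 260/270: act, remember the transmission, decrement, retransmit.
        self.recently_sent.add(identifier)
        return ("act_and_retransmit", broadcast_counter - 1)
```

For instance, gaming device 518 receiving E10 with counter 0 acts without retransmitting; receiving E11 with counter 2 it acts and retransmits with counter 1; receiving its own retransmission of E11 back (as in message 738) it discards it.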
[0132] Scenario 700 proceeds with a gas leak occurring at location 502. Upon detection of the gas leak, public address system 554 is used to make an announcement about the gas leak, as indicated by block 760. At the time of the announcement, gaming device 530 is not in use to play a game, whereas gaming devices 510, 518, 520, and 540 are in use to play games.
[0133] In scenario 700, computing device 556 is used to generate an embedded audio signal E12 that is played over public address system 554 during the announcement. As such, content provided by public address system 554 includes both the announcement about the gas leak and embedded audio signal E12. In this example, embedded audio signal E12 includes both audible and inaudible audio signals. In other examples, embedded audio signal E12 can be provided as an audible audio signal or as an inaudible embedded audio signal.
[0134] In scenario 700, the announcement with embedded audio signal E12 is received, via sound interface 552, by each gaming device at location 502, including gaming devices 510, 518, 520, 530, and 540. FIG. 7A illustrates the reception of the announcement and embedded audio signal E12 at gaming devices 510, 518, 520, 530, and 540 by respective messages 778, 774, 770, 766, and 762.
[0135] Upon reception of message 762, gaming device 540 uses the procedures of at least block 160 of set of functions 100 and set of functions 200 to determine that gaming device 540 has not recently transmitted embedded audio signal E12. Therefore, gaming device 540 acts upon embedded audio signal E12 by modifying its operation based on the control signals of embedded audio signal E12 to generate audible and visual warnings to notify the player of gaming device 540 about the gas leak, as indicated at block 764. Because gaming device 540 is in use when message 762 is received and message 762 is related to an emergency situation (i.e., a gas leak), gaming device 540 also cashes out its player in accordance with the control signals of embedded audio signal E12, as indicated at block 764. In one example, gaming device 540 cashes out its player by printing a credit ticket for the number of credits due to the player at the time of the announcement and ceases playing the game. In other examples, a gaming device can cash out its player by dispensing coins, tokens, and/or currency rather than printing a credit ticket. In scenario 700, the broadcast counter of message 762 is “-1”. Because the broadcast counter is less than zero, gaming device 540 does not retransmit message 762, based on at least block 250 of set of functions 200.
[0136] Upon reception of message 766, gaming device 530 uses the procedures of at least block 160 of set of functions 100 and set of functions 200 to determine that gaming device 530 has not recently transmitted embedded audio signal E12. Therefore, gaming device 530 acts upon embedded audio signal E12 by modifying its operation based on the control signals of embedded audio signal E12 to generate audible and visual warnings about the gas leak as indicated at block 768. Because gaming device 530 is not in use when message 766 is received, gaming device 530 does not cash out a player, as also indicated at block 768. As the broadcast counter of message 766 is “-1”, gaming device 530 does not retransmit message 766, based on at least block 250 of set of functions 200.
[0137] Upon reception of message 770, gaming device 520 uses the procedures of at least block 160 of set of functions 100 and set of functions 200 to determine that gaming device 520 has not recently transmitted embedded audio signal E12. Therefore, gaming device 520 acts upon embedded audio signal E12 by modifying its operation based on the control signals of embedded audio signal E12 to generate audible and visual warnings to notify the player of gaming device 520 about the gas leak, as indicated at block 772. Because gaming device 520 is in use when message 770 is received and message 770 is related to an emergency situation, gaming device 520 cashes out its player in accordance with the control signals of embedded audio signal E12, as indicated at block 772. In scenario 700, gaming device 520 cashes out its player by printing a credit ticket using the techniques discussed above at least in the context of message 762 and block 764. As the broadcast counter of message 770 is “-1”, gaming device 520 does not retransmit message 770, based on at least block 250 of set of functions 200.
[0138] Upon reception of message 774, gaming device 518 uses the procedures of at least block 160 of set of functions 100 and set of functions 200 to determine that gaming device 518 has not recently transmitted embedded audio signal E12. Therefore, gaming device 518 acts upon embedded audio signal E12 by modifying its operation based on the control signals of embedded audio signal E12 to generate audible and visual warnings to notify the player of gaming device 518 about the gas leak, as indicated at block 776. Because gaming device 518 is in use when message 774 is received and message 774 is related to an emergency situation, gaming device 518 cashes out its player in accordance with the control signals of embedded audio signal E12, as indicated at block 776. In scenario 700, gaming device 518 cashes out its player by printing a credit ticket using the techniques discussed above at least in the context of message 762 and block 764.
[0139] FIG. 7C shows, at lower center, a depiction of gaming device 518 while generating audible and visual warnings of the gas leak. The audible and visual warnings include bell rings and light flashes 794 emitted by bell/light 782 of gaming device 518. The visual warnings also include display 796 that informs a player (and/or other persons in sight of gaming device 518) of “an issue with a possible gas leak” and directs the player of gaming device 518 to “see casino safety personnel now”. Display 796 also informs the player of gaming device 518 that the player has been cashed out, stating that “this game is closed until the issue is fixed. A ticket for your balance of 75 credits is being printed”.
[0140] Returning to FIG. 7B, as the broadcast counter of message 774 is “-1”, gaming device 518 does not retransmit message 774, based on at least block 250 of set of functions 200.
[0141] Upon reception of message 778, gaming device 510 uses the procedures of at least block 160 of set of functions 100 and set of functions 200 to determine that gaming device 510 has not recently transmitted embedded audio signal E12. Therefore, gaming device 510 acts upon embedded audio signal E12 by modifying its operation based on the control signals of embedded audio signal E12 to generate audible and visual warnings to notify the player of gaming device 510 about the gas leak as indicated at block 780. Because gaming device 510 is in use when message 778 is received and message 778 is related to an emergency situation, gaming device 510 cashes out its player in accordance with the control signals of embedded audio signal E12, as indicated at block 780. In scenario 700, gaming device 510 cashes out its player by printing a credit ticket using the techniques discussed above at least in the context of message 762 and block 764. As the broadcast counter of message 778 is “-1”, gaming device 510 does not retransmit message 778 as indicated at least by block 250 of set of functions 200.
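The emergency-announcement handling described for messages 762 through 778 combines three decisions: every device warns its surroundings, only devices that are in use cash out a player (e.g., by printing a credit ticket for the credits due), and the "-1" broadcast counter suppresses all retransmission. A minimal sketch follows; the function name, the action labels, and the list-of-actions representation are assumptions for illustration:

```python
def handle_emergency_signal(in_use: bool, credits_due: int, broadcast_counter: int):
    """Decide a gaming device's response to an emergency control signal,
    such as the gas-leak announcement carried by embedded audio signal E12."""
    # All devices generate audible and visual warnings of the emergency.
    actions = ["emit_audible_warning", "emit_visual_warning"]
    # Only a device with an active player cashes out, e.g. by printing
    # a credit ticket for the player's balance at announcement time.
    if in_use:
        actions.append(f"print_credit_ticket:{credits_due}")
    # A counter at or below zero (here "-1") suppresses retransmission.
    if broadcast_counter > 0:
        actions.append("retransmit")
    return actions
```

Applied to the scenario, gaming device 518 (in use, 75 credits due, counter -1) warns, prints a 75-credit ticket, and does not retransmit, while idle gaming device 530 only warns.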
[0142] After all audible and visual warnings of the gas leak have been provided and all players cashed out at location 502, scenario 700 ends.
V. EXAMPLE OPERATION
[0143] FIG. 8 is a flow chart of functions to carry out method 800, in accordance with example embodiments. The functions are shown within blocks 810 to 830. The functions of method 800 can be carried out by a computing device, such as computing device 300 described above in the context of at least FIG. 3.
[0144] Method 800 can begin at block 810, where a computing device can receive content with an embedded audio signal at a sound detector of the computing device, as described above in the context of at least FIGS. 1 and 6A-7C. In some examples, the embedded audio signal can include an audible audio signal, as described above in the context of at least FIG. 1. In other examples, the embedded audio signal can include an inaudible audio signal, as described above in the context of at least FIG. 1.
[0145] In some examples, receiving the content with the embedded audio signal can include receiving the content with the embedded audio signal from at least one other device, as described above in the context of at least FIGS. 1 and 6A-7C. The at least one other device can include one or more of: an additional computing device and a loudspeaker, as described above in the context of at least FIGS. 1 and 6A-7C. In particular examples, the at least one other device can include a voice-controlled speaker system, and the voice-controlled speaker system can include a loudspeaker, as described above in the context of at least FIGS. 4, 6A, and 6B. In some examples, receiving the content with the embedded audio signal from the at least one other device includes receiving, at the voice-controlled speaker system, an audible input related to the one or more software applications, as described above in the context of at least FIGS. 6A and 6B. In some examples, the audible input can include a query related to a particular type of gaming, and the one or more software applications can include a gaming-related software application that is related to the particular type of gaming, as described above in the context of at least FIGS. 6A and 6B.
[0146] At block 820, the computing device can determine one or more control signals from the embedded audio signal, as described above in the context of at least FIGS. 1 and 6A-7C.
[0147] At block 830, the computing device can modify execution of one or more software applications of the computing device based on the one or more control signals, as described above in the context of at least FIGS. 1, 2, and 6A-7C.
[0148] In some examples, modifying execution of the one or more software applications of the computing device includes initiating execution of at least one software application of the one or more software applications, as described above in the context of at least FIGS. 1, 6A, and 6B. In some examples, the at least one software application is identified using at least one control signal of the one or more control signals, as described above in the context of at least FIGS. 1, 6A, and 6B.
[0149] In other examples, modifying execution of the one or more software applications of the computing device can include providing data to at least one software application of the one or more software applications, where the data is encoded in at least one control signal of the one or more control signals, as described above in the context of at least FIGS. 1, 6A, and 6B.
[0150] In still other examples, the one or more software applications can include a gaming-related software application, as described above in the context of at least FIGS. 1, 6A, and 6B. In some of these examples, modifying execution of one or more software applications can include: initiating execution of the gaming-related software application on the computing device; and providing data to the gaming-related software application, where the data is encoded in the one or more control signals, as described above in the context of at least FIGS. 1, 6A, and 6B. In other of these examples, the content with the embedded audio signal includes gaming-related content, as described above in the context of at least FIGS. 1, 6A, and 6B. In still other of these examples, the gaming-related software application includes a software application related to sports wagering, and the one or more control signals can encode information about one or more wagers capable of being made using the software application related to sports wagering, as described above in the context of at least FIGS. 1, 6A, and 6B. In yet other examples, the one or more software applications can include a travel-related software application and/or a software application related to online purchasing, as described above in the context of at least FIGS. 1, 6A, and 6B.
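The modifications of execution described in blocks 820 and 830 — identifying an application from at least one control signal, initiating its execution if needed, and providing it data encoded in other control signals — can be sketched as a small dispatcher. The dictionary layout with "app_id" and "data" keys and the function name are illustrative assumptions, not a format defined by the disclosure:

```python
def modify_execution(control_signals: dict, running_apps: dict):
    """Apply control signals decoded from an embedded audio signal to the
    computing device's software applications (a sketch of block 830)."""
    # At least one control signal identifies the target application.
    app_id = control_signals["app_id"]
    app = running_apps.get(app_id)
    if app is None:
        # Initiate execution when the identified application is not running.
        app = {"id": app_id, "data": []}
        running_apps[app_id] = app
    # Further control signals may carry data for the application,
    # e.g. betting-related data for a gaming application.
    payload = control_signals.get("data")
    if payload is not None:
        app["data"].append(payload)
    return app
```

In the style of scenario 600, a first signal could start a gaming application and a later signal could feed it betting-related data without restarting it.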
[0151] In some examples, method 800 further includes: presenting the content with the embedded audio signal using the computing device, as described above in the context of at least FIGS. 1 and 6A-7C.
[0152] FIG. 9 is a flow chart of functions to carry out method 900, in accordance with example embodiments. The functions are shown within blocks 910 to 930. The functions of method 900 can be carried out by a computing device acting as a gaming device, such as computing device 300 described above in the context of at least FIG. 3.
[0153] Method 900 can begin at block 910, where a gaming device can receive content with an embedded audio signal at a sound detector of the gaming device, as described above in the context of at least FIGS. 1, 7A, 7B, and 7C. In some examples, the embedded audio signal can include an audible audio signal, as described above in the context of at least FIG. 1. In other examples, the embedded audio signal can include an inaudible audio signal, as described above in the context of at least FIG. 1.
[0154] In some examples, receiving the content with the embedded audio signal can include receiving the content with the embedded audio signal from at least one other device, as described above in the context of at least FIGS. 7A-7C. In some of these examples, the at least one other device can include one or more of: an additional gaming device and a loudspeaker, as described above in the context of at least FIGS. 7A, 7B, and 7C. In other of these examples, the at least one other device can include an additional gaming device and the embedded audio signal can include an embedded audio signal related to a winning wager on the additional gaming device, as described above in the context of at least FIGS. 7A, 7B, and 7C. In still other of these examples, the embedded audio signal can include an embedded audio signal related to an emergency condition; and determining the one or more control signals can include determining one or more control signals related to the emergency condition from the embedded audio signal related to the emergency condition, as described above in the context of at least FIGS. 7B and 7C. In yet other of these examples, generating the output of the gaming device can include generating a visual and/or audible output of the gaming device related to the emergency condition, as described above in the context of at least FIGS. 7B and 7C. Further, the gaming device can include a ticket printer; and generating the output of the gaming device can include generating a ticket using the ticket printer based on the one or more control signals related to the emergency condition, as described above in the context of at least FIGS. 7B and 7C.
[0155] At block 920, the gaming device can determine one or more control signals from the embedded audio signal, as described above in the context of at least FIGS. 1, 2, 7A, 7B, and 7C. In some examples, the one or more control signals include a broadcast counter related to broadcasting at least a portion of the embedded audio signal, as described above in the context of at least FIGS. 2, 7A, 7B, and 7C.
[0156] At block 930, the gaming device can generate an output of the gaming device based on the one or more control signals, as described above in the context of at least FIGS. 1, 2, 7A, 7B, and 7C. In some examples, the gaming device can include an audible output device, and generating the output of the gaming device can include generating an output related to the winning wager using the audible output device, as described above in the context of at least FIGS. 7A and 7C.
[0157] In some examples, generating the output of the gaming device can include: determining whether the broadcast counter is above a threshold value; after determining that the broadcast counter is above the threshold value, decrementing the broadcast counter; determining one or more broadcast control signals based on the one or more control signals and the decremented broadcast counter; generating a broadcast audio signal based on the broadcast control signals; and generating an output of the gaming device that includes the broadcast audio signal, as described above in the context of at least FIGS. 2, 7A, and 7C. In other examples, generating the output of the gaming device can include: determining whether the broadcast counter is above a threshold value; and after determining that the broadcast counter is not above the threshold value, generating an output of the gaming device that does not include the broadcast audio signal, as described above in the context of at least FIGS. 2, 7A, 7B, and 7C.
[0158] FIG. 10 is a flow chart of functions to carry out method 1000, in accordance with example embodiments. The functions are shown within blocks 1010 to 1030. The functions of method 1000 can be carried out by a computing device, such as computing device 300 described above in the context of at least FIG. 3.
[0159] Method 1000 can begin at block 1010, where a computing device can receive content that includes an embedded audio signal, as described above in the context of at least FIGS. 1 and 6A-7C. In some examples, the embedded audio signal can include an audible audio signal, as described above in the context of at least FIG. 1. In other examples, the embedded audio signal can include an inaudible audio signal, as described above in the context of at least FIG. 1. In some examples, the computing device can include a gaming device.
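One generic way a sound detector can recover data from an inaudible embedded audio signal of the kind mentioned in paragraph [0159] is tone detection in a near-ultrasonic band. The sketch below is not the patent's method: the sampling rate, frame length, two-tone FSK scheme, and carrier frequencies are all assumptions chosen for illustration.

```python
import math

SAMPLE_RATE = 44100          # assumed sampling rate
F0, F1 = 18500.0, 19500.0    # hypothetical near-ultrasonic carriers


def goertzel_power(samples, freq, sample_rate=SAMPLE_RATE):
    """Signal power at `freq` via the Goertzel recurrence."""
    n = len(samples)
    k = round(n * freq / sample_rate)
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2


def decode_bit(frame):
    """One FSK bit per frame: whichever carrier is stronger wins."""
    return 1 if goertzel_power(frame, F1) > goertzel_power(frame, F0) else 0
```

With a 0.1 s frame (4410 samples), both carriers fall exactly on 10 Hz analysis bins, so a pure tone at one carrier contributes essentially no power at the other; the Goertzel recurrence avoids computing a full FFT when only two frequencies matter.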
[0160] In some examples, receiving the content with the embedded audio signal can include receiving the content with the embedded audio signal from at least one other device, as described above in the context of at least FIGS. 1 and 6A-7C. In some examples, the at least one other device can include one or more of: an additional computing device, an additional gaming device, and/or a loudspeaker, as described above in the context of at least FIGS. 1 and 6A-7C. In some examples, the at least one other device can include a voice-controlled speaker system and the voice-controlled speaker system can include a loudspeaker, as described above in the context of at least FIGS. 4, 6A, and 6B. In these examples, receiving the content with the embedded audio signal from the at least one other device can include receiving, at the voice-controlled speaker system, an audible input related to the one or more software applications, as described above in the context of at least FIGS. 6A and 6B. In some of these examples, the audible input includes a query related to a particular type of gaming and where the one or more software applications include a gaming-related software application that is related to the particular type of gaming, as described above in the context of at least FIGS. 6A and 6B. In other of these examples, the at least one other device can include an additional gaming device and the embedded audio signal can include an embedded audio signal related to a winning wager on the additional gaming device, as described above in the context of at least FIGS. 7A, 7B, and 7C.
[0161] At block 1020, the computing device can determine one or more operations to be performed by the computing device, the one or more operations based on the embedded audio signal, as described above in the context of at least FIGS. 1 and 6A-7C. In some examples, the one or more operations can include one or more of: initiating execution of a first software application of the computing device; providing data to a second software application of the computing device, the data based on the embedded audio signal; generating a visual and/or audible output of the computing device; broadcasting the embedded audio signal; and generating a ticket using a ticket printer of the computing device, as described above in the context of at least FIGS. 1 and 6A-7C. In some of these examples, generating the visual and/or audible output of the computing device can include generating a visual and/or audible output related to an emergency condition, as described above in the context of at least FIGS. 7B and 7C.
[0162] In other examples, the embedded audio signal can include one or more control signals; and determining the one or more operations to be performed by the computing device can include determining one or more operations to be performed by the computing device based on the control signals, as described above in the context of at least FIGS. 1, 2, and 6A-7C. In still other examples, determining the one or more operations to be performed by the computing device based on the embedded audio signal can include decrypting and/or decompressing the embedded audio signal, as described above in the context of at least FIG. 1. In some examples, the embedded audio signal can include an embedded audio signal related to an emergency condition; and determining the one or more operations to be performed by the computing device based on the control signals can include determining one or more control signals related to the emergency condition from the embedded audio signal related to the emergency condition, as described above in the context of at least FIGS. 7B and 7C. In these examples, modifying execution of the computing device can include generating a visual and/or audible output of the computing device related to the emergency condition, as described above in the context of at least FIGS. 7B and 7C. Further, the computing device can include a ticket printer; and modifying execution of the computing device can include generating a ticket using the ticket printer based on the one or more control signals related to the emergency condition, as described above in the context of at least FIGS. 7B and 7C.
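Paragraph [0162] notes that determining the operations can include decrypting and/or decompressing the embedded audio signal. A minimal round-trip sketch of that receive path follows; the toy XOR cipher and the `demo-key` value are placeholders for whatever real cipher and key management a deployment would use, and only the compress-then-encrypt ordering is assumed.

```python
import zlib

XOR_KEY = b"demo-key"  # illustrative shared key, not from the disclosure


def xor_stream(data, key=XOR_KEY):
    # Toy symmetric "cipher" standing in for a real decryption step.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))


def make_payload(control_bytes):
    # Sender side: compress, then encrypt.
    return xor_stream(zlib.compress(control_bytes))


def recover_control_signals(payload):
    """Receiver side: decrypt, then decompress, the demodulated bytes."""
    return zlib.decompress(xor_stream(payload))
```

Because XOR is its own inverse, applying `xor_stream` twice restores the compressed bytes, and `zlib.decompress` then yields the original control data.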
[0163] At block 1030, the computing device can modify execution of the computing device by performing the one or more operations using the computing device, as described above in the context of at least FIGS. 1, 2, and 6A-7C. In some examples, modifying execution of the computing device can include modifying execution of one or more software applications of the computing device, as described above in the context of at least FIGS. 1, 2, and 6A-7C. In some of these examples, modifying execution of the one or more software applications includes initiating execution of at least one software application of the one or more software applications, as described above in the context of at least FIGS. 1, 6A, and 6B. In some examples, the at least one software application can be identified using at least one control signal of the embedded audio signal, as described above in the context of at least FIGS. 1, 6A, and 6B.
[0164] In some of these examples, modifying execution of the one or more software applications of the computing device can include providing data to at least one software application of the one or more software applications, where the data is encoded in at least one control signal of the one or more control signals, as described above in the context of at least FIGS. 1, 6A, and 6B. In some of these examples, the one or more software applications can include a gaming-related software application, as described above in the context of at least FIGS. 1, 6A, and 6B. In other of these examples, modifying execution of one or more software applications can include: initiating execution of the gaming-related software application on the computing device; and providing data to the gaming-related software application, where the data is encoded in the one or more control signals, as described above in the context of at least FIGS. 1, 6A, and 6B. In still other of these examples, the content with the embedded audio signal includes gaming-related content, as described above in the context of at least FIGS. 1, 6A, and 6B. In some examples, the gaming-related software application can include a software application related to sports wagering; and the one or more control signals can encode information about one or more wagers capable of being made using the software application related to sports wagering, as described above in the context of at least FIGS. 1, 6A, and 6B. In some examples, the one or more software applications can include a travel-related software application and/or a software application related to online purchasing, as described above in the context of at least FIGS. 1, 6A, and 6B.

[0165] In some examples, method 1000 further includes: presenting the content with the embedded audio signal using the computing device, as described above in the context of at least FIGS. 1 and 6A-7C.
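The modify-execution step described in paragraphs [0163] and [0164], identifying a software application from a control signal, initiating it, and providing it the decoded data, can be sketched as a registry dispatch. All names here (`APP_REGISTRY`, `app_id`, the `sports_wagering` handler) are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical dispatcher; the registry, signal fields, and handler names
# are illustrative, not taken from the disclosure.

APP_REGISTRY = {}


def register(app_id):
    def decorator(fn):
        APP_REGISTRY[app_id] = fn
        return fn
    return decorator


@register("sports_wagering")
def sports_wagering_app(data):
    # A real app would present these wagers to the user; here we echo them.
    return f"wagers offered: {data}"


def modify_execution(control_signals):
    """Launch the app named in the control signals and hand it the data."""
    handler = APP_REGISTRY.get(control_signals.get("app_id"))
    if handler is None:
        return None  # unknown application: ignore the signal
    return handler(control_signals.get("data", ""))
```

A registry keyed on a control-signal field keeps the mapping between signals and applications data-driven, so new gaming-related, travel-related, or purchasing applications can be added without changing the dispatch logic.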
VI. ADDITIONAL EXAMPLE EMBODIMENTS

[0166] From one perspective, there have been described methods and apparatus related to modifying execution of a computing device by providing data over audio using audio signals. A computing device can receive content that includes an embedded audio signal at a sound detector. The computing device can determine one or more control signals from the embedded audio signal. The computing device can modify execution of one or more software applications of the computing device based on the one or more control signals.
[0167] The following clauses are offered as further description of the disclosure.
[0168] Clause 1 - A method, including: receiving content with an embedded audio signal at a sound detector of a computing device; determining one or more control signals from the embedded audio signal at the computing device; and modifying execution of one or more software applications of the computing device based on the one or more control signals.
[0169] Clause 2 - The method of Clause 1, where the embedded audio signal includes an audible audio signal.
[0170] Clause 3 - The method of either Clause 1 or Clause 2, where the embedded audio signal includes an inaudible audio signal.
[0171] Clause 4 - The method of any one of Clauses 1-3, further including: presenting the content with the embedded audio signal using the computing device.
[0172] Clause 5 - The method of any one of Clauses 1-4, where receiving the content with the embedded audio signal includes receiving the content with the embedded audio signal from at least one other device.
[0173] Clause 6 - The method of Clause 5, where the at least one other device includes one or more of: an additional computing device and a loudspeaker.
[0174] Clause 7 - The method of Clause 6, where the at least one other device includes a voice-controlled speaker system, the voice-controlled speaker system including a loudspeaker.
[0175] Clause 8 - The method of Clause 7, where receiving the content with the embedded audio signal from the at least one other device includes receiving, at the voice-controlled speaker system, an audible input related to the one or more software applications.
[0176] Clause 9 - The method of Clause 8, where the audible input includes a query related to a particular type of gaming and where the one or more software applications include a gaming-related software application that is related to the particular type of gaming.
[0177] Clause 10 - The method of any one of Clauses 1-9, where modifying execution of the one or more software applications of the computing device includes initiating execution of at least one software application of the one or more software applications.
[0178] Clause 11 - The method of Clause 10, where the at least one software application is identified using at least one control signal of the one or more control signals.
[0179] Clause 12 - The method of any one of Clauses 1-11, where modifying execution of the one or more software applications of the computing device includes providing data to at least one software application of the one or more software applications, where the data is encoded in at least one control signal of the one or more control signals.
[0180] Clause 13 - The method of any one of Clauses 1-12, where the one or more software applications include a gaming-related software application.
[0181] Clause 14 - The method of Clause 13, where modifying execution of one or more software applications includes: initiating execution of the gaming-related software application on the computing device; and providing data to the gaming-related software application, where the data is encoded in the one or more control signals.
[0182] Clause 15 - The method of either Clause 13 or Clause 14, where the content with the embedded audio signal includes gaming-related content.
[0183] Clause 16 - The method of any one of Clauses 13-15, where the gaming-related software application includes a software application related to sports wagering, and where the one or more control signals encode information about one or more wagers capable of being made using the software application related to sports wagering.
[0184] Clause 17 - The method of any one of Clauses 1-16, where the one or more software applications include a travel-related software application and/or a software application related to online purchasing.
[0185] Clause 18 - The method of any of Clauses 1-17, where the computing device includes a mobile computing device.
[0186] Clause 19 - A computing device, including: one or more processors; and data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the computing device to carry out functions including the method of any one of Clauses 1-18.
[0187] Clause 20 - The computing device of Clause 19, where the data storage includes a non-transitory computer-readable medium.
[0188] Clause 21 - A computer-readable medium configured to store instructions that, when executed by one or more processors of a computing device, cause the computing device to carry out functions including the method of any one of Clauses 1-18.
[0189] Clause 22 - The computer-readable medium of Clause 21, where the computer-readable medium includes a non-transitory computer-readable medium.
[0190] Clause 23 - A method, including: receiving content with an embedded audio signal at a sound detector of a gaming device; determining one or more control signals from the embedded audio signal at the gaming device; and generating an output of the gaming device that is based on the one or more control signals.
[0191] Clause 24 - The method of Clause 23, where the embedded audio signal includes an audible audio signal.
[0192] Clause 25 - The method of either Clause 23 or Clause 24, where the embedded audio signal includes an inaudible audio signal.
[0193] Clause 26 - The method of any one of Clauses 23-25, where receiving the content with the embedded audio signal includes receiving the content with the embedded audio signal from at least one other device.
[0194] Clause 27 - The method of Clause 26, where the at least one other device includes one or more of: an additional gaming device and a loudspeaker.
[0195] Clause 28 - The method of Clause 27, where the at least one other device includes an additional gaming device, and where the embedded audio signal includes an embedded audio signal related to a winning wager on the additional gaming device.
[0196] Clause 29 - The method of either Clause 27 or Clause 28, where the gaming device includes an audible output device, and where generating the output of the gaming device includes generating an output related to the winning wager using the audible output device.
[0197] Clause 30 - The method of any one of Clauses 27-29, where the embedded audio signal includes an embedded audio signal related to an emergency condition, and where determining the one or more control signals includes determining one or more control signals related to the emergency condition from the embedded audio signal related to the emergency condition.
[0198] Clause 31 - The method of Clause 30, where generating the output of the gaming device includes generating a visual and/or audible output of the gaming device related to the emergency condition.
[0199] Clause 32 - The method of either Clause 30 or Clause 31, where the gaming device includes a ticket printer, and where generating the output of the gaming device includes generating a ticket using the ticket printer based on the one or more control signals related to the emergency condition.
[0200] Clause 33 - The method of any one of Clauses 23-32, where the one or more control signals include a broadcast counter related to broadcasting at least a portion of the embedded audio signal.
[0201] Clause 34 - The method of Clause 33, where generating the output of the gaming device includes: determining whether the broadcast counter is above a threshold value; after determining that the broadcast counter is above the threshold value, decrementing the broadcast counter; determining one or more broadcast control signals based on the one or more control signals and the decremented broadcast counter; generating a broadcast audio signal based on the broadcast control signals; and generating an output of the gaming device that includes the broadcast audio signal.
[0202] Clause 35 - The method of either Clause 33 or Clause 34, where generating the output of the gaming device includes: determining whether the broadcast counter is above a threshold value; and after determining that the broadcast counter is not above the threshold value, generating an output of the gaming device that does not include the broadcast audio signal.
[0203] Clause 36 - A gaming device, including: one or more processors; and data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the gaming device to carry out functions including the method of any one of Clauses 23-35.
[0204] Clause 37 - The gaming device of Clause 36, where the data storage includes a non-transitory computer-readable medium.
[0205] Clause 38 - A computer-readable medium configured to store instructions that, when executed by one or more processors of a gaming device, cause the gaming device to carry out functions including the method of any one of Clauses 23-35.
[0206] Clause 39 - The computer-readable medium of Clause 38, where the computer-readable medium includes a non-transitory computer-readable medium.
[0207] Clause 40 - A method, including: receiving, at a computing device, content including an embedded audio signal; determining one or more operations to be performed by the computing device, the one or more operations based on the embedded audio signal; and modifying execution of the computing device by performing the one or more operations using the computing device.
[0208] Clause 41 - The method of Clause 40, where the embedded audio signal includes an audible audio signal.
[0209] Clause 42 - The method of either Clause 40 or Clause 41, where the embedded audio signal includes an inaudible audio signal.
[0210] Clause 43 - The method of any one of Clauses 40-42, where the one or more operations include one or more of: initiating execution of a first software application of the computing device; providing data to a second software application of the computing device, the data based on the embedded audio signal; generating a visual and/or audible output of the computing device; broadcasting the embedded audio signal; and generating a ticket using a ticket printer of the computing device.
[0211] Clause 44 - The method of Clause 43, where generating the visual and/or audible output of the computing device includes generating a visual and/or audible output related to an emergency condition.
[0212] Clause 45 - A computing device, including: one or more processors; and data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the computing device to carry out functions including the method of any one of Clauses 40-44.
[0213] Clause 46 - The computing device of Clause 45, where the data storage includes a non-transitory computer-readable medium.
[0214] Clause 47 - A computer-readable medium configured to store instructions that, when executed by one or more processors of a computing device, cause the computing device to carry out functions including the method of any one of Clauses 40-44.
[0215] Clause 48 - The computer-readable medium of Clause 47, where the computer-readable medium includes a non-transitory computer-readable medium.
VII. CONCLUSION

[0216] Example embodiments have been described above. Those skilled in the art will understand that changes and modifications can be made to the described embodiments without departing from the true scope of the described embodiments as claimed.
[0217] This detailed description describes various features and functions of the disclosed systems, devices, and methods with reference to the accompanying figures. In the figures, similar symbols typically identify similar components, unless context dictates otherwise. The illustrative embodiments described in the detailed description, figures, and claims are not meant to be limiting. Other embodiments can be used, and other changes can be made, without departing from the spirit or scope of the subject matter presented herein. It will be readily understood that the aspects of the present disclosure, as generally described herein, and illustrated in the figures, can be arranged, substituted, combined, separated, and designed in a wide variety of different configurations, all of which are explicitly contemplated herein.
[0218] With respect to any or all of the message flow diagrams, scenarios, and flow charts in the figures and as discussed herein, each step, block and/or communication can represent a processing of information and/or a transmission of information in accordance with example embodiments. Alternative embodiments are included within the scope of these example embodiments. In these alternative embodiments, for example, functions described as steps, blocks, transmissions, communications, requests, responses, and/or messages can be executed out of order from that shown or discussed, including in substantially concurrent or in reverse order, depending on the functionality involved. Further, more or fewer steps, blocks and/or functions can be used with any of the message flow diagrams, scenarios, and flow charts discussed herein, and these message flow diagrams, scenarios, and flow charts can be combined with one another, in part or in whole.
[0219] A step or block that represents a processing of information can correspond to circuitry that can be configured to perform the specific logical functions of a herein-described method or technique. Alternatively or additionally, a step or block that represents a processing of information can correspond to a module, a segment, or a portion of program code (including related data). The program code can include one or more instructions executable by a processor for implementing specific logical functions or actions in the method or technique. The program code and/or related data can be stored on any type of computer-readable medium such as a storage device including a disk or hard drive or other storage media.
[0220] The computer-readable medium can include non-transitory computer-readable media such as computer-readable media that stores data for short periods of time like register memory, processor cache, and/or random access memory (RAM). The computer-readable media can include non-transitory computer-readable media that stores program code and/or data for longer periods of time, such as secondary or persistent long term storage, like read only memory (ROM), optical or magnetic disks, and/or compact-disc read only memory (CD-ROM), for example. The computer-readable media can be any other volatile or non-volatile storage systems. A computer-readable medium can be considered a computer-readable storage medium, for example, or a tangible storage device.
[0221] Software for use in carrying out the herein-described embodiments can also be in transitory form, for example in the form of signals transmitted over a network such as the Internet. Moreover, a step or block that represents one or more information transmissions can correspond to information transmissions between software and/or hardware modules in the same physical device. However, other information transmissions can be between software modules and/or hardware modules in different physical devices.
[0222] While various aspects and embodiments have been disclosed herein, other aspects and embodiments will be apparent to those skilled in the art. The various aspects and embodiments disclosed herein are for purposes of illustration and are not intended to be limiting.
Claims (48)
1. A method, comprising:
receiving content with an embedded audio signal at a sound detector of a computing device;
determining one or more control signals from the embedded audio signal at the computing device; and modifying execution of one or more software applications of the computing device based on the one or more control signals.
2. The method of claim 1, wherein the embedded audio signal comprises an audible audio signal.
3. The method of either claim 1 or 2, wherein the embedded audio signal comprises an inaudible audio signal.
4. The method of any one of claims 1-3, further comprising:
presenting the content with the embedded audio signal using the computing device.
5. The method of any one of claims 1-4, wherein receiving the content with the embedded audio signal comprises receiving the content with the embedded audio signal from at least one other device.
6. The method of claim 5, wherein the at least one other device comprises one or more of: an additional computing device and a loudspeaker.
7. The method of claim 6, wherein the at least one other device comprises a voice-controlled speaker system, the voice-controlled speaker system including a loudspeaker.
8. The method of claim 7, wherein receiving the content with the embedded audio signal from the at least one other device comprises receiving, at the voice-controlled speaker system, an audible input related to the one or more software applications.
9. The method of claim 8, wherein the audible input comprises a query related to a particular type of gaming and wherein the one or more software applications comprise a gaming-related software application that is related to the particular type of gaming.
10. The method of any one of claims 1-9, wherein modifying execution of the one or more software applications of the computing device comprises initiating execution of at least one software application of the one or more software applications.
11. The method of claim 10, wherein the at least one software application is identified using at least one control signal of the one or more control signals.
12. The method of any one of claims 1-11, wherein modifying execution of the one or more software applications of the computing device comprises providing data to at least one software application of the one or more software applications, wherein the data is encoded in at least one control signal of the one or more control signals.
13. The method of any one of claims 1-12, wherein the one or more software applications comprise a gaming-related software application.
14. The method of claim 13, wherein modifying execution of one or more software applications comprises:
initiating execution of the gaming-related software application on the computing device; and providing data to the gaming-related software application, wherein the data is encoded in the one or more control signals.
15. The method of either claim 13 or claim 14, wherein the content with the embedded audio signal comprises gaming-related content.
16. The method of any one of claims 13-15, wherein the gaming-related software application comprises a software application related to sports wagering, and wherein the one or more control signals encode information about one or more wagers capable of being made using the software application related to sports wagering.
17. The method of any one of claims 1-16, wherein the one or more software applications comprise a travel-related software application and/or a software application related to online purchasing.
18. The method of any of claims 1-17, wherein the computing device comprises a mobile computing device.
19. A computing device, comprising:
one or more processors; and data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the computing device to carry out functions comprising the method of any one of claims 1-18.
20. The computing device of claim 19, wherein the data storage comprises a non-transitory computer-readable medium.
21. A computer-readable medium configured to store instructions that, when executed by one or more processors of a computing device, cause the computing device to carry out functions comprising the method of any one of claims 1-18.
22. The computer-readable medium of claim 21, wherein the computer-readable medium comprises a non-transitory computer-readable medium.
23. A method, comprising:
receiving content with an embedded audio signal at a sound detector of a gaming device;
determining one or more control signals from the embedded audio signal at the gaming device; and generating an output of the gaming device that is based on the one or more control signals.
24. The method of claim 23, wherein the embedded audio signal comprises an audible audio signal.
25. The method of either claim 23 or claim 24, wherein the embedded audio signal comprises an inaudible audio signal.
26. The method of any one of claims 23-25, wherein receiving the content with the embedded audio signal comprises receiving the content with the embedded audio signal from at least one other device.
27. The method of claim 26, wherein the at least one other device comprises one or more of: an additional gaming device and a loudspeaker.
28. The method of claim 27, wherein the at least one other device comprises an additional gaming device, and wherein the embedded audio signal comprises an embedded audio signal related to a winning wager on the additional gaming device.
29. The method of either claim 27 or 28, wherein the gaming device comprises an audible output device, and wherein generating the output of the gaming device comprises generating an output related to the winning wager using the audible output device.
30. The method of any one of claims 27-29, wherein the embedded audio signal comprises an embedded audio signal related to an emergency condition, and wherein determining the one or more control signals comprises determining one or more control signals related to the emergency condition from the embedded audio signal related to the emergency condition.
31. The method of claim 30, wherein generating the output of the gaming device comprises generating a visual and/or audible output of the gaming device related to the emergency condition.
32. The method of either claim 30 or claim 31, wherein the gaming device comprises a ticket printer, and wherein generating the output of the gaming device comprises generating a ticket using the ticket printer based on the one or more control signals related to the emergency condition.
33. The method of any one of claims 23-32, wherein the one or more control signals comprise a broadcast counter related to broadcasting at least a portion of the embedded audio signal.
34. The method of claim 33, wherein generating the output of the gaming device comprises: determining whether the broadcast counter is above a threshold value;
after determining that the broadcast counter is above the threshold value, decrementing the broadcast counter;
determining one or more broadcast control signals based on the one or more control signals and the decremented broadcast counter;
generating a broadcast audio signal based on the broadcast control signals; and generating an output of the gaming device that comprises the broadcast audio signal.
35. The method of either claim 33 or claim 34, wherein generating the output of the gaming device comprises:
determining whether the broadcast counter is above a threshold value; and after determining that the broadcast counter is not above the threshold value, generating an output of the gaming device that does not include the broadcast audio signal.
36. A gaming device, comprising:
one or more processors; and data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the gaming device to carry out functions comprising the method of any one of claims 23-35.
37. The gaming device of claim 36, wherein the data storage comprises a non-transitory computer-readable medium.
38. A computer-readable medium configured to store instructions that, when executed by one or more processors of a gaming device, cause the gaming device to carry out functions comprising the method of any one of claims 23-35.
39. The computer-readable medium of claim 38, wherein the computer-readable medium comprises a non-transitory computer-readable medium.
40. A method, comprising:
receiving, at a computing device, content comprising an embedded audio signal;
determining one or more operations to be performed by the computing device, the one or more operations based on the embedded audio signal; and
modifying execution of the computing device by performing the one or more operations using the computing device.
41. The method of claim 40, wherein the embedded audio signal comprises an audible audio signal.
42. The method of either claim 40 or claim 41, wherein the embedded audio signal comprises an inaudible audio signal.
43. The method of any one of claims 40-42, wherein the one or more operations comprise one or more of:
initiating execution of a first software application of the computing device;
providing data to a second software application of the computing device, the data based on the embedded audio signal;
generating a visual and/or audible output of the computing device;
broadcasting the embedded audio signal; and
generating a ticket using a ticket printer of the computing device.
44. The method of claim 43, wherein generating the visual and/or audible output of the computing device comprises generating a visual and/or audible output related to an emergency condition.
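The operation-dispatch flow of claims 40-44 can be sketched as a minimal mapping from a decoded embedded-audio payload to device operations. All names here (`determine_operations`, the token format, the operation labels) are illustrative assumptions, not terms from the patent.

```python
def determine_operations(payload: str) -> list:
    """Map a decoded embedded-audio payload to operations (claims 40-44).

    The payload format (semicolon-separated tokens) is an assumption made
    for this sketch; the patent does not specify an encoding.
    """
    ops = []
    for token in payload.split(";"):
        if token == "launch":
            # Claim 43: initiate execution of a first software application.
            ops.append(("initiate_app", "app_one"))
        elif token.startswith("data:"):
            # Claim 43: provide data, based on the signal, to a second app.
            ops.append(("provide_data", token[len("data:"):]))
        elif token == "emergency":
            # Claim 44: visual/audible output related to an emergency.
            ops.append(("output", "emergency alert"))
    return ops
```

A receiving device would run each returned operation in turn, thereby "modifying execution of the computing device" in the sense of claim 40.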
45. A computing device, comprising:
one or more processors; and
data storage configured to store at least computer-readable program instructions that, when executed by the one or more processors, cause the computing device to carry out functions comprising the method of any one of claims 40-44.
46. The computing device of claim 45, wherein the data storage comprises a non-transitory computer-readable medium.
47. A computer-readable medium configured to store instructions that, when executed by one or more processors of a computing device, cause the computing device to carry out functions comprising the method of any one of claims 40-44.
48. The computer-readable medium of claim 47, wherein the computer-readable medium comprises a non-transitory computer-readable medium.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB1801765.7A GB2572529A (en) | 2018-02-02 | 2018-02-02 | Data over audio |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| GB201801765D0 GB201801765D0 (en) | 2018-03-21 |
| GB2572529A true GB2572529A (en) | 2019-10-09 |
Family
ID=61730986
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| GB1801765.7A Withdrawn GB2572529A (en) | 2018-02-02 | 2018-02-02 | Data over audio |
Country Status (1)
| Country | Link |
|---|---|
| GB (1) | GB2572529A (en) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6098106A (en) * | 1998-09-11 | 2000-08-01 | Digitalconvergence.Com Inc. | Method for controlling a computer with an audio signal |
| US6947893B1 (en) * | 1999-11-19 | 2005-09-20 | Nippon Telegraph & Telephone Corporation | Acoustic signal transmission with insertion signal for machine control |
| US20100160043A1 (en) * | 2006-01-12 | 2010-06-24 | Aruze Corp | Game machine |
| US20120130719A1 (en) * | 2000-02-16 | 2012-05-24 | Verance Corporation | Remote control signaling using audio watermarks |
| US20170075648A1 (en) * | 2015-09-16 | 2017-03-16 | Nuvoton Technology Corporation | Home appliance control system and control method thereof |
| US10169985B1 (en) * | 2015-11-24 | 2019-01-01 | CUE Audio, LLC | Method and system for coordinated control of multiple mobile communication devices |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20250157287A1 (en) | Method and system for transferring value for wagering using a portable electronic device | |
| US11861979B2 (en) | Gaming device docking station for authorized game play | |
| US20220406128A1 (en) | Biometric access data encryption | |
| US9991970B2 (en) | Transferring data via audio link | |
| US20100016052A1 (en) | Location-linked audio/video | |
| US20180322731A1 (en) | Location Detection for Portable Wagering Game Machines | |
| US20110191253A1 (en) | Use of mobile devices for communicating sound-based virtual transaction data | |
| CA3033566C (en) | Wagering system with a trigger symbol and player-adjustable layout and symbol group size | |
| US11386750B2 (en) | Linked communications for gaming systems using acoustic signatures | |
| US20190102967A1 (en) | Presence-detecting gaming systems for maintaining gaming sessions | |
| US20210110369A1 (en) | Information processing method, program, and terminal | |
| GB2572529A (en) | Data over audio | |
| JP7385855B1 (en) | Program, method, and information processing device | |
| US20220108298A1 (en) | Information processing method, program, and terminal | |
| JP2021069812A (en) | Server, portable terminal, prize content providing method, and prize content acquisition method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |