Detailed Description
While the description of the application is presented in connection with certain embodiments, it is not intended to limit the features of the application to only those embodiments. Rather, the description is presented in connection with the embodiments in order to cover the alternatives or modifications that may be extended from the claims of the application. The following description contains many specific details for the purpose of providing a thorough understanding of the present application; however, the application may be practiced without these specific details. Furthermore, some specific details are omitted from the description in order to avoid obscuring the application. It should be noted that, where no conflict arises, the embodiments of the present application and the features of the embodiments may be combined with each other.
The technical solutions of the embodiments of the present application will be clearly and thoroughly described below with reference to the accompanying drawings. In the description of the embodiments of the present application, unless otherwise indicated, "/" means "or"; for example, A/B may represent A or B. The term "and/or" merely describes an association relation between associated objects and indicates that three relations may exist; for example, A and/or B may indicate: A exists alone, A and B exist together, or B exists alone.
The terms "first," "second," and the like are used below for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined by "first" or "second" may explicitly or implicitly include one or more such features. In addition, in the description of embodiments of the application, "a plurality" means two or more.
Reference in the specification to "one embodiment" or "some embodiments" or the like means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," and the like in the specification are not necessarily all referring to the same embodiment, but mean "one or more but not all embodiments" unless expressly specified otherwise. The terms "comprising," "including," "having," and variations thereof mean "including but not limited to," unless expressly specified otherwise.
The application provides a wearing detection method applied to a head-mounted device comprising a microphone, a processor and a speaker. The head-mounted device may transmit a first ultrasonic wave through the speaker and receive a second ultrasonic wave through the microphone, the second ultrasonic wave being at least a portion of the first ultrasonic wave that reaches the microphone. The head-mounted device is configured to be in a first state when the difference between the amplitude of the first ultrasonic wave and the amplitude of the second ultrasonic wave is a first value, and is configured to be in a second state when the difference is a second value, wherein the first value is different from the second value.
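As a rough illustration of the decision rule just described, the following Python sketch classifies the wear state from the amplitude difference. The threshold, the direction of the comparison, and all names are illustrative assumptions made for demonstration; the application itself only requires that the first value and the second value differ, and which state corresponds to the larger difference depends on the device geometry.

```python
WORN = "first state (worn)"
UNWORN = "second state (unworn)"

def classify_wear_state(tx_amplitude: float, rx_amplitude: float,
                        threshold: float = 0.5) -> str:
    """Decide the wear state from the amplitude difference between the
    transmitted (first) and received (second) ultrasonic wave.

    Assumption for this sketch: the worn state yields the larger
    difference (e.g., the head absorbs part of the wave); a real device
    would calibrate the threshold and direction empirically.
    """
    difference = tx_amplitude - rx_amplitude
    return WORN if difference >= threshold else UNWORN

state = classify_wear_state(tx_amplitude=1.0, rx_amplitude=0.2)  # large difference
```

A real implementation would average the difference over several measurement frames before switching state, to avoid toggling on transient noise.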
In an embodiment of the present application, the first state may be a worn state, and the second state may be a non-worn state (also referred to as an unworn state). The power consumption of the head-mounted device in the first state is greater than the power consumption of the head-mounted device in the second state.
In some embodiments, the operating modes of the head-mounted device may include a first mode and a second mode, the first mode and the second mode being different. The power consumption of the head-mounted device in the first mode is greater than the power consumption of the head-mounted device in the second mode. For example, when the head-mounted device is in the second mode, the head-mounted device may stop running background programs (including background refreshing, downloading, etc.), pause playing audio, decrease the playback volume, disconnect communication with other electronic devices, and so on, as compared to the first mode. When the head-mounted device is configured to the first state, it operates in the first mode; when it is configured to the second state, it operates in the second mode.
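The state-to-mode mapping described above can be sketched as a small state machine. Class, method and attribute names, and the particular power-saving actions shown, are illustrative assumptions, not details taken from the application:

```python
class HeadMountedDevice:
    """Toy model of the first/second operating-mode switch described
    above. All names here are illustrative assumptions."""

    def __init__(self):
        self.mode = "second"            # low-power second mode by default
        self.audio_playing = False
        self.background_tasks_running = False

    def set_state(self, worn: bool):
        if worn:
            # Worn (first) state -> higher-power first mode.
            self.mode = "first"
        else:
            # Unworn (second) state -> lower-power second mode, with
            # example power-saving actions from the description above.
            self.mode = "second"
            self.audio_playing = False             # pause playing audio
            self.background_tasks_running = False  # stop background programs
```

For example, calling `set_state(worn=False)` while audio is playing both switches the device to the second mode and pauses playback in one step.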
In this way, by detecting whether the user is wearing the head-mounted device based on ultrasonic waves, the existing components of the head-mounted device can be used for wearing detection, which saves cost; and because the head-mounted device does not need additional components for wearing detection, the weight of the head-mounted device is not increased, reducing the pressure of the head-mounted device on the user's cervical vertebrae. Because the ultrasonic wave cannot be heard by the human ear and does not cause harm to the human body, the wearing condition of the head-mounted device can be detected imperceptibly, without the user noticing the ultrasonic wave. Meanwhile, when the wearing detection method provided by the application is applied to the head-mounted device, the operating mode of the head-mounted device can be controlled automatically and more intelligently based on the application scene. That is, the head-mounted device may automatically switch between the first mode and the second mode based on the detected wearing condition, so as to reduce the energy consumption of the head-mounted device. For example, when the head-mounted device is in the worn state, an audio file is played; when the head-mounted device detects that it has switched from the worn state to the unworn state, playing of the audio file may be automatically paused; and when the head-mounted device detects that it has switched from the unworn state back to the worn state, playing of the audio file may automatically continue.
Therefore, the audio file is prevented from continuing to play while the head-mounted device is in the unworn state, reducing the power consumption of the head-mounted device, and the part of the audio file that the user has not yet listened to can continue to play when the head-mounted device switches back to the worn state, improving the user experience. For another example, the head-mounted device displays an interface of an application when in the worn state, and may turn off the screen (also referred to as screen-off) when it detects that it has switched from the worn state to the unworn state. When the head-mounted device detects that it has switched from the unworn state back to the worn state, the screen can be lit again, and the content displayed before the screen was turned off can continue to be displayed.
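The pause-on-removal / resume-on-rewear behaviour in the audio example above can be sketched as follows. The class, attribute names, and the time base are illustrative assumptions:

```python
class AudioSession:
    """Sketch of pausing on doffing and resuming on donning, as
    described above. Names and time base are assumptions."""

    def __init__(self, track: str):
        self.track = track
        self.position_s = 0.0    # seconds of audio already played
        self.playing = False

    def on_wear_state_changed(self, worn: bool):
        if worn and not self.playing:
            self.playing = True      # resume from the saved position
        elif not worn and self.playing:
            self.playing = False     # pause; position_s is retained

session = AudioSession("song.mp3")
session.on_wear_state_changed(worn=True)    # user puts the glasses on
session.position_s = 42.0                   # some audio has been played
session.on_wear_state_changed(worn=False)   # user takes them off -> paused
session.on_wear_state_changed(worn=True)    # puts them on again -> resumes
```

The key design point is that the playback position survives the pause, so the user hears the part of the audio file they have not yet listened to.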
The head-mounted device according to the embodiments of the application may be glasses comprising a speaker, a microphone and a processor, for example smart glasses, and may be worn on the head of a user. In addition to the optical correction, visible light adjustment, decoration and other functions of ordinary glasses, the head-mounted device may have communication functions. For example, the head-mounted device may establish a communication connection with other electronic devices (e.g., a cell phone, a computer, etc.), which may be a wired or wireless connection. For example, the wireless connection may use a near field transmission technology such as a wireless fidelity (wireless fidelity, Wi-Fi) connection, a Bluetooth (bluetooth, BT) connection, etc. The wired connection may be a universal serial bus (universal serial bus, USB) connection, a high definition multimedia interface (high definition multimedia interface, HDMI) connection, or the like. The present embodiment does not limit the type of communication connection. The head-mounted device can perform data transmission with other electronic devices through these communication connections. For example, when a communication connection is established between the head-mounted device and a cell phone, and the cell phone is on a call with another device, the call can be answered through the head-mounted device. For another example, the head-mounted device may be fitted with a chip provided by a mobile operator (e.g., a subscriber identity module (subscriber identity module, SIM) card), through which calls are received, placed, and so on.
Not limited to smart glasses, the head-mounted device according to the embodiments of the present application may be another head-mounted device, for example, a head-mounted display device with augmented reality (augmented reality, AR), virtual reality (virtual reality, VR), or mixed reality (mixed reality, MR) technology, a smart helmet, a head-mounted earphone, etc., which is not limited by the embodiments of the present application.
In the embodiments of the present application, the head-mounted device may suspend/stop a task it is performing when it detects, based on the wearing detection method, that the user is not wearing the head-mounted device. For example, if the device detects during a call that the user has taken it off, the call may be suspended, and so on.
Referring to fig. 1, fig. 1 shows a schematic structural diagram of an electronic device 100. In the present embodiment, the electronic device 100 is exemplified by glasses including a microphone and a speaker.
As shown in fig. 1, the electronic device 100 may include a glasses body 101 and a microphone 106, a speaker 107, a processor (not shown), and the like provided on the glasses body.
The glasses body 101 may include, among other things, a temple 102, a glasses frame 103, a display device 104, and a nose pad 105. The display device 104 is embedded in the eyeglass frame 103.
The temples 102 are used to support the electronic device 100 when the user wears it on the head. In general, the eyeglass frame 103 includes two frames, the temple 102 includes two temples respectively disposed at the rear of the two frames, and the nose pad 105 is disposed between the two frames. When the user wears the electronic device 100, the two temples rest on the user's two ears, and the nose pad 105 rests on the user's nose.
The display device 104 is used for the user to view real-world objects and/or virtual pictures. The display device 104 may be a transparent lens or a lens of another color, may be an eyeglass lens with an optical correction function, may be a lens with an adjustable filter function, or may be sunglasses lenses or other lenses with a decorative effect. The display device 104 may also be a display screen or projection device that can generate optical signals and map them into the eyes of the user. The present embodiment does not limit the type of the display device 104. In some embodiments, the display device 104 may be absent, i.e., the glasses body 101 comprises only the temple 102, the eyeglass frame 103, and the nose pad 105.
In some embodiments, when the headset is AR glasses, the display device 104 includes both glasses lenses and a display screen or projection device. In other embodiments, when the headset is VR glasses, the display 104 is a display screen.
The microphone 106 is provided on the glasses body 101; for example, the microphone 106 may be provided on the temple 102 or the nose pad 105. The microphone 106 is used to collect sound signals, such as voice information of a user. The electronic device 100 may collect user voice information through the microphone 106 and parse the user voice information to generate a corresponding control instruction. Alternatively, the electronic device 100 may collect the voice information of the user through the microphone 106 and send it to other electronic devices for voice communication.
The speaker 107 is provided on the eyeglass body 101, and for example, the speaker 107 may be provided on the temple 102. The speaker 107 may be used to play audio.
A processor (not shown) may be used to parse signals or generate instructions, as well as process data, coordinate scheduling processes, and the like.
In the embodiments of the present application, the speaker 107 may be used to emit ultrasonic waves, and the microphone 106 may be used to collect the ultrasonic waves and send the collection result to the processor. The processor may determine whether the user is wearing the electronic device 100 based on the ultrasonic signals collected by the microphone 106.
It will be appreciated that the structure of the electronic device 100 described above is merely illustrative and not limiting of other embodiments of the application.
Fig. 2 is a schematic hardware structure of an electronic device 100 according to an embodiment of the present application.
Fig. 2 illustrates an example in which the electronic device 100 is smart glasses. The embodiment of the present application does not impose any limitation on the specific type of the electronic device 100. When the electronic device 100 is a wearable device such as VR/AR/MR glasses, a headset, etc., components may be added to or removed from the hardware architecture.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a USB interface 130, a charge management module 140, a power management module 141, a battery 142, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, a sensor module 180, a motor 191, an indicator 192, a camera 193, a display 194, a SIM card interface 196, and the like. The sensor module 180 may include a pressure sensor 180A, a touch sensor 180B, an inertial measurement unit 180C, and the like, among others.
It is to be understood that the structure illustrated in the present embodiment does not constitute a specific limitation on the electronic apparatus 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110, which is typically used to control the overall operation of the electronic device 100, may include one or more processing units. For example: the processor 110 may include a central processor (central processing unit, CPU), an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a video processing unit (video processing unit, VPU), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural-network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors. The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
The digital signal processor is used for processing digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, or the like.
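In the context of the wearing detection described earlier, such a Fourier transform at a chosen frequency point is one way the amplitude of the received ultrasonic wave could be measured. The following single-bin discrete Fourier transform is a sketch under stated assumptions: the frame length, sample rate, frequency, and absence of windowing/calibration are all illustrative choices, not details from the application.

```python
import math

def tone_amplitude(samples, sample_rate_hz, freq_hz):
    """Estimate the peak amplitude of one frequency component in a
    microphone frame via a single-bin discrete Fourier transform
    (a Goertzel-style measurement). Assumes freq_hz falls exactly on
    a DFT bin of the frame; otherwise a window would be needed."""
    n = len(samples)
    re = sum(s * math.cos(2 * math.pi * freq_hz * i / sample_rate_hz)
             for i, s in enumerate(samples))
    im = sum(-s * math.sin(2 * math.pi * freq_hz * i / sample_rate_hz)
             for i, s in enumerate(samples))
    return 2.0 * math.hypot(re, im) / n

# Example: a 20 kHz ultrasonic tone with peak amplitude 0.3,
# sampled at 48 kHz over a 480-sample frame (10 ms).
sr, f = 48_000, 20_000
frame = [0.3 * math.sin(2 * math.pi * f * i / sr) for i in range(480)]
amp = tone_amplitude(frame, sr, f)   # recovers the 0.3 peak amplitude
```

The measured amplitude of consecutive frames could then be compared with the transmitted amplitude to form the difference used in the state decision.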
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving picture experts group (moving picture experts group, MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, a serial peripheral interface (serial peripheral interface, SPI) interface, and the like.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (serial data line, SDA) and a serial clock line (serial clock line, SCL). In some embodiments, the processor 110 may contain multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180B, charger, flash, camera 193, etc., respectively, through different I2C bus interfaces. For example: the processor 110 may couple to the touch sensor 180B through an I2C interface, causing the processor 110 to communicate with the touch sensor 180B through an I2C bus interface, implementing the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, the processor 110 may contain multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 through an I2S interface.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled through a PCM bus interface. In some embodiments, the audio module 170 may also communicate audio signals to the wireless communication module 160 through a PCM interface. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
In some embodiments, a UART interface is typically used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through a UART interface, so as to implement a function of playing audio.
The MIPI interface may be used to connect the processor 110 to peripheral devices such as a display device 194, a camera 193, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the photographing functions of electronic device 100. The processor 110 and the display device 194 communicate via a DSI interface to implement the display functions of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display device 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. The interface may also be used to connect other electronic devices, such as cell phones, PCs, smart televisions, etc. The USB interface may be USB 3.0, compatible with high-speed display port (display port, DP) signal transmission, and may transmit video and audio data at high speed.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The charge management module 140 is configured to receive a charge input from a charger. The charger can be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 140 may receive a charging input of a wired charger through the USB interface 130. In some wireless charging embodiments, the charge management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 to power the processor 110, the internal memory 121, the display device 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be configured to monitor battery capacity, battery cycle number, battery health (leakage, impedance) and other parameters. In other embodiments, the power management module 141 may also be provided in the processor 110. In other embodiments, the power management module 141 and the charge management module 140 may be disposed in the same device.
The electronic device 100 may include wireless communication functionality, for example, the electronic device 100 may receive and play voice information from other electronic devices (e.g., a cell phone or cloud server). The wireless communication function may be implemented by an antenna (not shown), the mobile communication module 150 or the wireless communication module 160, a modem processor (not shown), a baseband processor (not shown), and the like.
The antenna is used for transmitting and receiving electromagnetic wave signals. Multiple antennas may be included in electronic device 100, each of which may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antennas may be multiplexed into diversity antennas of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication applied to the electronic device 100, including 2nd generation (2G), 3rd generation (3G), 4th generation (4G), and 5th generation (5G) networks, etc. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from an antenna, perform processing such as filtering and amplifying on the received electromagnetic waves, and transmit them to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor and convert it into electromagnetic waves for radiation through the antenna. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through audio devices (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or videos through the display device 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including wireless local area network (wireless local area networks, WLAN) (e.g., a Wi-Fi network), Bluetooth (bluetooth, BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR) technology, etc. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via an antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation on it, amplify it, and convert it into electromagnetic waves for radiation via the antenna. In some embodiments, the wireless communication module 160 may be disposed within the glasses body shown in fig. 1 and is used for transmitting and receiving communication signals, such as voice information, control signaling, and the like. The electronic device 100 may establish a communication connection with other electronic devices, such as a cell phone, a computer, etc., through the wireless communication module 160.
In some embodiments, the antenna of the electronic device 100 is coupled to the mobile communication module 150, the wireless communication module 160, such that the electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 may implement display functions through a GPU, a display device 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display device 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
In some embodiments, the display device 194 is used for the user to view real-world objects or virtual pictures.
The display device 194 may be a transparent lens or a lens of another color, may be an eyeglass lens with an optical correction function, may be a lens with an adjustable filter function, or may be sunglasses lenses or other lenses with a decorative effect.
The display device 194 may also be a display screen or projection device that may generate and map optical signals into the eyes of a user for displaying images, video, etc. Wherein the display 194 may be used to present one or more virtual objects, thereby causing the electronic device 100 to provide a virtual reality scene to a user.
The manner in which the display device 194 presents virtual objects may include one or more of the following:
1. in some embodiments, display device 194 may include a display screen, which may include a display panel. The display panel may be used to display physical objects and/or virtual objects, thereby presenting a stereoscopic virtual environment to the user. The user can see the virtual object on the display panel and experience the virtual reality scene. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like.
2. In some embodiments, display device 194 may include an optical projection device for projecting an optical signal (e.g., a light beam) directly onto the user's retina. Through one or more optical devices such as reflective mirrors, transmissive mirrors, or optical waveguides, the display device 194 can convert a real pixel image into a near-eye projected virtual image; the user can directly see the virtual object through the optical signal projected by the optical device and perceive a three-dimensional virtual environment, thereby realizing a virtual interaction experience, or an interaction experience combining the virtual and the real. In one example, the optical device may be a micro projector or the like.
The electronic device 100 may include 1 or N display devices 194, N being a positive integer greater than 1. In some embodiments, the number of display devices 194 in the electronic apparatus may be two, one for each eye of the user. The content displayed on the two display devices may be displayed independently. The two display devices can display parallax images to improve stereoscopic impression of the images. In some possible embodiments, the number of display devices 194 in the electronic device may also be one, with both eyes of the user viewing the same image.
The present embodiment does not limit the type of the display device 194. In some embodiments, the display device 194 may be absent, and the user may use other functions provided by the electronic device 100 that do not involve display. For example, some users wear smart glasses without lenses for decorative purposes, while still using other functions such as receiving/playing audio signals.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format.
In some embodiments, the camera 193 may be used in conjunction with infrared devices (e.g., infrared emitters) to detect eye movements of a user, such as eye gaze direction, blink operation, gaze operation, etc., to enable eye tracking.
In some embodiments, electronic device 100 may not include camera 193.
In some embodiments, the electronic device 100 may further include an eye-movement tracking module that may be used to track the movement of the human eye to determine the gaze point. For example, the pupil position can be located by image-processing techniques to obtain the pupil center coordinates, from which the gaze point of the person can be calculated.
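As an illustrative sketch of the pupil-locating step described above, the pupil center can be estimated as the centroid of the darkest pixels in an eye image. The dark-pixel threshold, the synthetic test image, and the function name are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def pupil_center(gray, dark_threshold=50):
    """Estimate the pupil center as the centroid of dark pixels.

    `dark_threshold` is an assumed tuning value: the pupil is typically
    the darkest region of a near-eye image.
    """
    ys, xs = np.nonzero(gray < dark_threshold)
    if xs.size == 0:
        return None  # no pupil-like region found
    return float(xs.mean()), float(ys.mean())  # (x, y) pupil center

# Synthetic 100x100 eye image: bright background, dark pupil disk at (40, 60)
img = np.full((100, 100), 200, dtype=np.uint8)
yy, xx = np.mgrid[0:100, 0:100]
img[(xx - 40) ** 2 + (yy - 60) ** 2 < 10 ** 2] = 10
cx, cy = pupil_center(img)
```

In a full gaze-tracking pipeline, the pupil center coordinates would then be mapped to a gaze point through a calibrated eye model.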
The internal memory 121 may be used to store computer executable program code that includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, etc.), and so on.
In some embodiments of the present application, internal memory 121 may be used to store an application program for one or more applications, the application program comprising instructions. The application, when executed by the processor 110, causes the electronic device 100 to generate content for presentation to a user. By way of example, the application may include an application for managing the electronic device 100, such as a gaming application, a conferencing application, a video application, a desktop application, or other application, and so forth.
In addition, the internal memory 121 may further include a high-speed random access memory, and may also include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (universal flash storage, UFS), and the like.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an application processor, and the like. Such as playing audio, collecting sound signals, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or a portion of the functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 100 may play music or conduct a hands-free call through the speaker 170A. The speaker 170A may be used to play sound-wave signals audible to the human ear, also referred to as audible sound-wave signals, with frequencies in the range of 20 Hz to 20,000 Hz.
The receiver 170B, also referred to as an "earpiece," is used to convert an audio electrical signal into a sound signal. When the electronic device 100 answers a telephone call or plays a voice message, the voice can be heard by placing the receiver 170B close to the human ear.
The microphone 170C, also referred to as a "mic" or "sound transmitter," is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can speak with the mouth close to the microphone 170C, inputting a sound signal into the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C, which may implement a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may also be provided with three, four, or more microphones 170C to implement sound signal collection, noise reduction, sound source identification, directional recording functions, and the like.
In some embodiments of the present application, the speaker 170A may be used to transmit ultrasonic waves, where ultrasonic waves are sound waves with frequencies exceeding 20,000 Hz that are inaudible to the human ear and cause no injury to the human body. The speaker 170A may play either an audible sound signal or an ultrasonic signal. Because the frequency ranges of the ultrasonic signal and the audible sound signal differ, the two can be played through the multiplexed speaker 170A without affecting each other, reducing the components required for wearing detection. The microphone 170C may be used to receive the ultrasonic waves transmitted by the speaker 170A. The audio module 170 and/or the processor 110 may process (e.g., Fourier transform) the ultrasonic signal received by the microphone 170C to obtain the amplitude of the received ultrasonic wave and determine whether the user is wearing the electronic device 100 based on that amplitude. For example, in the case where the speaker 170A and the microphone 170C are on the two temples 102 respectively, when the user wears the electronic device 100, the user's head blocks part of the ultrasonic wave transmitted by the speaker 170A, so the amplitude of the ultrasonic wave received by the microphone 170C decreases. When the user is not wearing the electronic device 100, there is no obstruction between the speaker 170A and the microphone 170C, and the attenuation of the amplitude of the received ultrasonic wave is negligibly small. In this way, the electronic device 100 can determine whether the user is wearing the electronic device 100 based on the amplitude of the ultrasonic signal transmitted by the speaker 170A and the amplitude of the ultrasonic signal collected by the microphone 170C.
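The Fourier-transform amplitude measurement described above can be sketched as follows. The sample rate, tone frequency, attenuation threshold, and function names are illustrative assumptions; the transform step is reduced to picking the FFT bin nearest the detection tone:

```python
import numpy as np

FS = 96_000           # sample rate in Hz (assumed; must exceed twice the tone)
TONE_HZ = 21_000      # ultrasonic detection tone, per the 21000 Hz example
WEAR_RATIO = 0.5      # assumed attenuation ratio below which the head is deemed to block the path

def tone_amplitude(samples, fs=FS, tone_hz=TONE_HZ):
    """Amplitude of the detection tone in the microphone signal via FFT."""
    spectrum = np.abs(np.fft.rfft(samples)) / (len(samples) / 2)
    freqs = np.fft.rfftfreq(len(samples), d=1 / fs)
    return spectrum[np.argmin(np.abs(freqs - tone_hz))]

def is_worn(mic_samples, tx_amplitude):
    """Worn if the received tone is strongly attenuated relative to the transmitted one."""
    return tone_amplitude(mic_samples) < WEAR_RATIO * tx_amplitude

t = np.arange(4096) / FS
unblocked = 1.0 * np.sin(2 * np.pi * TONE_HZ * t)  # unworn: little attenuation
blocked = 0.1 * np.sin(2 * np.pi * TONE_HZ * t)    # worn: head absorbs most energy
```

In practice the comparison would run on short microphone frames captured while the speaker plays the detection tone.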
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming-call vibration alerts as well as for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. Touch operations on different areas of the electronic device 100 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 192 may be an indicator light, which may be used to indicate a state of charge, a change in charge, a message, a notification, etc.
The electronic device 100 may also include other input/output interfaces through which other apparatuses may be connected to the electronic device 100. These interfaces may include, for example, audio/video jacks, data connectors, and the like.
In some embodiments, the electronic device 100 may also include one or more keys that control the electronic device and provide the user with access to functions on the electronic device 100. The keys may take the form of buttons, switches, dials, etc., or may be touch or near-touch sensing devices (e.g., touch sensors). The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100. The keys may include a power key, volume keys, etc.
The electronic device 100 is equipped with one or more sensors including, but not limited to, a pressure sensor 180A, a touch sensor 180B, an Inertial Measurement Unit (IMU) 180C, a bone conduction sensor, and the like.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. There are various types of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. When a touch operation acts on the electronic device 100, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the location of the touch based on the detection signal of the pressure sensor 180A. In some embodiments, touch operations that act on the same touch location but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is smaller than a first pressure threshold acts on the pressure sensor 180A, an instruction to pause audio is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the pressure sensor 180A, an instruction to turn off audio is executed. In some embodiments, touch operations that act on the same touch location but for different lengths of time may correspond to different operation instructions. For example: when a touch operation whose duration is smaller than a first time threshold acts on the pressure sensor 180A, a confirmation instruction is executed; when a touch operation whose duration is greater than or equal to the first time threshold acts on the pressure sensor 180A, a power on/off instruction is executed.
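The intensity- and duration-dependent dispatch described above reduces to two threshold comparisons. A minimal sketch; the threshold values, their units, and the instruction names are assumptions for illustration:

```python
def instruction_by_intensity(intensity, first_pressure_threshold=5.0):
    """Pressure-based dispatch: below the first pressure threshold the touch
    pauses audio; at or above it, audio is turned off."""
    return "pause_audio" if intensity < first_pressure_threshold else "audio_off"

def instruction_by_duration(duration_s, first_time_threshold=1.5):
    """Duration-based dispatch: a short press confirms; a long press
    triggers power on/off."""
    return "confirm" if duration_s < first_time_threshold else "power_on_off"
```

A real device would combine these with the touch location reported by the sensor before selecting the instruction.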
The touch sensor 180B is also referred to as a "touch device." The touch sensor 180B is used to detect a touch operation acting on or near it. The touch sensor 180B may pass the detected touch operation to the application processor to determine the touch event type. The electronic device 100 may provide visual output related to the touch operation through the display device 194. The electronic device 100 may also send an instruction corresponding to the touch operation to another electronic device with which a communication connection has been established.
The inertial measurement unit 180C is a sensor used to detect and measure acceleration and rotational motion and may include an accelerometer, an angular velocity meter (or gyroscope), and the like. The accelerometer may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device 100 and can be applied to somatosensory game scenes, landscape/portrait switching, pedometers, and the like. The gyroscope may be used to determine the motion posture of the electronic device 100. In some embodiments, the angular velocities of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyroscope. The gyroscope can also be used for navigation, somatosensory game scenes, camera anti-shake, and the like. For example, the electronic device 100 may track the movement of the user's head through the IMU or similar sensors.
In some embodiments of the present application, the inertial measurement unit 180C may be used to detect whether the electronic device 100 is moving. For example, the electronic device 100 may detect, through the IMU or similar sensors, an operation of the user putting on the electronic device 100. When the wearing operation is detected by a sensor such as the IMU, the electronic device 100 may transmit ultrasonic waves through the speaker 170A, receive ultrasonic waves through the microphone 170C, and perform wearing detection based on the received ultrasonic waves, that is, determine whether the user is wearing the electronic device 100.
In some embodiments of the present application, the sensor module 180 may also include a capacitive sensor. The capacitive sensor may be used to convert the detected non-electrical quantity into an electrical quantity. For example, a capacitive sensor may be disposed on the inside of the temple 102, with a change in capacitance value occurring when the capacitive sensor detects that a user is approaching or touching the capacitive sensor. The electronic device 100 may detect an operation of wearing the electronic device 100 by a user based on the capacitive sensor. The electronic apparatus 100 may transmit ultrasonic waves through the speaker 170A, receive ultrasonic waves through the microphone 170C, and perform wearing detection based on the received ultrasonic waves when the wearing operation is detected by the capacitive sensor.
In some embodiments, the sensor module 180 may also include a bone conduction sensor that may acquire vibration signals. For example, the bone conduction sensor may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor may be disposed in the electronic device 100, and the audio module 170 may parse out a voice signal based on the vibration signal of the vocal part's vibrating bone mass obtained by the bone conduction sensor, thereby implementing a voice function. The bone conduction sensor may also function as an audio playback device for outputting sound to the user. When the audio playback device is a bone conduction sensor, the two temples 102 may each be provided with a holding portion, and the bone conduction sensor may be disposed at the holding portion. When the user wears the electronic device 100, the holding portion presses against the skull at the front side of the ear and generates vibration, so that sound waves are conducted to the inner ear via the skull and the bony labyrinth. Because the holding portion is directly close to the skull, vibration loss can be reduced, and the user can hear the audio more clearly.
The SIM card interface 196 is used to connect a SIM card. A SIM card may be inserted into the SIM card interface 196 or removed from it to achieve contact with and separation from the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 196 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 196 simultaneously; the types of the cards may be the same or different. The SIM card interface 196 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The wearing detection method provided by the embodiment of the application is described next in connection with an application scene.
In some application scenarios, the microphone and the speaker of the electronic device 100 are located on different components of the eyeglass body of the electronic device 100. After the electronic device 100 is turned on, the speaker may transmit ultrasonic waves, and the microphone may receive the ultrasonic waves transmitted by the speaker. Since the amplitude of the ultrasonic wave received by the electronic device 100 in the worn state differs from that received in the unworn state, the electronic device 100 can determine whether the user is wearing it based on the amplitude of the received ultrasonic wave.
For example, the microphone and the speaker of the electronic device 100 may be located on the user-facing sides of the two temples 102 shown in fig. 1, respectively; for example, the microphone may be located on the left temple 102 and the speaker on the right temple 102. When the electronic device 100 is in the unworn state, as shown in (a) of fig. 3, there is no obstruction between the microphone and the speaker; the ultrasonic waves transmitted by the speaker are not blocked, and the microphone can receive most of the ultrasonic waves transmitted by the speaker. At this time, the amplitude of the ultrasonic wave received by the microphone may be A1. When the electronic device 100 is in the worn state, as shown in (b) of fig. 3, the user's head lies between the microphone and the speaker, and the ultrasonic waves transmitted by the speaker are blocked by the user's head; that is, part of the ultrasonic waves transmitted by the speaker are reflected by the user's head and cannot be received by the microphone, so the microphone can receive only a small portion of the ultrasonic waves transmitted by the speaker. At this time, the amplitude of the ultrasonic wave received by the microphone may be A2, where A1 is greater than A2.
Therefore, when the microphone and the speaker of the electronic device 100 are located on different temples, the amplitude of the ultrasonic wave received by the microphone when the electronic device 100 is in the worn state is smaller than the amplitude received when the electronic device 100 is in the unworn state. In this way, the electronic device 100 may determine whether the user is wearing the electronic device 100 based on the amplitude of the ultrasonic signal.
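A minimal sketch of this decision rule, assuming the device calibrates a threshold between the unworn amplitude A1 and the worn amplitude A2. The midpoint choice and the function names are assumptions; the application only requires that A1 be greater than A2:

```python
def wear_threshold(a1_unworn, a2_worn):
    """Place the decision threshold midway between the calibrated amplitudes."""
    return (a1_unworn + a2_worn) / 2

def state_from_amplitude(received_amplitude, threshold):
    """Below the threshold, the head is assumed to block the path: worn."""
    return "worn" if received_amplitude < threshold else "unworn"

threshold = wear_threshold(1.0, 0.2)  # A1 = 1.0, A2 = 0.2 (example values)
```

Any threshold strictly between A2 and A1 separates the two states; the midpoint simply maximizes the margin on both sides.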
Next, a flowchart of a wear detection method provided in an embodiment of the present application is described.
In one possible implementation, the microphone and the speaker of the electronic device 100 are located on different components of the electronic device 100. When the electronic device 100 is in the worn state, there is an obstruction between the microphone and the speaker; when the electronic device 100 is in the unworn state, there is no obstruction between them. The electronic device 100 may transmit ultrasonic waves through the speaker, receive ultrasonic waves through the microphone, and determine whether it is in the worn state or the unworn state based on the received ultrasonic waves. The electronic device 100 may operate in a first mode when in the worn state and in a second mode when in the unworn state. In this way, the existing speaker and microphone of the electronic device 100 are reused for wearing detection, which saves manufacturing cost; because no additional components are added for wearing detection, the weight of the electronic device 100 is not increased, reducing the pressure on the user's cervical vertebrae. Because ultrasonic waves cannot be heard by the human ear and cause no injury to the human body, the wearing state of the electronic device 100 can be detected without the user perceiving it. Meanwhile, the electronic device 100 can better adapt to the application scene and control its working mode more intelligently and automatically.
In some embodiments, when in the worn state, the electronic device 100 may operate in the first mode; in the first mode, the electronic device 100 may play audio normally, establish communication connections with other electronic devices for exchanging data, and so on, for use by the user. When the electronic device 100 operates in the second mode, it may pause playing audio, disconnect communication connections with other electronic devices, and so on, to reduce the power consumption of the electronic device 100.
Illustratively, as shown in FIG. 4, the wear detection method includes the following steps:
S401, starting the electronic device 100.
In some embodiments, after electronic device 100 receives an input from a user to turn on electronic device 100 (e.g., an input to a power-on key), electronic device 100 boots up in response to the input. After the electronic device 100 is started, step S402 may be performed.
It should be noted that starting the electronic device 100 is not limited to the case where the electronic device 100 receives an input from the user to turn it on. When the electronic device 100 switches from a standby mode or a sleep mode to an operating mode, this may also be understood as the electronic device 100 being started. For example, when the electronic device 100 detects that its temples switch from the folded configuration (i.e., the two temples of the electronic device 100 are attached to each other) to the unfolded configuration (i.e., the two temples of the electronic device 100 are separated from each other, as shown in fig. 1), the electronic device 100 is started. The electronic device 100 may determine that the temples have switched to the unfolded configuration by detecting the movement of the temples through a sensor such as the IMU. For another example, the electronic device 100 may be started when it detects a change in its position through a sensor such as the IMU.
In some embodiments, a proximity sensor (e.g., a proximity light sensor, a capacitive sensor, an infrared sensor, an IMU sensor, etc.) is provided on the side of the eyeglass body of the electronic device 100 that faces the user. When the electronic device 100 determines through the proximity sensor that the user is near, the electronic device 100 is started.
In some embodiments, the electronic device 100 is coupled to a charging dock, the electronic device 100 is disposed within the charging dock, and the electronic device 100 is activated when the electronic device 100 is disengaged from the charging dock.
S402, sending ultrasonic waves by the speaker of the electronic device 100.
The speaker of the electronic device 100 may transmit an ultrasonic signal. The ultrasonic signal transmitted by the electronic device 100 may be referred to as a detection signal, a target ultrasonic signal, or the like. The electronic device 100 may obtain the detection signal in the following ways.
1. The electronic device 100 stores M kinds of detection signals, where M is a positive integer. The electronic device 100 may transmit one of the M detection signals.
The waveforms of the M detection signals are different from one another. A waveform difference may be understood as a difference in one or more of frequency, duty cycle, and the like. Specifically, when a detection signal contains a waveform of only one frequency, two detection signals differ if their frequencies differ. For example, if one detection signal has frequency f1 and the other has frequency f2, and f1 and f2 are different, the two detection signals are different. When a detection signal contains waveforms of multiple frequencies, a frequency difference between two detection signals may be understood as a difference in their frequency composition and/or arrangement. For example, suppose a first detection signal consists of a waveform of frequency f1 followed by a waveform of frequency f2, a second detection signal consists of a waveform of frequency f2 followed by a waveform of frequency f1, and a third detection signal consists of a waveform of frequency f1 followed by a waveform of frequency f3. The first and second detection signals both include waveforms of frequencies f1 and f2 but in different orders, so the two detection signals are different. The first and third detection signals differ in composition: the first includes a waveform of frequency f2 but not f3, while the third includes a waveform of frequency f3 but not f2, so the two detection signals are different.
The duty cycle is the ratio, within one period of the waveform, of the duration in which the amplitude is zero to the duration of the whole period. The duty cycle may be represented by a percentage or a fraction, with a value range between 0 and 1.
Specifically, when a detection signal contains a waveform of only one duty cycle, two detection signals differ if their duty cycle values differ. For example, if one detection signal has duty cycle w1 and the other has duty cycle w2, and w1 and w2 are different, the two detection signals are different. When a detection signal contains waveforms of multiple duty cycles, a duty cycle difference between two detection signals may be understood as a difference in their duty cycle composition and/or arrangement.
In some embodiments, two waveforms may also differ in the position of the zero-amplitude segment within one period. Specifically, when the duty cycle of two detection signals is 25%, if the zero-amplitude segment of the first detection signal occupies the first T/4 of one period while that of the second detection signal occupies the last T/4, the two detection signals are different.
In other embodiments, to make the detection signals easier to distinguish, the zero-amplitude segment always appears at the beginning or the end of each period of the waveform.
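Under the definitions above, a detection signal with a given frequency, duty cycle, and period can be synthesized as follows. The sample rate and the placement of the silent segment at the start of each period are assumptions, consistent with the "beginning or end" convention just described:

```python
import numpy as np

FS = 96_000  # sample rate in Hz (assumed)

def detection_signal(freq_hz, duty_zero, period_s, n_periods=4, fs=FS):
    """Build a detection waveform per the definition above: `duty_zero` is the
    fraction of each period whose amplitude is zero (0% = continuous tone).
    The silent segment is placed at the start of each period."""
    n = int(period_s * fs)
    t = np.arange(n) / fs
    one_period = np.sin(2 * np.pi * freq_hz * t)
    one_period[: int(duty_zero * n)] = 0.0  # zero-amplitude segment
    return np.tile(one_period, n_periods)

# Example: 21 kHz tone, 25% duty cycle, 10 ms period, four periods
sig = detection_signal(21_000, 0.25, 0.01)
```

A signal with duty cycle 0% would come out as an uninterrupted tone, matching the c1 and c2 examples below fig. 5.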
Alternatively, two detection signals may differ in amplitude.
Illustratively, the electronic device 100 may identify the M detection signals as cm, m ∈ {1, 2, …, M}. For example, the waveforms of the M detection signals may be as shown in the example of fig. 5. The detection signal c1 has a duty cycle of 0% and a period of T1. The detection signal c2 has a duty cycle of 0% and a period of T2. The detection signal cM-1 has a duty cycle of 25% and a period of T1. The detection signal cM has a duty cycle of 25% and a period of T2. The detection signals c1 and c2 differ in period (i.e., frequency); c1 and cM-1 differ in duty cycle; c1 and cM differ in both duty cycle and period. That is, in the set of detection signals {c1, …, cM}, any two signals differ in duty cycle and/or frequency.
In some embodiments, to facilitate distinguishing the detection signals, all M detection signals have the same amplitude and duty cycle but different frequencies.
In some embodiments, the electronic device 100 stores the frequencies and amplitudes of the M detection signals, and the electronic device 100 may generate the ultrasonic signal played by the speaker based on the stored frequency and amplitude. For example, the electronic device 100 may store the ultrasonic signals in the form of a list; a detection signal c1 may be stored as {c1, 21000, 100}, where c1 is the identifier of the ultrasonic signal, 21000 is its frequency (which may be in Hz), and 100 is its amplitude (which may be in decibels (dB)). In this way, the electronic device 100 can determine the frequency and amplitude of the ultrasonic signal, i.e., its waveform, based on the list. Similarly, if the electronic device 100 stores the frequencies, duty cycles, and amplitudes of the M detection signals, the electronic device 100 may generate the ultrasonic signal played by the speaker based on the frequency, duty cycle, and amplitude.
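The {identifier, frequency, amplitude} list form in the example above might be represented as follows; the field names and the extra entries c2 and c3 are hypothetical additions for illustration:

```python
# Stored detection-signal list: {identifier, frequency (Hz), amplitude (dB)}
DETECTION_SIGNALS = [
    {"id": "c1", "freq_hz": 21000, "amp_db": 100},
    {"id": "c2", "freq_hz": 22000, "amp_db": 100},  # hypothetical entry
    {"id": "c3", "freq_hz": 23000, "amp_db": 100},  # hypothetical entry
]

def waveform_params(signal_id):
    """Look up the frequency and amplitude needed to synthesize a detection signal."""
    for entry in DETECTION_SIGNALS:
        if entry["id"] == signal_id:
            return entry["freq_hz"], entry["amp_db"]
    raise KeyError(signal_id)
```

Before playback, the dB amplitude would be converted to a linear drive level for the speaker; that conversion is device-specific and omitted here.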
In other embodiments, the electronic device 100 stores digital audio information corresponding to waveforms of the M detection signals. The electronic device 100 may obtain an analog audio signal corresponding to the digital audio signal, i.e., an ultrasonic signal, based on the stored digital audio information.
When the electronic device 100 transmits one of the M detection signals, it may select one at random; for example, the electronic device 100 may transmit ci, where 1 ≤ i ≤ M. Alternatively, the electronic device 100 may transmit the detection signals one at a time in the order in which they are stored. For example, the electronic device 100 may transmit detection signal c1 the first time step S402 is performed, transmit detection signal c2 the second time step S402 is performed, and so on.
Further, to avoid interference from other electronic devices, the electronic device 100 may, before transmitting a detection signal, receive nearby sound-wave signals through the microphone, screen the stored detection signals for one whose waveform differs from the nearby sound-wave signals, and transmit that detection signal.
By way of example, the specific steps by which the electronic device 100 determines the detection signal are described below, using frequency as an example. Specifically, as shown in fig. 6:
S601, the electronic device 100 receives nearby sound-wave signals.
The electronic device 100 may receive nearby acoustic signals through a microphone.
S602. The electronic device 100 determines whether the nearby acoustic wave signal includes N of the stored M detection signals, where 0 < N < M.
The electronic device 100 may determine whether the nearby acoustic wave signal includes N of the M detection signals based on the frequency of the received acoustic wave signal. Specifically, the electronic device 100 may compare the frequencies of the nearby acoustic signals with the frequencies of the M detection signals one by one, so as to determine the N signals among the M detection signals whose frequencies are the same as those of the nearby acoustic signals, where 0 < N < M. These N signals among the M detection signals having the same frequency as the nearby acoustic signals are the N signals included in the nearby acoustic signals.
When the difference between the frequency of the nearby acoustic wave signal and the frequency of the detection signal is smaller than a first frequency difference (for example, 30 Hz), the nearby acoustic wave signal and the detection signal can be regarded as the same.
When the electronic device 100 determines that N of the stored M detection signals are included in the nearby ultrasonic signals, step S603 may be performed. When the electronic device 100 determines that none of the stored M detection signals is included in the nearby ultrasonic signals, step S604 may be performed.
In some embodiments, the nearby acoustic signal and the detection signal are considered identical when their frequencies and amplitudes are identical. The frequency of the nearby acoustic wave signal and the frequency of the detection signal are the same when their difference is smaller than the first frequency difference. The amplitude of the nearby acoustic wave signal and the amplitude of the detection signal are the same when their difference is smaller than a preset amplitude difference (e.g., 10 dB).
S603. The electronic device 100 transmits any one of the M detection signals other than the detected N detection signals.
The electronic device 100 may exclude the N detection signals identical to the nearby acoustic signals from the M detection signals, and select any one of the remaining detection signals to transmit.
S604. The electronic device 100 transmits any one of the M detection signals.
The electronic device 100 may randomly transmit any one of the detection signals.
Alternatively, the electronic device 100 may acquire the frequencies of the nearby acoustic wave signals, and transmit a detection signal among the M detection signals whose frequency differs from those of the nearby acoustic wave signals.
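The screening flow of steps S601-S604 can be sketched as follows, using the frequency-matching rule above (the function and parameter names are hypothetical):

```python
def select_detection_signal(stored_freqs_hz, nearby_freqs_hz, first_freq_diff_hz=30):
    """S601-S604: pick a stored detection frequency that does not collide with
    any nearby acoustic signal. Two signals are treated as the same when their
    frequencies differ by less than first_freq_diff_hz (e.g., 30 Hz)."""
    def collides(freq):
        return any(abs(freq - g) < first_freq_diff_hz for g in nearby_freqs_hz)

    # S603: exclude the N colliding signals; S604: otherwise any signal will do.
    candidates = [f for f in stored_freqs_hz if not collides(f)]
    return candidates[0] if candidates else None
```

Here the first surviving candidate is returned; a random choice among the candidates would match S603/S604 equally well.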
In some embodiments, the M detection signals have the same amplitude and different frequencies.
2. The electronic device 100 may randomly generate the detection signal.
In some embodiments, the electronic device 100 may randomly generate the values of the frequency and the amplitude of the detection signal, and generate the corresponding detection signal based on the frequency value and the amplitude value. Wherein the value of the frequency of the detection signal is within a specified frequency range (e.g., 20000Hz-24000 Hz) and the value of the amplitude of the detection signal is within a specified amplitude range (e.g., 80dB-120 dB).
Further, in order to avoid interference from other electronic devices, the electronic device 100 may acquire a nearby acoustic signal through the microphone and determine its frequency before generating the detection signal, and then generate a detection signal having a frequency different from that of the nearby acoustic wave signal.
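A hedged sketch of this random generation, using the example ranges above; rejection sampling is an assumption about how frequency collisions might be avoided:

```python
import random

def generate_detection_signal(nearby_freqs_hz, first_freq_diff_hz=30,
                              freq_range_hz=(20000, 24000),
                              amp_range_db=(80, 120), max_tries=100):
    """Randomly draw a (frequency, amplitude) pair inside the specified ranges,
    rejecting frequencies too close to any nearby acoustic signal."""
    for _ in range(max_tries):
        freq = random.uniform(*freq_range_hz)
        if all(abs(freq - g) >= first_freq_diff_hz for g in nearby_freqs_hz):
            return freq, random.uniform(*amp_range_db)
    return None  # no sufficiently distinct frequency found
```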
In one possible implementation, to better distinguish the detection signals transmitted by the electronic device 100, the electronic device 100 may transmit a prefix signal before transmitting the detection signal. Based on the prefix signal, the electronic device 100 may determine that the received signal belongs to an ultrasonic signal transmitted by the electronic device 100 itself, i.e., identify the transmitted detection signal, and the electronic device 100 may then determine, based on the prefix signal and the detection signal, whether the user wears the electronic device 100. The prefix signal and the detection signal may be the same or different. When the prefix signal and the detection signal are identical, a blank time interval may exist between them. The length of the blank time interval is a fixed value, for example, 2 ms.
By way of example, fig. 7 shows an example image of a prefix signal. As shown in fig. 7 (a), the prefix signal and the detection signal are identical, and a blank time interval exists between the prefix signal and the detection signal. As shown in fig. 7 (b), the prefix signal and the detection signal are different, and the prefix signal and the detection signal are connected. As shown in fig. 7 (c), the prefix signal and the detection signal are different, and a blank time interval exists between the prefix signal and the detection signal. In this way, the electronic device 100 may determine, based on the waveform of the prefix signal, whether the prefix signal and the detection signal are signals for detecting whether the user wears the electronic device 100, and determine, based on the determined prefix signal and the detection signal, whether the user wears the electronic device 100, so as to avoid interference of the nearby acoustic wave signal on the detection result.
Alternatively, the electronic device 100 may determine the detection signal based on the prefix signal, and then perform wear detection based on the detection signal.
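One way the prefix matching could work, sketched over idealized sample streams (exact-sample comparison and expressing the blank interval in samples are simplifying assumptions):

```python
def find_detection_signal(received, prefix, blank_samples=2):
    """Locate the known prefix waveform in the received sample stream and
    return the samples that follow it after a fixed blank interval; those
    samples are taken to be the device's own detection signal."""
    n = len(prefix)
    for i in range(len(received) - n + 1):
        if received[i:i + n] == prefix:
            start = i + n + blank_samples  # skip the fixed blank interval
            return received[start:]
    return None  # prefix not found: the sound did not come from this device
```

A real implementation would correlate waveforms rather than compare samples exactly, but the control flow (find prefix, skip blank, read detection signal) is the same.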
Alternatively, the detection signal sent by the electronic device 100 may be obtained by splicing multiple ultrasonic signals with different frequencies. In this way, the electronic device 100 can jointly detect whether the user wears the electronic device 100 through the multiple segments of ultrasonic signals with different frequencies, so as to further ensure the correctness of the detection result.
In one possible implementation, the electronic device 100 may compose the detection signal from a variety of different waveforms. For example, when the electronic device 100 includes two waveforms, waveform A and waveform B, the electronic device 100 may determine the detection signal by an ordered combination of waveform A and waveform B. For example, when a detection signal is composed of 4 segments of waveforms, namely waveform A, waveform A, waveform B, and waveform A in that order, the detection signal may be expressed as AABA. It will be appreciated that when the electronic device 100 includes only two waveforms, the two waveforms may be identified by binary digits for ease of understanding. For example, waveform A may be identified by the binary digit 0 and waveform B by the binary digit 1; the detection signal AABA may then be represented as 0010. In this way, the M detection signals stored in the electronic device 100 can be represented as M stored character strings, together with the waveform corresponding to each character in the strings.
Alternatively, the electronic device 100 may store only N characters and N waveforms corresponding to the N characters. The electronic device 100 may randomly generate a character string composed of any character of the N characters, resulting in a detection signal.
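The string encoding above can be sketched as follows; the waveform sample values are illustrative placeholders:

```python
import random

# Two base waveforms identified by the binary characters '0' (waveform A)
# and '1' (waveform B); the sample values are placeholders.
WAVEFORMS = {
    "0": [0.0, 1.0, 0.0, -1.0],   # waveform A
    "1": [0.0, 0.5, 0.0, -0.5],   # waveform B
}

def signal_from_string(code):
    """Splice the waveform segments named by each character, so the string
    'AABA' -> '0010' becomes a concrete sample sequence."""
    samples = []
    for ch in code:
        samples.extend(WAVEFORMS[ch])
    return samples

def random_detection_code(length=4):
    """Randomly generate a detection-signal string from the stored characters."""
    return "".join(random.choice(sorted(WAVEFORMS)) for _ in range(length))
```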
S403. The microphone of the electronic device 100 receives the ultrasonic wave.
When the electronic device 100 transmits the detection signal through the speaker, it may receive, through the microphone, a nearby acoustic wave signal including the detection signal transmitted by the speaker.
S404. The electronic device 100 determines whether the amplitude of the received ultrasonic wave exceeds a first threshold.
The electronic device 100 may perform fourier transform processing on the received detection signal to obtain an amplitude value of the received detection signal. When the amplitude of the detection signal is greater than the first threshold, the electronic device 100 may perform step S406. When the amplitude of the detection signal is less than or equal to the first threshold, the electronic device 100 may perform step S405. Wherein the value of the first threshold may be a product of the amplitude value of the transmitted detection signal and the first coefficient. The first coefficient may be a fraction value greater than 0 and less than or equal to 1. In some embodiments, the first coefficient may be a percentage between 50% and 80%. In some embodiments, the value of the first threshold is a fixed value that may be set by the manufacturer of the electronic device 100.
In some embodiments, the electronic device 100 may, after receiving the nearby analog audio signal, convert the analog audio signal into a digital audio signal, and perform fourier transform processing on the digital audio signal to obtain a correspondence between the frequency and the amplitude of the nearby audio signal. The electronic device 100 may determine the received detection signal based on the frequency of the transmitted detection signal. And determining the amplitude of the received detection signal based on the corresponding relation between the frequency and the amplitude of the nearby audio signal. The electronic device 100 may determine whether the user is wearing the electronic device 100 based on whether the value of the amplitude is greater than a first threshold. Here, the sound wave signal having the same frequency as the transmitted detection signal among the sound wave signals in the vicinity received by the electronic device 100 is the detection signal. In some embodiments, the frequencies of the two signals are the same when the difference in frequencies of the two signals is less than the first frequency difference.
For example, when the detection signal sent by the electronic device 100 is x(t), the sampled received detection signal may be represented as x[n], where the formula for performing the discrete fourier transform on x[n] is:

X[k] = Σ_{n=0}^{N-1} x[n]·e^(-j2πkn/N)    (1)

wherein X[k] is the amplitude value of the detection signal at the frequency k, x[n] is the amplitude value of the nth sampling point of the detection signal, N is the total number of sampling points, and j is the imaginary unit.
In some embodiments, the amplitude of the detection signal is between 20dB-150dB, for example, may be 100dB. In some embodiments, the frequency of the detection signal is between 20000Hz-40000Hz, for example, it may be 20000Hz.
For example, when the frequency of the detection signal transmitted by the speaker of the electronic device 100 is 20000Hz and the amplitude is 100dB, the microphone of the electronic device 100 may collect the detection signal at a 160 kHz sampling frequency when there is no obstacle between the microphone and the speaker of the electronic device 100. The time domain images of the detection signal sent by the speaker and the detection signal collected by the microphone within 0.2 ms may be as shown in fig. 8A. The waveform of the detection signal transmitted by the electronic device 100 within 0.2 ms may be as shown in (a) of fig. 8A. The electronic device 100 samples the detection signal shown in (a) of fig. 8A within 0.2 ms, resulting in 32 sampling points that constitute a discrete waveform as shown in (b) of fig. 8A. Based on the above equation (1) for the discrete fourier transform, the equation for the fourier transform of the discrete waveform obtained by the microphone of the electronic device 100 is as follows:

X[k] = Σ_{n=0}^{31} x[n]·e^(-j2πkn/32)    (2)

wherein the total number of sampling points N is 32. From this equation (2), a frequency domain image of the detection signal can be obtained. The frequency domain image of the detection signal received by the electronic device 100 may be as shown in fig. 8B. From the frequency domain image of the detection signal shown in fig. 8B, it can be determined that the frequency of the detection signal collected by the microphone of the electronic device 100 is 20000Hz, and the amplitude is 100dB.
Next, images of the acoustic signals collected by the microphone of the electronic device 100 in some practical application scenarios will be described. For example, suppose the frequency of the detection signal transmitted by the electronic device 100 is 20000Hz, the amplitude is 100dB, and the value of the first threshold is 60dB. As shown in fig. 8C (a), among the acoustic wave signals collected by the electronic device 100, the acoustic wave signal with a frequency of 20000Hz is the detection signal. The frequency of the detection signal is 20000Hz and the amplitude is 100dB. The amplitude of the detection signal received by the electronic device 100 is greater than the first threshold, so the electronic device 100 may determine that the user is not wearing the electronic device 100.
Note that, the values of the amplitude of the detection signal received by the electronic device 100 and the amplitude of the detection signal transmitted by the electronic device 100 shown in (a) of fig. 8C are merely examples, and the amplitude value of the received detection signal may be less than or equal to the amplitude value of the transmitted detection signal due to the medium between the microphone and the speaker, which is not limited by the embodiment of the present application.
As shown in (b) of fig. 8C, the acoustic wave signal with a frequency of 20000Hz among the acoustic wave signals collected by the electronic apparatus 100 is a detection signal. The frequency of the detection signal was 20000Hz and the amplitude was 50dB. The value of the amplitude of the detection signal received by the electronic device 100 is smaller than the first threshold value. The electronic device 100 may determine that the user has worn the electronic device 100.
In one possible implementation, the electronic device 100 may set the amplitude value of the transmitted detection signal and the first threshold value based on the power of the electronic device 100. For example, when the power of the electronic device 100 is low (e.g., less than 20%), the electronic device 100 may reduce the amplitude of the transmitted detection signal and the value of the first threshold, thereby reducing the power consumption of the electronic device 100 for wearing detection, and further saving the power consumption of the electronic device 100.
Alternatively, when the amplitude of the received detection signal is greater than or equal to the first threshold, the electronic device 100 may determine that the user does not wear the electronic device 100, and step S406 may be performed. When the amplitude of the ultrasonic wave is smaller than the first threshold, the electronic device 100 may determine that the user has worn the electronic device 100, and execute step S405.
In one possible implementation, the electronic device 100 may determine the wearing state of the electronic device 100 based on the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal. Specifically, when the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal is in a first range, it is determined that the electronic device 100 is in the unworn state. When the difference is not in the first range, it is determined that the electronic device 100 is in the worn state. The value of the first range may be a preset value or a value obtained based on the amplitude of the transmitted detection signal. For example, when the amplitude of the transmitted detection signal is a, the first range may be [0, a·x], where x is a coefficient greater than 0 and less than 1, for example 0.4. Thus, when the amplitude of the transmitted detection signal is 100dB, the first range is [0, 40]. If the amplitude value of the received detection signal is between 60dB and 100dB, the amplitude difference between the transmitted and received detection signals is in the first range, and the electronic device 100 is in the unworn state. If the amplitude value of the received detection signal is between 0dB and 59dB, the amplitude difference between the transmitted and received detection signals is not in the first range, and the electronic device 100 is in the worn state.
Alternatively, the first range may be expressed as [0, a·x] or (0, a·x), where x is a coefficient greater than 0 and less than 1.
Alternatively, the first range may be represented as [k, a·x] or (k, a·x), where k is greater than or equal to 0 and less than a·x.
In one possible implementation, the electronic device 100 may determine the wearing state of the electronic device 100 based on a difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal. Specifically, when the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal is the first value, it is determined that the electronic device 100 is in the worn state. When the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal is the second value, it is determined that the electronic device 100 is in the non-wearing state.
In some embodiments, the first value is in the range [0, a·x] or (0, a·x], and the second value is in the range (a·x, 100] or (a·x, 100), where a is the amplitude value of the transmitted detection signal, and x is a coefficient greater than 0 and less than 1.
Optionally, the first value is in the range [k, a·x] or (k, a·x], and the second value is in the range (a·x, p] or (a·x, p), where k is greater than or equal to 0 and less than a·x, and p is greater than a·x and less than or equal to 100.
In one possible implementation, the electronic device 100 may determine the wearing state of the electronic device 100 based on the percentage of the amplitude of the received detection signal relative to the amplitude of the transmitted detection signal. Specifically, when this percentage is a third value, it is determined that the electronic device 100 is in the worn state. When this percentage is a fourth value, it is determined that the electronic device 100 is in the unworn state.
In some embodiments, the third value ranges over [0%, y%] or (0%, y%], and the fourth value ranges over (y%, 100%] or (y%, 100%), where y is greater than 0 and less than 100. For example, when y is 60, the third value may range from 0% to 60%, and the fourth value may range from 60% to 100%.
Optionally, the third value ranges over [a%, y%] or (a%, y%], and the fourth value ranges over (y%, b%] or (y%, b%), where a is greater than or equal to 0 and less than y, and b is greater than y and less than or equal to 100.
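A sketch of the two alternative decision rules above (difference-based and percentage-based), with the example values x = 0.4 and y = 60; the boundary handling follows one of the optional interval forms:

```python
def wearing_state_by_difference(tx_amp_db, rx_amp_db, x=0.4):
    """Unworn when the amplitude drop lies in the first range [0, a*x],
    where a is the transmitted amplitude; worn otherwise."""
    diff = tx_amp_db - rx_amp_db
    return "unworn" if 0 <= diff <= tx_amp_db * x else "worn"

def wearing_state_by_percentage(tx_amp_db, rx_amp_db, y=60):
    """Worn when the received amplitude is at most y% of the transmitted
    amplitude (third value); unworn otherwise (fourth value)."""
    pct = 100 * rx_amp_db / tx_amp_db
    return "worn" if pct <= y else "unworn"
```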
S405. The electronic device 100 is in the worn state, and the electronic device 100 operates in the first mode.
In steps S402-S404, the electronic device 100 determines, based on the amplitude of the received detection signal, that the electronic device 100 is in the worn state, and the electronic device 100 operates in the first mode. Because the electronic device 100 consumes more power in the first mode than in the second mode, the user's instructions can be executed more quickly and efficiently, which is convenient for the user.
After the electronic device 100 performs step S405, step S402 may be continued to be performed.
Alternatively, after the electronic device 100 performs step S405, step S402 may be performed again after a preset idle time (e.g., 20 ms). In this way, the power consumption of the electronic device 100 for transmitting and receiving ultrasonic waves can be reduced.
It should be noted that, when the electronic device 100 is operated in the first mode, since the speaker of the electronic device 100 may play an audible sound wave signal having a frequency range between 20Hz and 20000Hz, the user may hear the audible sound wave signal having the frequency range.
In some embodiments, the electronic device 100 may periodically alternate between transmitting the detection signal and the audible acoustic signal so that the user may listen to the audible acoustic signal transmitted by the speaker. Specifically, the electronic device 100 may transmit the detection signal in a first play period, transmit the audible sound signal in a second play period, transmit the detection signal in a third play period, transmit the audible sound signal in a fourth play period, and so on. The durations of the play periods may be the same or different.
For example, when transmitting the ultrasonic signal, the electronic device 100 may divide every 1 s into 30 slices of 33 ms. Within each 33 ms slice, the electronic device 100 may play a 23 ms audible sound signal and then play a 10 ms detection signal. Thus, due to the persistence-of-hearing phenomenon, the user perceives the intermittently played audible sound wave signal as continuously played. It should be noted that this dividing method is only an example provided by the embodiment of the present application and should not limit the transmission times of the detection signal and the audible acoustic wave signal.
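The slicing above can be sketched as a simple schedule builder (the tuple layout and a 990 ms total for exactly 30 slices are illustrative choices):

```python
def build_schedule(total_ms=990, slice_ms=33, audible_ms=23, probe_ms=10):
    """Interleave audible playback with the detection signal: each slice
    plays audible_ms of audio followed by probe_ms of detection signal."""
    assert audible_ms + probe_ms == slice_ms
    schedule = []
    for start in range(0, total_ms, slice_ms):
        schedule.append(("audible", start, start + audible_ms))
        schedule.append(("probe", start + audible_ms, start + slice_ms))
    return schedule
```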
In some embodiments, the electronic device 100 may be provided with X speakers, where X is an integer greater than or equal to 2. The electronic device 100 may send detection signals for wear detection through Y of the X speakers every preset idle time (e.g., 20 ms), where Y is greater than 0 and less than X. Meanwhile, all of the X speakers of the electronic device 100 except the speakers that transmit the detection signal may continue to transmit the audible sound wave signal. It should be noted that, during the preset idle time, the electronic device 100 may send audible sound signals through all the speakers.
Alternatively, when the electronic device 100 transmits the detection signal through the Y speakers, the Y speakers may transmit the detection signal for a first period of time, transmit an audible acoustic signal for a second period of time, and so on.
S406. The electronic device 100 is in the unworn state, and the electronic device 100 operates in the second mode.
In steps S402-S404, the electronic device 100 determines, based on the amplitude of the received detection signal, that the electronic device 100 is in the unworn state, and the electronic device 100 operates in the second mode. Compared with the first mode, in the second mode the electronic device 100 may stop the operation of background programs (including refreshing, downloading, etc.), pause audio playback or reduce the playback volume, turn down the display brightness, and so on, so the power consumption of the electronic device 100 is reduced.
After the electronic device 100 performs step S406, step S402 may be continued to be performed.
Alternatively, after the electronic device 100 performs step S406, step S402 may be performed again after a preset idle time (e.g., 20 ms). In this way, the power consumption of the electronic device 100 for transmitting and receiving ultrasonic waves can be reduced.
Alternatively, the electronic device 100 may include an ultrasonic transmitting sensor and an ultrasonic receiving sensor; the electronic device 100 may transmit an ultrasonic signal through the transmitting sensor and receive the ultrasonic signal through the receiving sensor, and then determine the result of the wear detection based on the ultrasonic signal.
It should be noted that, the positions of the microphone and the speaker are not limited to those shown in fig. 3, and as long as the microphone and the speaker of the electronic device 100 are on different components, and the amplitude of the detection signal received when the electronic device 100 is in the worn state is smaller than the amplitude of the detection signal received when the electronic device 100 is in the unworn state, the electronic device 100 may determine the wearing state of the electronic device 100 by the wearing detection method shown in fig. 4.
By way of example, the microphone or speaker of the electronic device 100 may be located on different components such as the left temple, the right temple, the left rim, the right rim, or the nose pad. As shown in fig. 9 (a), the microphone of the electronic device 100 is located at the nose pad, and the speaker is located at a temple. It will be appreciated that the positions are not limited to those shown in fig. 9 (a); the microphone and speaker of the electronic device may be located at other positions, for example, the speaker of the electronic device 100 at the nose pad and the microphone at a temple, and embodiments of the present application are not limited thereto.
By way of example, the number of speakers of the electronic device 100 may be more than 1. As shown in fig. 9 (B), the microphone of the electronic device 100 is located at the nose pad, the speaker a is located at the left side temple, and the speaker B is located at the right side temple. It will be appreciated that the microphone and speaker of the electronic device may be located on other components, such as the speaker a of the electronic device 100 may be located on a nose pad, the microphone may be located on a left side temple, and the speaker B may be located on a right side temple, not limited to the components shown in fig. 9 (B).
When the electronic device 100 includes a plurality of speakers, the electronic device 100 may determine whether the user wears the electronic device 100 based on the detection signals transmitted by the plurality of speakers. If the amplitude of at least one of the detection signals sent by the plurality of speakers and received by the electronic device 100 is greater than the first threshold, it may be determined that the user does not wear the electronic device 100. That is, the electronic device 100 determines that the user wears the electronic device 100 only if the amplitudes of the detection signals transmitted by all the speakers and received by the electronic device 100 are less than or equal to the first threshold. Thus, even when a portion of the speakers of the electronic device 100 are blocked by false touches, the electronic device 100 can still determine whether the user is wearing the electronic device 100. Optionally, if the amplitude of at least one of the detection signals sent by the plurality of speakers and received by the electronic device 100 is greater than or equal to the first threshold, it may be determined that the user does not wear the electronic device 100; if the amplitudes of the detection signals sent by all the speakers and received by the electronic device 100 are smaller than the first threshold, it may be determined that the user wears the electronic device 100.
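The multi-speaker rule reduces to requiring every per-speaker received amplitude to stay at or below the first threshold; a minimal sketch:

```python
def user_wears_device(received_amps_db, first_threshold_db):
    """Worn only if the detection signal from every speaker comes back at or
    below the first threshold; any amplitude above it indicates an
    unobstructed path from some speaker, i.e., the unworn state."""
    return all(amp <= first_threshold_db for amp in received_amps_db)
```

The same all-channels rule applies symmetrically when multiple microphones, rather than multiple speakers, are used.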
Alternatively, when the electronic device 100 includes a plurality (2 or more) of speakers, the frequency of the ultrasonic waves transmitted by each speaker is different in order to avoid the influence (e.g., superposition or cancellation) of the acoustic waves between the speakers. Alternatively, there is a phase difference between the ultrasonic waves transmitted from the plurality of speakers, and the phase difference is one cycle of the ultrasonic waves.
Alternatively, the electronic device 100 may control the phases of the ultrasonic waves transmitted by the plurality of speakers to form an ultrasonic wave beam such that the direction of the resulting ultrasonic wave beam is toward the microphone. In this way, the energy of the beam can be concentrated in the direction of the microphone, and more accurate wearing detection results can be obtained through the amplitude of the ultrasonic wave received by the microphone.
By way of example, the number of microphones of the electronic device 100 may be more than 1. As shown in fig. 9 (c), the microphone a of the electronic device 100 is located at the nose pad, the microphone B is located at the right side temple, and the speaker is located at the left side temple. It will be appreciated that the microphone and speaker of the electronic device may be located elsewhere, such as the speaker of the electronic device 100 being located on the nose pad, microphone a being located on the left eye temple, and microphone B being located on the right eye temple, not limited to the position shown in fig. 9 (c).
When the electronic device 100 includes a plurality of microphones, the electronic device 100 may determine whether the user wears the electronic device 100 based on the detection signals received by the plurality of microphones. If the amplitude of at least one of the detection signals received by the plurality of microphones of the electronic device 100 is greater than the first threshold, it may be determined that the user is not wearing the electronic device 100. That is, the electronic device 100 determines that the user wears the electronic device 100 only if the amplitudes of the detection signals received by all the microphones of the electronic device 100 are less than or equal to the first threshold. Thus, even when a portion of the microphones of the electronic device 100 are blocked by false touches, the electronic device 100 can still determine whether the user wears the electronic device 100. Optionally, if the amplitude of at least one of the detection signals received by the plurality of microphones of the electronic device 100 is greater than or equal to the first threshold, it may be determined that the user does not wear the electronic device 100; if the amplitudes of the detection signals received by all the microphones of the electronic device 100 are smaller than the first threshold, it may be determined that the user wears the electronic device 100.
By way of example, the number of microphones and speakers of the electronic device 100 may be more than 1. As shown in fig. 9 (d), the microphone a of the electronic device 100 is located at the nose pad, the microphone B is located at the right side temple, and the speaker a and the speaker B are located at the left side temple. It will be appreciated that the location shown in fig. 9 (d) is not limited, and that the microphone and speaker of the electronic device may be located in other locations, for example, microphone a and microphone B of the electronic device 100 are located in the nose pad, speaker a is located in the left eye temple, and speaker B is located in the right eye temple, which is not limited in this embodiment of the application.
In some application scenarios, the microphone and speaker of the electronic device 100 are located on the same component of the glasses body of the electronic device 100. After the electronic device 100 is turned on, the speaker of the electronic device 100 may transmit ultrasonic waves, and the microphone of the electronic device 100 may receive the ultrasonic waves transmitted by the speaker. The electronic device 100 may determine the wearing state of the electronic device 100 based on the amplitude of the transmitted ultrasonic wave and the amplitude of the received ultrasonic wave.
By way of example, the microphone and speaker of the electronic device 100 may be on the same temple (e.g., the right temple) of the temples 102 shown in fig. 1. When the electronic device 100 is in the unworn state, as shown in (a) of fig. 10, the microphone and the speaker of the electronic device 100 are on the same component; the ultrasonic wave transmitted by the speaker of the electronic device 100 is not blocked, most of the ultrasonic wave propagates to the surroundings, and the microphone of the electronic device 100 can receive only a small part of the ultrasonic wave transmitted by the speaker. At this time, the amplitude of the ultrasonic wave received by the microphone may be A3. When the electronic device 100 is in the worn state, as shown in (b) of fig. 10, the microphone and the speaker of the electronic device 100 are on the same component; the ultrasonic wave transmitted by the speaker of the electronic device 100 is blocked by the head of the user, most of the ultrasonic wave is reflected to the microphone, and the microphone of the electronic device 100 can receive most of the ultrasonic wave transmitted by the speaker. At this time, the amplitude of the ultrasonic wave received by the microphone may be A4, where A3 is less than A4. It should be noted that the present application is not limited to the right temple; the microphone and the speaker may also be located at the left temple, the nose pad, the frame, or the like, which is not limited in this embodiment of the application.
Therefore, when the microphone and the speaker of the electronic apparatus 100 are in the same component, the amplitude of the ultrasonic wave transmitted by the speaker of the electronic apparatus 100 received by the microphone when the electronic apparatus 100 is in the worn state is larger than the amplitude of the ultrasonic wave received by the microphone when the electronic apparatus 100 is in the unworn state. In this way, the electronic apparatus 100 can determine whether or not the user wears the electronic apparatus 100 based on the amplitudes of the ultrasonic waves received in the worn state and the unworn state.
For example, the electronic device 100 may determine that the electronic device 100 is in the worn state when it is determined that the amplitude of the ultrasonic wave received by the microphone is greater than the first threshold. When it is determined that the amplitude of the ultrasonic wave received by the microphone is less than or equal to the first threshold value, it is determined that the electronic apparatus 100 is in the unworn state. Alternatively, the electronic device 100 may determine that the electronic device 100 is in the worn state when it is determined that the amplitude of the ultrasonic wave received by the microphone is greater than or equal to the first threshold. When it is determined that the amplitude of the ultrasonic wave received by the microphone is smaller than the first threshold value, it is determined that the electronic apparatus 100 is in the unworn state.
Illustratively, as shown in FIG. 11, the display method includes the steps of:
S1101. the electronic device 100 starts.
After the electronic device 100 is started, step S1102 may be performed.
S1102. the speaker of the electronic device 100 transmits ultrasonic waves.
The speaker of the electronic device 100 may transmit an ultrasonic signal. The ultrasonic signal transmitted by the electronic device 100 may be referred to as a detection signal, a target ultrasonic signal, or the like.
S1103. the microphone of the electronic device 100 receives ultrasonic waves.
When the electronic device 100 transmits the detection signal through the speaker, a nearby acoustic wave signal including the detection signal transmitted by the speaker may be received through the microphone.
Specifically, the detailed descriptions of step S1101 to step S1103 can be referred to the embodiment shown in fig. 4, and will not be repeated here.
S1104. the electronic apparatus 100 determines whether the amplitude of the received ultrasonic wave exceeds a first threshold.
The electronic device 100 may perform Fourier transform processing on the received detection signal to obtain the amplitude value of the received detection signal. When the amplitude of the detection signal is greater than the first threshold, the electronic device 100 may perform step S1105. When the amplitude of the detection signal is less than or equal to the first threshold, the electronic device 100 may perform step S1106. The value of the first threshold may be the product of the amplitude value of the transmitted detection signal and a first coefficient. The first coefficient may be a value greater than 0 and less than or equal to 1. In some embodiments, the first coefficient may be any percentage between 50% and 80%.
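The amplitude extraction and threshold comparison of step S1104 can be illustrated with a short sketch. This assumes a single-tone detection signal whose amplitude is obtained with a single-bin discrete Fourier transform; the sampling rate, detection frequency, and first coefficient of 0.6 (within the 50% to 80% band mentioned above) are illustrative assumptions, not values from the application.

```python
# Hedged sketch of step S1104: Fourier-analyse the received samples at
# the detection frequency to get the received amplitude, then compare
# it against a first threshold = transmitted amplitude * first coefficient.
# All parameter values are illustrative assumptions.
import cmath
import math

def tone_amplitude(samples, freq_hz, sample_rate):
    """Amplitude of the freq_hz component via a single-bin DFT."""
    n = len(samples)
    acc = sum(s * cmath.exp(-2j * math.pi * freq_hz * k / sample_rate)
              for k, s in enumerate(samples))
    return 2 * abs(acc) / n  # a pure sine of amplitude A yields ~A

def exceeds_first_threshold(samples, freq_hz, sample_rate,
                            tx_amplitude, first_coefficient=0.6):
    # First threshold = transmitted amplitude times the first coefficient.
    first_threshold = tx_amplitude * first_coefficient
    return tone_amplitude(samples, freq_hz, sample_rate) > first_threshold
```

With a transmitted amplitude of 1.0, a received tone of amplitude 0.8 exceeds the 0.6 threshold (step S1105), while one of amplitude 0.3 does not (step S1106).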
Alternatively, when the amplitude of the received detection signal is greater than or equal to the first threshold, the electronic device 100 may determine that the user has worn the electronic device 100, and step S1105 may be performed. When the amplitude of the ultrasonic wave is smaller than the first threshold, the electronic device 100 may determine that the user is not wearing the electronic device 100, and step S1106 is performed.
Specifically, the specific description of the wearing detection operation performed by the electronic device 100 may refer to the embodiment shown in fig. 4, which is not described herein.
In one possible implementation, the electronic device 100 may determine the wearing state of the electronic device 100 based on the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal. Specifically, when the electronic device 100 determines that the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal is in the first range, it is determined that the electronic device 100 is in the worn state. When the electronic device 100 determines that the difference is not in the first range, it is determined that the electronic device 100 is in the unworn state. The first range may be a preset value, or may be obtained based on the amplitude of the transmitted detection signal. Illustratively, when the amplitude of the transmitted detection signal is a, the first range may be [0, a·x], where x is a coefficient greater than 0 and less than 1; e.g., x may be 0.4. For example, when the amplitude of the transmitted detection signal is 100 dB, the first range is [0, 40]. If the amplitude value of the received detection signal is between 60 dB and 100 dB, the difference between the amplitudes of the transmitted and received detection signals is in the first range, and the electronic device 100 is in the worn state. If the amplitude value of the received detection signal is between 0 dB and 59 dB, the amplitude difference between the transmitted and received detection signals is not in the first range, and the electronic device 100 is in the unworn state.
Alternatively, the first range may be expressed as [0, a·x] or (0, a·x), where x is a coefficient greater than 0 and less than 1.
Alternatively, the first range may be expressed as [k, a·x] or (k, a·x], where k is greater than or equal to 0 and less than a·x.
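Under the example values above (x = 0.4 and a transmitted amplitude of 100 dB, giving a first range of [0, 40]), the difference-based check may be sketched as follows; the function name and default coefficient are illustrative assumptions.

```python
# Hedged sketch of the first-range check: the device is judged worn when
# (transmitted amplitude - received amplitude) falls within [0, a*x].
# x = 0.4 matches the example above; it is an illustrative value.

def in_first_range(tx_amplitude, rx_amplitude, x=0.4):
    """Worn when the amplitude loss lies in the first range [0, a*x]."""
    diff = tx_amplitude - rx_amplitude
    return 0 <= diff <= tx_amplitude * x

# With tx = 100 dB: rx of 60..100 dB -> worn; rx of 0..59 dB -> unworn.
```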
In one possible implementation, the electronic device 100 may determine the wearing state of the electronic device 100 based on the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal. Specifically, when the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal is a first value, it is determined that the electronic device 100 is in the unworn state. When the difference between the amplitude of the transmitted detection signal and the amplitude of the received detection signal is a second value, it is determined that the electronic device 100 is in the worn state.
In some embodiments, the first value is in the range [0, a·x] or (0, a·x], and the second value is in the range (a·x, 100] or (a·x, 100), where a is the amplitude value of the transmitted detection signal, and x is a coefficient greater than 0 and less than 1.
Optionally, the first value is in the range [k, a·x] or (k, a·x], and the second value is in the range (a·x, p] or (a·x, p), where k is greater than or equal to 0 and less than a·x, and p is greater than a·x and less than or equal to 100.
In one possible implementation, the electronic device 100 may determine the wearing state of the electronic device 100 based on the percentage of the amplitude of the received detection signal relative to the amplitude of the transmitted detection signal. Specifically, when this percentage is a third value, it is determined that the electronic device 100 is in the unworn state. When this percentage is a fourth value, it is determined that the electronic device 100 is in the worn state.
In some embodiments, the third value ranges over [0%, y%] or (0%, y%], and the fourth value ranges over (y%, 100%] or (y%, 100%), where y is greater than 0 and less than 100. For example, when y is 60, the third value may range from 0% to 60%, and the fourth value from 60% to 100%.
Optionally, the third value ranges over [a%, y%] or (a%, y%], and the fourth value ranges over (y%, b%] or (y%, b%), where a is greater than or equal to 0 and less than y, and b is greater than y and less than or equal to 100.
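The percentage-based check with y = 60, as in the example above, may be sketched as follows; the function name and boundary handling (a percentage strictly above y% counts as worn) are illustrative assumptions.

```python
# Hedged sketch of the percentage-based check: the received amplitude is
# expressed as a percentage of the transmitted amplitude; at most y%
# means unworn, above y% means worn. y = 60 follows the example above.

def wear_state_by_percentage(rx_amplitude, tx_amplitude, y=60.0):
    """Return 'worn' when rx exceeds y% of tx, else 'unworn'."""
    pct = 100.0 * rx_amplitude / tx_amplitude
    return "worn" if pct > y else "unworn"
```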
S1105. the electronic device 100 is in a worn state, and the electronic device 100 operates in the first mode.
In steps S1102 to S1104, the electronic device 100 determines, based on the amplitude of the received detection signal, that the electronic device 100 is in the worn state, and the electronic device 100 operates in the first mode. Since the electronic device 100 in the first mode can execute the user's instructions more quickly and efficiently than in the second mode, this is convenient for the user.
After the electronic apparatus 100 performs step S1105, step S1102 may be continued to be performed.
Alternatively, after the electronic device 100 performs step S1105, step S1102 may be performed after a preset idle time (e.g., 20 ms) has elapsed. In this way, the power consumption of the electronic device 100 for transmitting/receiving ultrasonic waves can be reduced.
For the description of how the electronic device 100 transmits the detection signal in the first mode, reference may be made to the embodiment shown in fig. 4, which is not described herein again.
S1106. the electronic device 100 is in an unworn state, and the electronic device 100 operates in the second mode.
In steps S1102 to S1104, the electronic device 100 determines, based on the amplitude of the received detection signal, that the electronic device 100 is in the unworn state, and the electronic device 100 operates in the second mode. Compared with the first mode, the electronic device 100 in the second mode may stop the operation of background programs (including refreshing, downloading, and the like of background programs), pause audio playing or reduce the playing volume, turn down the display brightness, and so on, thereby reducing the power consumption of the electronic device 100.
After the electronic device 100 performs step S1106, step S1102 may be continued to be performed.
Alternatively, after the electronic device 100 performs step S1106, step S1102 may be performed after a preset idle time (e.g., 20 ms) has elapsed. In this way, the power consumption of the electronic device 100 for transmitting/receiving ultrasonic waves can be reduced.
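The repeating S1102-S1106 cycle with a preset idle time between rounds may be sketched as follows. The helper maps a sequence of received amplitudes to the operating mode chosen after each round; the mode labels, threshold, and configurable idle interval (defaulting to zero so the sketch runs instantly) are illustrative assumptions.

```python
# Hedged sketch of the detection cycle: after each amplitude comparison
# the device enters the first mode (worn) or second mode (unworn), then
# idles for a preset time before the next transmit/receive round.
import time

FIRST_MODE, SECOND_MODE = "first", "second"

def run_detection_cycles(amplitudes, first_threshold, idle_s=0.0):
    """Map each received amplitude to the resulting operating mode."""
    modes = []
    for a in amplitudes:
        modes.append(FIRST_MODE if a > first_threshold else SECOND_MODE)
        time.sleep(idle_s)  # preset idle between rounds saves power
    return modes
```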
In some embodiments, the electronic device 100 may further confirm the wearing state of the electronic device 100 based on other sensors (e.g., proximity sensor, IMU, etc.) after determining the wearing state of the electronic device 100 based on the detection signal. The electronic device 100 may operate in the first mode when it is determined that the user is wearing the electronic device 100 based on the detection signal and other sensors. The electronic device 100 may operate in the second mode when it is determined that the user is not wearing the electronic device 100 based on the detection signal or other sensor.
Alternatively, the electronic device 100 may include an ultrasonic transceiver sensor, and the electronic device 100 may transmit or receive ultrasonic signals through the ultrasonic transceiver sensor. The electronic device 100 then determines the result of the wear detection based on the ultrasonic signal.
In some embodiments, when the operating mode of the electronic device 100 is the second mode, the electronic device 100 may disconnect the communication connection with the electronic device 200 in order to reduce power consumption. When the operation mode of the electronic device 100 is the first mode, the electronic device 100 may establish a communication connection with the electronic device 200 in order that the electronic device 100 may exchange data with the electronic device 200. Accordingly, when the operation mode of the electronic device 100 is switched from the second mode to the first mode, a communication connection can be established with the electronic device 200. When the operation mode of the electronic device 100 is switched from the first mode to the second mode, if a communication connection is established between the electronic device 100 and the electronic device 200, the electronic device 100 may disconnect the communication connection established with the electronic device 200. The electronic device 200 may be a tablet computer, a mobile phone, a desktop computer, a laptop computer, a handheld computer, a notebook computer, a vehicle-mounted device, a smart home device, a smart city device, and/or the like.
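The connection policy described above may be sketched with a small state holder: switching to the first mode establishes the link to the electronic device 200, and switching to the second mode tears it down if present. The `Link` class and its method are illustrative stand-ins, not an API from the application.

```python
# Hedged sketch of the mode-driven connection policy between
# electronic device 100 and electronic device 200.

class Link:
    def __init__(self):
        self.connected = False

    def apply_mode(self, mode):
        if mode == "first" and not self.connected:
            self.connected = True   # worn: establish the communication connection
        elif mode == "second" and self.connected:
            self.connected = False  # unworn: disconnect to reduce power consumption
        return self.connected
```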
In some embodiments, the second mode may be turned on when the electronic device 100 is in the unworn state, and the electronic device 100 may switch from the second mode to the first mode when the electronic device 100 detects, through the detection signal, that the electronic device 100 is in the worn state. The power consumption of the electronic device 100 in the second mode is smaller than the power consumption of the electronic device 100 in the first mode. In this way, the electronic device 100 may operate in the second mode when in the unworn state, so as to reduce the power consumption of the electronic device 100, and may operate in the first mode when in the worn state, so as to be convenient for the user to use.
The electronic device 100 may implement the wearing detection method by other signals, not limited to ultrasonic waves. Other signals may include, but are not limited to, infrasonic waves, infrared, visible light, and the like.
Note that the positions of the microphone and the speaker are not limited to those shown in fig. 10. As long as the microphone and the speaker of the electronic device 100 are on the same component, and the amplitude of the detection signal received when the electronic device 100 is in the worn state is larger than the amplitude of the detection signal received when the electronic device 100 is in the unworn state, the electronic device 100 can determine the wearing state of the electronic device 100 by the wearing detection method shown in fig. 11 described above.
For example, the microphone or speaker of the electronic device 100 may be located on the left temple, the right temple, the left rim, the right rim, the nose pad, or the like. As shown in fig. 12 (a), the microphone and speaker of the electronic device 100 are located at the nose pad. It will be appreciated that the location shown in fig. 12 (a) is not limiting, and the microphone and speaker of the electronic device may be located at other positions, such as the right temple, which is not limited in this embodiment of the application.
By way of example, the number of speakers of the electronic device 100 may be more than 1. As shown in fig. 12 (b), the microphone, speaker A, and speaker B of the electronic device 100 are all located on the right temple. It will be appreciated that the microphone and speakers of the electronic device may also be located together on other components, such as the left temple or the nose pad, and are not limited to the component shown in fig. 12 (b).
When the electronic device 100 includes a plurality of speakers, the electronic device 100 may determine whether the user wears the electronic device 100 based on the detection signals transmitted by the plurality of speakers. If the amplitude of at least one of the detection signals transmitted by the plurality of speakers and received by the electronic device 100 is smaller than the first threshold, it may be determined that the user does not wear the electronic device 100. That is, only if the amplitudes of the detection signals transmitted by all the speakers and received by the electronic device 100 are greater than or equal to the first threshold can the electronic device 100 determine that the user wears the electronic device 100. Thus, even when a portion of the speakers of the electronic device 100 is occluded by false touches, the electronic device 100 may still determine whether the user wears the electronic device 100. Optionally, if the amplitude of at least one of the detection signals transmitted by the plurality of speakers and received by the electronic device 100 is less than or equal to the first threshold, it may be determined that the user does not wear the electronic device 100. If the amplitudes of the detection signals transmitted by all the speakers and received by the electronic device 100 are greater than the first threshold, it may be determined that the user wears the electronic device 100.
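For the same-component geometry of fig. 11, the multi-speaker rule above inverts the earlier comparison: the device is worn only when the signal from every speaker arrives strongly. A minimal illustrative sketch (names and values are assumptions):

```python
# Hedged sketch of the multi-speaker rule (microphone and speakers on the
# same component): worn only if the detection signal from EVERY speaker
# is received at or above the first threshold; any weak one means unworn.

def is_worn_multi_speaker(received_amplitudes, first_threshold):
    """Worn only if all per-speaker received amplitudes are >= the threshold."""
    return all(a >= first_threshold for a in received_amplitudes)
```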
Alternatively, when the electronic device 100 includes a plurality (2 or more) of speakers, the frequency of the ultrasonic wave transmitted by each speaker is different, in order to avoid interference (e.g., superposition or cancellation) between the sound waves of the speakers. Alternatively, there is a phase difference between the ultrasonic waves transmitted by the plurality of speakers, the phase difference being one period of the ultrasonic wave.
By way of example, the number of microphones of the electronic device 100 may be more than 1. As shown in fig. 12 (c), microphone A, microphone B, and the speaker of the electronic device 100 are all located on the right temple. It will be appreciated that the microphones and speaker of the electronic device may be located at other positions, and are not limited to the positions shown in fig. 12 (c).
When the electronic device 100 includes a plurality of microphones, the electronic device 100 may determine whether the user wears the electronic device 100 based on the detection signals received by the plurality of microphones. If the amplitude of at least one of the detection signals received by the plurality of microphones of the electronic device 100 is smaller than the first threshold, it may be determined that the user does not wear the electronic device 100. That is, only if the amplitudes of the detection signals received by all the microphones of the electronic device 100 are greater than or equal to the first threshold can the electronic device 100 determine that the user wears the electronic device 100. Thus, even when a portion of the microphones of the electronic device 100 is blocked by false touches, the electronic device 100 may still determine whether the user wears the electronic device 100. Optionally, if the amplitude of at least one of the detection signals received by the plurality of microphones of the electronic device 100 is less than or equal to the first threshold, it may be determined that the user does not wear the electronic device 100. If the amplitudes of the detection signals received by all the microphones of the electronic device 100 are greater than the first threshold, it may be determined that the user wears the electronic device 100.
By way of example, the number of microphones and speakers of the electronic device 100 may each be more than 1. As shown in fig. 12 (d), microphone A, microphone B, speaker A, and speaker B of the electronic device 100 are all located on the right temple. It will be appreciated that the microphones and speakers of the electronic device may be located at other positions, such as the left temple, and the positions shown in (d) of fig. 12 are not limiting. In this way, microphone A and microphone B of the electronic device 100 may each perform wearing detection based on the received detection signals sent by speaker A and speaker B, so that the wearing detection result is more accurate.
In one possible implementation, the electronic device 100 includes both a microphone and a speaker located on the same component and a microphone and a speaker located on different components. The electronic device 100 may determine the wearing state of the electronic device 100 by combining the wearing detection methods provided in fig. 4 and fig. 11. In this way, the electronic device 100 can determine the wearing state of the electronic device 100 more accurately.
By way of example, the number of speakers of the electronic device 100 may be more than 1. As shown in fig. 13 (a), the microphone and speaker B of the electronic device 100 are located on the right temple, and speaker A is located on the left temple.
When the electronic device 100 includes a plurality of speakers, the electronic device 100 may determine whether the user wears the electronic device 100 based on detection signals transmitted by the plurality of speakers. The electronic device 100 may determine the wearing state of the electronic device 100 by using the detection method shown in fig. 11 based on the detection signal sent by the speaker (e.g., speaker B) located on the same component and received by the microphone. The electronic device 100 may determine the wearing state of the electronic device 100 by the detection method shown in fig. 4 based on the detection signal transmitted by the speaker (e.g., speaker a) located on a different component received by the microphone. It should be noted that, only when it is determined that the electronic apparatus 100 is in the worn state based on the detection signals transmitted from all the speakers, it is possible to determine that the electronic apparatus 100 is in the worn state.
Alternatively, when the electronic device 100 includes a plurality (2 or more) of speakers, the frequency of the ultrasonic wave transmitted by each speaker is different, in order to avoid interference (e.g., superposition or cancellation) between the sound waves of the speakers. Alternatively, there is a phase difference between the ultrasonic waves transmitted by the plurality of speakers, the phase difference being one period of the ultrasonic wave.
By way of example, the number of microphones of the electronic device 100 may be more than 1. As shown in fig. 13 (B), the microphone B and the speaker of the electronic device 100 are both located on the right side temple, and the microphone a is located on the left side temple.
When the electronic device 100 includes a plurality of microphones, the electronic device 100 may determine whether the user wears the electronic device 100 based on detection signals received by the plurality of microphones. The electronic device 100 may determine the wearing state of the electronic device 100 by using the detection method shown in fig. 11 based on the detection signal received by the microphone (for example, the microphone B) and sent by the speaker located on the same component as the microphone. The electronic device 100 may determine the wearing state of the electronic device 100 by the detection method shown in fig. 4 based on the detection signal received by a microphone (for example, the microphone a) and transmitted by a speaker located on a different component from the microphone. It should be noted that, only when it is determined that the electronic device 100 is in the worn state based on the detection signals received by all the microphones, it may be determined that the electronic device 100 is in the worn state.
By way of example, the number of microphones and speakers of the electronic device 100 may each be more than 1. As shown in fig. 13 (c), microphone A and speaker A of the electronic device 100 are located on the left temple, and microphone B and speaker B are located on the right temple. In this way, microphone A and microphone B of the electronic device 100 may each perform wearing detection based on the received detection signals sent by speaker A and speaker B, so that the wearing detection result is more accurate.
When the electronic device 100 includes a plurality of microphones, the electronic device 100 may determine whether the user wears the electronic device 100 based on detection signals received by the plurality of microphones. The electronic device 100 may determine the wearing state of the electronic device 100 by using the detection method shown in fig. 11 based on the detection signal received by the microphone (e.g., the microphone B) and sent by the speaker (e.g., the speaker B) located on the same component as the microphone, and determine the wearing state of the electronic device 100 by using the detection method shown in fig. 4 based on the detection signal received by the microphone and sent by the speaker (e.g., the speaker a) located on a different component. Meanwhile, the electronic device 100 may determine the wearing state of the electronic device 100 through the detection method shown in fig. 11 based on the detection signal received by a microphone (for example, a microphone a) and transmitted by a speaker (for example, a speaker a) located on the same component as the microphone, and determine the wearing state of the electronic device 100 through the detection method shown in fig. 4 based on the detection signal received by the microphone and transmitted by a speaker (for example, a speaker B) located on a different component. It should be noted that, only when it is determined that the electronic device 100 is in the worn state based on the detection signals received by all the microphones, it may be determined that the electronic device 100 is in the worn state.
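The combined check described above may be sketched by evaluating each (speaker, microphone) pair with the rule for its geometry and requiring every pair to agree. This sketch uses a single illustrative threshold for both rules, whereas in practice the thresholds of the fig. 4 and fig. 11 methods may differ; names and values are assumptions.

```python
# Hedged sketch of the combined same-/different-component check.
# Same component (fig. 11 rule): worn when the received amplitude is high.
# Different components (fig. 4 rule): worn when the received amplitude is low.
# The device is judged worn only if every (speaker, microphone) pair agrees.

def pair_says_worn(amplitude, threshold, same_component):
    """Apply the rule matching the pair's geometry."""
    return amplitude >= threshold if same_component else amplitude < threshold

def is_worn_combined(pairs, threshold):
    """pairs: iterable of (received_amplitude, same_component) tuples."""
    return all(pair_says_worn(a, threshold, same) for a, same in pairs)
```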
It will be appreciated that the microphone and speaker of the electronic device may be located in other components, not limited to the locations shown in fig. 13, and embodiments of the application are not limited in this regard.
In some embodiments, when the electronic device 100 is a helmet that includes a microphone, a speaker, and a processor, and the microphone and speaker of the electronic device 100 are located in different parts of the helmet, the electronic device 100 may detect the wearing condition of the helmet by the wearing detection method shown in fig. 4. When the microphone and speaker of the electronic device 100 are located in the same part of the helmet, the electronic device 100 may detect the wearing condition of the helmet by the wearing detection method shown in fig. 11. For example, the helmet may be divided into five parts: front, rear, top, left, and right. The front part may be the partial area in contact with the user's forehead, the rear part may be the partial area in contact with the back of the user's head, the top part may be the partial area in contact with the top of the user's head, the left part may be the partial area in contact with the user's left ear, and the right part may be the partial area in contact with the user's right ear. This division of the helmet is only an example, and the application is not limited thereto.
In some application scenarios, the display device of the electronic device 100 is a display screen or a projection device. For example, the electronic device 100 may be an AR device. The electronic device 100 may transmit the detection signal through a speaker and receive the transmitted detection signal through a microphone. When determining, based on the received detection signal, that the electronic device 100 is in the worn state, the electronic device 100 may perform an operation corresponding to the worn state, for example, set the operating mode to the first mode. When determining, based on the received detection signal, that the electronic device 100 is in the unworn state, the electronic device 100 may perform an operation corresponding to the unworn state, for example, set the operating mode to the second mode. In this way, based on the wearing detection method, the power consumption of the electronic device 100 can be reduced.
In particular, when the microphone and speaker of the electronic device 100 are located in different components, the step of performing wear detection by the electronic device 100 may be seen in the embodiment shown in fig. 4. The step of the electronic device 100 performing wear detection when the microphone and speaker of the electronic device 100 are located in the same component may be seen in the embodiment as shown in fig. 11.
In some embodiments, when the operating mode of the electronic device 100 is the second mode, the electronic device 100 may pause playing the video image in order to reduce power consumption. When the operation mode of the electronic device 100 is the first mode, the electronic device 100 may play video images in order for the user to view the video. Accordingly, when the operation mode of the electronic device 100 is switched from the second mode to the first mode, the paused video file may continue to be played. When the operation mode of the electronic device 100 is switched from the first mode to the second mode, the electronic device 100 may pause playing the video file.
Illustratively, as shown in fig. 14 (a), the electronic device 100 is in a worn state, and the electronic device 100 plays a video file. The electronic apparatus 100 may display the video page 1201 on the left display device and display the video page 1202 on the right display device. Video page 1201 and video page 1202 include video images.
When the electronic device 100 determines, based on the wearing detection method provided by the embodiment of the present application, that the electronic device 100 has switched from the worn state to the unworn state, the electronic device 100 may pause playing the video file. As shown in fig. 14 (b), the electronic device 100 is in the unworn state, and the electronic device 100 pauses playing the video file. Here, the electronic device 100 may display a video pause icon 1211 on the video page 1201 on the left display device and a video pause icon 1212 on the video page 1202 on the right display device. Video images are included in video page 1201 and video page 1202. Video pause icon 1211 and video pause icon 1212 may be used to prompt the user that the electronic device 100 has paused playing the video file. In this way, the electronic device 100 can pause playing the video file when switching from the worn state to the unworn state, without requiring the user to manually pause the video, so as to reduce user operations and facilitate use by the user.
Similarly, when the electronic device 100 is in an unworn state, a video pause icon 1211 and a video pause icon 1212 as shown in fig. 14 (b) may be displayed. When the electronic device 100 determines, based on the wear detection method provided by the embodiment of the present application, that the electronic device 100 is switched from the unworn state to the worn state, the electronic device 100 may continue playing the video file. The electronic device 100 may cancel the display of the video pause icon 1211 and the video pause icon 1212 and continue playing the video file as shown in fig. 14 (a). In this way, the electronic device 100 can continue playing the paused video file when switching from the unworn state to the worn state, without requiring the user to manually resume the video, so as to reduce user operations and facilitate use by the user.
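The pause-on-unworn and resume-on-worn behavior described above can be summarized as a small state machine. The following is a minimal sketch only; all class and method names (VideoPlayer, Device, on_wear_state_changed, and so on) are hypothetical and do not appear in the embodiments, which do not prescribe any particular implementation.

```python
# Hypothetical sketch of the behavior in fig. 14: entering the second
# (unworn) mode pauses playback and shows pause icons (e.g. 1211/1212);
# returning to the first (worn) mode cancels the icons and resumes playback.

WORN, UNWORN = "worn", "unworn"              # detected wear states
FIRST_MODE, SECOND_MODE = "first", "second"  # operating modes

class VideoPlayer:
    def __init__(self):
        self.paused = False
        self.pause_icon_shown = False

    def pause(self):
        self.paused = True
        self.pause_icon_shown = True   # prompt the user that playback paused

    def resume(self):
        self.paused = False
        self.pause_icon_shown = False  # cancel the pause icons

class Device:
    def __init__(self, player):
        self.player = player
        self.mode = FIRST_MODE

    def on_wear_state_changed(self, state):
        new_mode = FIRST_MODE if state == WORN else SECOND_MODE
        if new_mode == self.mode:
            return                     # no mode switch, nothing to do
        self.mode = new_mode
        if new_mode == SECOND_MODE:
            self.player.pause()        # worn -> unworn: pause to save power
        else:
            self.player.resume()       # unworn -> worn: continue playback

player = VideoPlayer()
device = Device(player)
device.on_wear_state_changed(UNWORN)   # user takes the device off
assert player.paused and player.pause_icon_shown
device.on_wear_state_changed(WORN)     # user puts the device back on
assert not player.paused and not player.pause_icon_shown
```

No user operation is required in either direction: the transition is driven entirely by the wear-state change reported by the wear detection method.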
In some embodiments, when the electronic device 100 includes a display screen or a projector, and the operating mode of the electronic device 100 is the second mode, the electronic device 100 may turn off the screen (also referred to as screen-off) in order to reduce power consumption. When the operating mode of the electronic device 100 is the first mode, the display content of the electronic device 100 is available for viewing by the user, and the electronic device 100 may keep the screen on. Therefore, when the operating mode of the electronic device 100 is switched from the second mode to the first mode, the screen may be lit, and the interface of the application may be displayed. When the operating mode of the electronic device 100 is switched from the first mode to the second mode, the electronic device 100 may turn off the screen, thereby saving power consumption.
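The screen-off behavior above follows the same mode-driven pattern. The sketch below is only illustrative; the class and function names (Display, apply_mode) are hypothetical, and the embodiments do not mandate any particular display-control interface.

```python
# Hypothetical sketch: the second (unworn) mode turns the screen off to save
# power; the first (worn) mode lights the screen and shows the application's
# interface again.

class Display:
    def __init__(self):
        self.on = True
        self.content = "app interface"

    def turn_off(self):
        self.on = False
        self.content = None            # nothing is rendered while screen-off

    def turn_on(self, content):
        self.on = True
        self.content = content         # restore the application's interface

def apply_mode(display, mode):
    if mode == "second":
        display.turn_off()             # unworn: screen-off to reduce power
    else:
        display.turn_on("app interface")

display = Display()
apply_mode(display, "second")          # device taken off
assert not display.on
apply_mode(display, "first")           # device put back on
assert display.on and display.content == "app interface"
```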
The implementations described in the above embodiments are merely illustrative and do not limit other embodiments of the present application in any way. The specific internal implementation may differ depending on the type of electronic device, the operating system carried on the electronic device, the program used, and the interface called; the embodiments of the present application place no limitation on these, provided that the feature functions described in the embodiments of the present application can be implemented.
The above embodiments are only for illustrating the technical solutions of the present application, and are not limiting. Although the application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit of the application.