
CN115086817A - Method and device for identifying left and right earphones and earphone - Google Patents


Info

Publication number
CN115086817A
CN115086817A
Authority
CN
China
Prior art keywords
earphone
coordinate system
vector
acceleration sensor
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110271354.XA
Other languages
Chinese (zh)
Other versions
CN115086817B (en)
Inventor
李乔峰
张文奇
梁其中
张洵
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority to CN202110271354.XA
Publication of CN115086817A
Application granted
Publication of CN115086817B
Legal status: Active
Anticipated expiration

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R1/1041 Mechanical or electronic switches, or control elements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04R LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 Details of transducers, loudspeakers or microphones
    • H04R1/10 Earpieces; Attachments therefor; Earphones; Monophonic headphones
    • H04R1/1091 Details not provided for in groups H04R1/1008 - H04R1/1083

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Acoustics & Sound (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The embodiments of the present application are applicable to the technical field of earphones and provide a method and a device for identifying left and right earphones, as well as an earphone. The method includes: acquiring first sensor data collected by a first acceleration sensor and second sensor data collected by a second acceleration sensor; unifying, according to the first sensor data and the second sensor data, the first coordinate system corresponding to the first acceleration sensor and the second coordinate system corresponding to the second acceleration sensor into a base coordinate system; determining a first projection vector of the first target sensing axis in the base coordinate system and a second projection vector of the second target sensing axis in the base coordinate system; calculating the vector product between the first projection vector and the second projection vector; and identifying the wearing positions of the first earphone and the second earphone according to the vector product. By combining the vector product with the inherent characteristics of the human ear, the embodiments of the present application can achieve adaptive identification of the left and right earphones.

Description

Method and device for identifying left and right earphones and earphone
Technical Field
The embodiment of the application relates to the technical field of earphones, in particular to a left earphone and a right earphone identification method and device and an earphone.
Background
An earphone is a pair of transducer units that receive the electrical signal sent by a media player or receiver and convert it into audible sound waves using a speaker placed close to the ear. Depending on the manner of use, earphones generally include wired earphones, wireless earphones, and so on.
When a user uses earphones, the left and right earphones need to be distinguished correctly so that the audio of the left and right channels can be distributed to the earphone on the corresponding side. Fig. 1 is a schematic diagram of an earphone. The "R" marked on the earphone shown in fig. 1 indicates that it is a right earphone, which the user should wear on the right ear; conversely, an "L" marked on an earphone indicates that it is a left earphone, which the user should wear on the left ear.
In recent years, with the development of true wireless stereo (TWS) earphones, some products have begun to adopt a design in which the left and right earphones are completely identical. In such a product design, "R" and "L" are no longer marked on the earphones to distinguish the two sides; instead, the same earphone is compatible with either ear. When a user wears the earphones, there is no need to distinguish which earphone is the left one and which is the right one: the earphones automatically identify their wearing positions, and the audio of the left and right channels is then distributed according to the identification result. However, the prior art identifies the wearing position of an earphone with low accuracy.
Disclosure of Invention
The embodiments of the present application provide a method and a device for identifying left and right earphones, and an earphone, intended to solve the prior-art problem of low accuracy in identifying the wearing position of an earphone.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
in a first aspect, a method for identifying left and right earphones is provided, where the earphones include a first earphone having a first acceleration sensor with a first target sensing axis parallel to a central axis of the first earphone and a second earphone having a second acceleration sensor with a second target sensing axis parallel to a central axis of the second earphone, and the method includes:
acquiring first sensor data acquired by the first acceleration sensor and second sensor data acquired by the second acceleration sensor;
unifying a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system according to the first sensor data and the second sensor data;
determining a first projection vector of the first target sensing axis in the base coordinate system and a second projection vector of the second target sensing axis in the base coordinate system;
calculating a vector product between the first projection vector and the second projection vector;
and identifying the wearing positions of the first earphone and the second earphone according to the vector product.
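The last two steps above — the vector product of the two projection vectors and the side decision — can be sketched in a few lines. This is a hedged illustration only: the function name and the example vectors are invented, and the decision rule (the sign of the scalar product with gravity, described in the implementation manners below) is applied to made-up data, not to anything from the patent.

```python
import numpy as np

def identify_wearing_position(first_axis, second_axis, gravity):
    """Decide wearing sides from the two projected target-axis vectors
    (both expressed in the base coordinate system) and gravity."""
    cross = np.cross(first_axis, second_axis)   # vector product (step 4)
    dot = float(np.dot(cross, gravity))         # scalar product with gravity
    if dot > 0:
        return "first=left, second=right"
    return "first=right, second=left"

# Invented example vectors: the base frame's Z axis is the first target
# axis; the second axis is tilted; gravity points along -Y (user upright).
first_axis = np.array([0.0, 0.0, 1.0])
second_axis = np.array([0.5, 0.0, -0.87])
gravity = np.array([0.0, -1.0, 0.0])
print(identify_wearing_position(first_axis, second_axis, gravity))
```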
The method for identifying the left and right earphones provided by the embodiments of the present application has the following beneficial effects: by arranging an acceleration sensor in each of the first and second earphones, the included angle formed between the axes of the two earphones once the user wears them can be exploited, and the data collected by the two acceleration sensors can be gathered and compared to identify on which side the user wears each earphone. Earphones commonly already provide a simple tap-control function implemented with an acceleration sensor, so the identification method provided by the embodiments of the present application can achieve adaptive identification of the left and right earphones, without adding extra components, by combining the vector product with the inherent characteristics of the human ear, thereby improving identification accuracy.
In a possible implementation manner of the first aspect, a gyroscope may further be added to each of the first earphone and the second earphone, and the identification accuracy for the left and right earphones is further improved by detecting the gesture with which the user picks up the first earphone and/or the second earphone.
In a possible implementation manner of the first aspect, the first sensor data include multiple sets of first data respectively collected by the first acceleration sensor while the user's head is in multiple different postures, and the second sensor data include multiple sets of second data respectively collected by the second acceleration sensor in those postures; for any given head posture, the first data collected by the first acceleration sensor and the second data collected by the second acceleration sensor correspond to each other. Identifying the left and right earphones from multiple sets of data in this way can further improve identification accuracy.
In a possible implementation manner of the first aspect, the unifying, according to the first sensor data and the second sensor data, a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system includes: calculating a rotation matrix from the first sensor data and the second sensor data; and unifying a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system based on the rotation matrix.
In a possible implementation manner of the first aspect, the determining a first projection vector of the first target sensing axis in the base coordinate system and a second projection vector of the second target sensing axis in the base coordinate system includes: determining a first vector of the first target sensing axis in a first coordinate system according to the first sensor data, and determining a second vector of the second target sensing axis in a second coordinate system according to the second sensor data; based on the rotation matrix, a first projection vector of the first vector in the base coordinate system and a second projection vector of the second vector in the base coordinate system are calculated.
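As a hedged sketch of the two implementation manners above (the function name and all numbers are illustrative, not taken from the patent), a rotation matrix can be estimated by aligning the gravity direction measured in the second sensor's frame with the one measured in the first, e.g. via the Rodrigues formula, and then used to project the second target sensing axis into the base coordinate system:

```python
import numpy as np

def rotation_between(a, b):
    """Rodrigues formula: rotation matrix R with R @ unit(a) == unit(b)."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    v = np.cross(a, b)                   # rotation axis scaled by sin(theta)
    c = float(np.dot(a, b))              # cos(theta)
    if np.isclose(c, -1.0):
        raise ValueError("antiparallel vectors need a special-case axis")
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])   # skew-symmetric cross-product matrix
    return np.eye(3) + K + (K @ K) / (1.0 + c)

# Illustrative gravity readings from the two sensors (not patent data).
g_in_second = np.array([0.2, -9.7, 0.8])   # gravity in second sensor's frame
g_in_base = np.array([0.0, -9.8, 0.3])     # gravity in first (base) frame

R = rotation_between(g_in_second, g_in_base)

# Project the second target sensing axis (Z' of the second frame)
# into the base coordinate system.
second_axis = np.array([0.0, 0.0, 1.0])
second_projection = R @ second_axis
```

Note that a single gravity reading constrains only two of the three rotational degrees of freedom (rotation about gravity remains free), which is consistent with the patent's use of multiple sets of data collected in multiple head postures.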
In a possible implementation manner of the first aspect, the base coordinate system is the first coordinate system, and the calculating the vector product between the first projection vector and the second projection vector includes: calculating the vector product between the unit vector on the target axis of the base coordinate system and the second projection vector. Unifying the coordinate systems corresponding to the left and right earphones into the coordinate system of one earphone in this way reduces the amount of subsequent computation and improves identification efficiency.
In a possible implementation manner of the first aspect, the identifying the wearing positions of the first earphone and the second earphone according to the vector product includes: calculating the scalar product between the vector product and the vector corresponding to the gravitational acceleration; if the scalar product is greater than zero, determining that the first earphone is worn on the left ear and the second earphone is worn on the right ear; and if the scalar product is less than zero, determining that the first earphone is worn on the right ear and the second earphone is worn on the left ear.
In a possible implementation manner of the first aspect, when the user is in an inverted state, if the scalar product is greater than zero, it is determined that the first earphone is worn on the right ear and the second earphone on the left ear; if the scalar product is less than zero, it is determined that the first earphone is worn on the left ear and the second earphone on the right ear.
In a possible implementation manner of the first aspect, after identifying the wearing positions of the first earphone and the second earphone according to the vector product, the method further includes: detecting whether the first earphone and the second earphone are removed from the ear canal of the user; locking the information of the recognized wearing positions of the first earphone and the second earphone if the first earphone and/or the second earphone is not taken out of the ear canal of the user.
In a second aspect, an apparatus for identifying left and right earphones is provided, which may be applied to the earphones of the first aspect, and the earphones include a first earphone and a second earphone. The first earphone is provided with a first acceleration sensor, and a first target sensing axis of the first acceleration sensor is parallel to a central axis of the first earphone; the second earphone is provided with a second acceleration sensor, a second target sensing axis of the second acceleration sensor is parallel to a central axis of the second earphone, and the device specifically comprises the following modules:
the sensor data acquisition module is used for acquiring first sensor data acquired by the first acceleration sensor and second sensor data acquired by the second acceleration sensor;
a coordinate system conversion module, configured to unify a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system according to the first sensor data and the second sensor data;
a projection vector determination module, configured to determine a first projection vector of the first target sensing axis in the base coordinate system and a second projection vector of the second target sensing axis in the base coordinate system;
a vector product calculation module for calculating a vector product between the first projection vector and the second projection vector;
and the wearing position identification module is used for identifying the wearing positions of the first earphone and the second earphone according to the vector product.
In one possible implementation manner of the second aspect, the first sensor data includes multiple sets of first data respectively acquired by the first acceleration sensor when the head of the user is in multiple different postures, and the second sensor data includes multiple sets of second data respectively acquired by the second acceleration sensor when the head of the user is in multiple different postures; when the head of the user is in any posture, the first data acquired by the first acceleration sensor and the second data acquired by the second acceleration sensor have a corresponding relation.
In a possible implementation manner of the second aspect, the coordinate system conversion module is specifically configured to: calculating a rotation matrix from the first sensor data and the second sensor data; and unifying a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system based on the rotation matrix.
In a possible implementation manner of the second aspect, the projection vector determination module is specifically configured to: determining a first vector of the first target sensing axis in a first coordinate system according to the first sensor data, and determining a second vector of the second target sensing axis in a second coordinate system according to the second sensor data; based on the rotation matrix, a first projection vector of the first vector in the base coordinate system and a second projection vector of the second vector in the base coordinate system are calculated.
In a possible implementation manner of the second aspect, the base coordinate system may be the first coordinate system, and the vector product calculation module is specifically configured to: and calculating the vector product between the unit vector on the target axis of the base coordinate system and the second projection vector.
In a possible implementation manner of the second aspect, the wearing position identifying module is specifically configured to: calculate the scalar product between the vector product and the vector corresponding to the gravitational acceleration; if the scalar product is greater than zero, determine that the first earphone is worn on the left ear and the second earphone on the right ear; and if the scalar product is less than zero, determine that the first earphone is worn on the right ear and the second earphone on the left ear.
In a possible implementation manner of the second aspect, the wearing position identifying module is further configured to: when the user is in an inverted state, if the scalar product is greater than zero, determine that the first earphone is worn on the right ear and the second earphone on the left ear; and if the scalar product is less than zero, determine that the first earphone is worn on the left ear and the second earphone on the right ear.
In a possible implementation manner of the second aspect, the apparatus further includes a detection module and a locking module, where:
the detection module is used for detecting whether the first earphone and the second earphone are taken out of the ear canal of the user;
the locking module is used for locking the information of the recognized wearing positions of the first earphone and the second earphone if the first earphone and/or the second earphone are not taken out of the ear canal of the user.
In a third aspect, a headset is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the computer program is executed by the processor, the method for identifying the left and right headsets in any one of the above first aspect is implemented.
In a fourth aspect, a computer storage medium is provided, in which computer instructions are stored, and when the computer instructions are run on a headset, the headset executes related method steps to implement the method for identifying left and right headsets in any one of the above first aspects.
In a fifth aspect, a computer program product is provided, which when running on a headset, causes the headset to perform the relevant steps to implement the method for identifying left and right headsets of any of the above first aspects.
In a sixth aspect, a chip is provided, where the chip includes a processor, and the processor may be a general-purpose processor or a special-purpose processor. The processor is configured to support the headphones to perform the correlation steps, so as to implement the method for identifying left and right headphones in any one of the above first aspects.
It is understood that the beneficial effects of the second to sixth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
Fig. 1 is a schematic diagram of a headset of the prior art;
FIG. 2 is a schematic diagram illustrating a user wearing a headset according to an embodiment of the present application;
fig. 3 is a schematic diagram illustrating a relationship between coordinates of acceleration sensors of left and right earphones and a gravitational acceleration direction according to an embodiment of the present application;
fig. 4 is a schematic device layout diagram of an earphone according to an embodiment of the present application;
fig. 5 is a schematic diagram of a hardware structure of an earphone according to an embodiment of the present application;
fig. 6 is a schematic diagram of a calibration process of an acceleration sensor according to an embodiment of the present disclosure;
fig. 7 is a schematic step diagram of a method for identifying left and right earphones according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an angular difference between a first coordinate system and a second coordinate system when sensing the same gravitational acceleration according to an embodiment of the present disclosure;
FIG. 9 is a schematic diagram of a coordinate system provided by an embodiment of the present application;
fig. 10 is a schematic diagram of a process for recognizing left and right earphones of a user in an upright posture according to an embodiment of the present application;
fig. 11 is a schematic diagram of a process for recognizing left and right earphones of a user in a tilted posture according to an embodiment of the present application;
fig. 12 is a schematic diagram of a process for recognizing left and right earphones of a user in an inverted posture according to an embodiment of the present application;
fig. 13 is a block diagram of a left-right earphone recognition device according to an embodiment of the present application.
Detailed Description
For convenience of clearly describing the technical solutions of the embodiments of the present application, the terms "first" and "second" are used in the embodiments of the present application to distinguish identical or similar items with essentially the same functions and effects; they do not limit quantity or execution order. For example, the terms "first earphone" and "second earphone" merely distinguish the earphones on different sides. When one earphone is the first earphone, the other is correspondingly the second earphone. The first earphone may be either of the two earphones.
It should be noted that in the embodiments of the present application, words such as "exemplary" or "for example" are used to indicate examples, illustrations or explanations. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present relevant concepts in a concrete fashion.
The service scenarios described in the embodiments of the present application are intended to illustrate the technical solutions of the embodiments more clearly and do not limit the technical solutions provided herein; as a person skilled in the art will appreciate, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems as new service scenarios emerge.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of the following items" or similar expressions refer to any combination of these items, including any combination of single or plural items. For example, "at least one of a, b, or c" may represent: a; b; c; a and b; a and c; b and c; or a, b, and c; where a, b, and c may each be singular or plural.
The steps involved in the method for identifying the left and right earphones provided by the embodiments of the present application are merely examples; not every step is mandatory, nor is all of the content of each step necessary, and steps or content may be added or removed as needed during use.
Identical steps, or technical features with the same functions, in the embodiments of the present application may be cross-referenced between different embodiments.
As mentioned above, more and more earphone manufacturers are beginning to adopt a completely identical design for the left and right earphones. In this way, each earphone is compatible with either ear, and when using the earphones the user does not need to distinguish which earphone is the left one and which is the right one.
For such earphones, although the user does not need to distinguish left from right when putting them on, it is still necessary to identify which earphone is worn on the left ear and which on the right, so that the terminal communicating with the earphones can distribute the left-channel audio to the earphone worn on the left ear and the right-channel audio to the earphone worn on the right ear, ensuring that the user obtains a good stereo effect.
In one possible implementation, a pressure sensor may be mounted on each of the two earphones. After the user puts the earphones on, the wearing position can be detected by the pressure sensors, and the detection result determines whether each earphone matches its target ear. If not, the earphones can automatically swap the left and right channels. However, because pressure sensors have low sensing sensitivity in some form factors, this solution is prone to failed detection or low detection accuracy.
In another possible implementation, a fingerprint sensor may be mounted on the headset, and whether an earphone is worn on the left or the right ear is determined from the fingerprint detected when the user puts it on. Although this solution can improve detection accuracy, it requires at least two additional fingerprint sensors mounted on the headset, which is costly. Moreover, fingerprint sensors are typically bulky and require a flat sensing surface, so they cannot be installed on cylindrical or sphere-like earphones at all. In addition, if a user puts on both earphones with only one hand, the fingerprint sensors cannot correctly identify which earphone is worn on the left ear and which on the right.
To solve the above problems, an embodiment of the present application provides a method for identifying left and right earphones: an acceleration sensor is arranged in each of the left and right earphones, and the data collected by the two acceleration sensors are gathered and compared using the included angle formed between the axes of the two earphones after the user wears them, so that the side on which the user wears each earphone is finally identified.
Specifically, fig. 2 is a schematic diagram of a user wearing earphones according to an embodiment of the present application. Parts (a), (b), and (c) of fig. 2 are, respectively, a front view, a side view, and a top view of the user wearing the earphones. In each view, the user wears earphone L in the left ear and earphone R in the right ear. An included angle exists between the axis of the left earphone L and the axis of the right earphone R, and its value lies between 0 and 180 degrees (excluding 0 and 180 degrees). The axis of the left earphone L is approximately parallel to the axis of the left ear canal, the axis of the right earphone R is approximately parallel to the axis of the right ear canal, and the direction of the axis of the left earphone L may be taken as pointing from outside the ear canal into it.
Based on the above feature, the direction of the gravitational acceleration can be sensed with the acceleration sensors disposed in the left earphone L and the right earphone R. Then, by comparing the data collected by the two acceleration sensors, the right-hand rule can be applied to identify on which side the user wears each earphone.
Fig. 3 is a schematic diagram illustrating the relationship between the coordinate systems of the acceleration sensors of the left and right earphones and the direction of gravitational acceleration according to an embodiment of the present application. Assume that the first acceleration sensor is the acceleration sensor on one earphone and the second acceleration sensor is the acceleration sensor on the other. The coordinate system corresponding to the first acceleration sensor is (XYZ), whose Z axis points along the axis of the earphone on that side (the first side); the coordinate system corresponding to the second acceleration sensor is (X'Y'Z'), whose Z' axis points along the axis of the earphone on the other side (the second side). Fig. 3 can be regarded as showing the vectors of these two different coordinate systems unified into a single coordinate system: taking the coordinate system (XYZ) of the first acceleration sensor as the base coordinate system, it shows the vectors of the coordinate system (X'Y'Z') of the second acceleration sensor expressed in that base coordinate system. G in fig. 3 denotes the gravitational acceleration.
From the data collected by the two acceleration sensors, the projection vector of the axis of the second-side earphone in the base coordinate system can be calculated. Taking the cross product of the unit vector along the axis of the first-side earphone with this projection vector, and then taking the dot product of the result (the vector product, or outer product) with the unit vector in the direction of gravitational acceleration, the correct wearing position of the earphones can be identified from the sign of the dot product (the scalar product, or inner product) by the right-hand rule.
For example, in the case above, if the dot product is greater than 0, the thumb of the right hand points along the direction of gravitational acceleration and the other four fingers curl from the user's right side toward the left side; the first-side earphone can thus be considered to be worn on the left ear and the second-side earphone on the right ear. If the dot product is less than 0, the thumb points opposite to the direction of gravitational acceleration and the four fingers curl from the left side toward the right side; the first-side earphone can thus be considered to be worn on the right ear and the second-side earphone on the left ear.
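The sign rule above can also be checked numerically. The following sketch (invented vectors, assuming numpy) shows that swapping which earphone is treated as the first one negates the vector product and therefore flips the left/right decision:

```python
import numpy as np

# Made-up projected axis vectors in a shared base frame, and a gravity
# vector; none of these values come from the patent itself.
axis_a = np.array([0.2, 0.1, 0.97])
axis_b = np.array([-0.3, 0.1, 0.95])
g = np.array([0.0, -9.8, 0.0])

dot_ab = float(np.dot(np.cross(axis_a, axis_b), g))
dot_ba = float(np.dot(np.cross(axis_b, axis_a), g))

# Anticommutativity of the cross product guarantees opposite signs,
# matching the decision flipping when the earphone roles are swapped.
assert dot_ab == -dot_ba
```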
In this embodiment of the present application, the earphone may be a wired earphone or a wireless earphone, which is not limited in this embodiment of the present application.
Fig. 4 is a schematic diagram of a device layout of an earphone according to an embodiment of the present application. It should be noted that, because a design scheme that the left and right earphones are completely consistent is adopted, the device layout shown in fig. 4 may be the device layout of one earphone in the pair of earphones, and the other earphone may adopt the same device layout as that in fig. 4.
The headset shown in fig. 4 comprises a processor 401, a battery 402, an ear plug 403, a main board 404 and an acceleration sensor 405. Among them, the acceleration sensor 405 may be a three-axis acceleration sensor. As shown in fig. 4, one of the axes of acceleration sensor 405 is parallel to the axis of the ear plug, thereby indicating the orientation of the headset. The axis of the earplug can also be considered as the central axis of the earpiece (axis SS' shown in fig. 4).
It should be noted that when the earphone is worn on the ear by the user, the earplug 403 is inserted into the ear canal from outside the ear canal, and the axis of the earplug 403 can be approximately regarded as being parallel to the axis of the ear canal.
As shown in fig. 4, the acceleration sensor 405 may be placed on the same motherboard as the processor 401 or on a motherboard where there is an electrical connection. Motherboard 404 may be placed at the back end of battery 402.
In addition to the components shown in fig. 4, the headset provided by the embodiment of the present application may also include other components. For example, fig. 5 shows a hardware structure diagram of a headset according to an embodiment of the present application. The headset may include a processor 501, a memory 502, a wireless communication module 503, an audio module 504, a sensor module 505, and a power module 506. Wherein, the processor 501 may be the same processor as the processor 401 in fig. 4; the audio module 504 may include a microphone 5041, a speaker 5042, an audio processor 5043, and the like; the sensor module 505 may include an acceleration sensor 5051, which acceleration sensor 5051 may be the same sensor as the acceleration sensor 405 shown in fig. 4, as well as other types of sensors, such as a first proximity sensor 5052, a second proximity sensor 5053, etc.; power module 506 may include a battery 5061 (battery 402 in fig. 4), a power management module 5062, and a charging interface 5063, among other things.
It is to be understood that the illustrated structure of the embodiments of the present application does not constitute a specific limitation to the earphone. In some embodiments of the present application, the headset may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
In the embodiment of the present application, in order to ensure the accuracy of identification, a sensing axis of the acceleration sensor disposed in the earphone should be parallel to the central axis of the earphone. Taking the example that the Z axis of the acceleration sensor is parallel to the central axis of the earphone, when the Z axis is not parallel to the central axis of the earphone, the earphone needs to be calibrated. Typically, calibration of the headset is done before the headset is shipped.
In the embodiment of the present application, the sensing axis (Z axis) of the acceleration sensor may fail to be parallel to the central axis of the earphone for two reasons: the structural design of the earphone and/or assembly tolerances. The two causes and the corresponding calibration methods are described below.
(I) The structural design leads to inclined placement of the acceleration sensor
The acceleration sensor cannot be placed horizontally due to limited space of the headset. When designing the earphone structure, it may be necessary to place the acceleration sensor at an angle, which results in the sensing axis of the acceleration sensor not being parallel to the central axis of the earphone.
For the inclination of the acceleration sensor caused by the structural design, the attitude angle of the acceleration sensor in space is included in the design data. Generally, the pose of an object in space can be described using Euler angles. The three Euler angles of the acceleration sensor can be denoted φ, θ and ψ, representing the angles of rotation of the acceleration sensor about the Z-axis, the Y-axis and the X-axis, respectively. The specific magnitudes of these three Euler angles are given by the design data. According to the relationship between the Euler angles and the rotation matrix, the non-parallelism between the sensing axis (Z axis) of the acceleration sensor and the central axis of the earphone caused by the structural design can be compensated by calculation.
A rotation matrix is a matrix that when multiplied by a vector has the effect of changing the direction of the vector but not the size and preserves the chirality. In general, the dot product (inner product) of two vectors remains unchanged after they are both operated on by a rotation matrix.
The Euler angles have the following relationship to the rotation matrix:

$$R_1 = R_z(\varphi)\,R_y(\theta)\,R_x(\psi) = \begin{pmatrix} \cos\varphi\cos\theta & \cos\varphi\sin\theta\sin\psi-\sin\varphi\cos\psi & \cos\varphi\sin\theta\cos\psi+\sin\varphi\sin\psi \\ \sin\varphi\cos\theta & \sin\varphi\sin\theta\sin\psi+\cos\varphi\cos\psi & \sin\varphi\sin\theta\cos\psi-\cos\varphi\sin\psi \\ -\sin\theta & \cos\theta\sin\psi & \cos\theta\cos\psi \end{pmatrix}$$
According to the above relationship, the rotation matrix $R_1$ can be calculated. Then, using the rotation matrix $R_1$, the sensor readings may be calibrated, namely:

$$p_{\text{cal}} = R_1\,p_{\text{raw}}$$

where $p_{\text{raw}}$ represents the raw data of the acceleration sensor and $p_{\text{cal}}$ represents the calibrated output acceleration sensor reading.
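The Euler-angle compensation can be sketched as follows. This is an illustrative sketch under stated assumptions: the Z-Y-X composition order is one common convention (the patent's exact convention would come from the design data), the angle values are hypothetical, and applying the transpose to undo the tilt relies on rotation matrices being orthogonal.

```python
import numpy as np

def rotation_from_euler(phi, theta, psi):
    """Rotation matrix for rotations about Z (phi), Y (theta), X (psi),
    composed in Z-Y-X order (an assumed convention for this sketch)."""
    cz, sz = np.cos(phi), np.sin(phi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(psi), np.sin(psi)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

# Hypothetical design-data tilt angles, in radians.
R1 = rotation_from_euler(0.05, -0.03, 0.10)

raw = np.array([0.1, -0.2, 9.8])   # a raw accelerometer reading
# Undo the design tilt: a rotation matrix is orthogonal, so its
# inverse is its transpose.
calibrated = R1.T @ raw
```

The orthogonality of `R1` (inverse equals transpose, determinant one) is what makes the chirality-preserving property described in the text hold.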
(II) The assembly tolerance causes the acceleration sensor to incline
The assembly tolerance refers to the deviation introduced into the acceleration sensor during assembly of the headset. Due to assembly tolerances, the acceleration sensor may be tilted to varying degrees. The tilt caused by this situation also needs to be calibrated.
Fig. 6 is a schematic diagram of a calibration process of an acceleration sensor according to an embodiment of the present application. Specifically, fig. 6 shows a procedure for calibration against an acceleration sensor tilt caused by an assembly tolerance.
During calibration, the earphone is first placed horizontally, ensuring that the Z axis of the acceleration sensor is parallel to the central axis of the earphone. At this time, the X-axis and Y-axis directions of the acceleration sensor may be as shown in (a) of fig. 6. The acceleration sensor reading $p_1 = (x_1, y_1, z_1)^T$ is taken. Then, the earphone is rotated clockwise by 90 degrees about its central axis, and the acceleration sensor reading $p_2 = (x_2, y_2, z_2)^T$ is taken. At this time, the X-axis and Y-axis directions of the acceleration sensor may be as shown in (b) of fig. 6. Then, the earphone is placed with the ear plug pointing vertically downward, and the acceleration sensor reading $p_3 = (x_3, y_3, z_3)^T$ is taken. At this time, the X-axis and Y-axis directions of the acceleration sensor may be as shown in (c) of fig. 6.
Based on the three acceleration sensor readings, singular value decomposition (SVD) may be used to calculate a rotation matrix $R_2$ that maps the measured readings $p_1$, $p_2$, $p_3$ to the readings an ideally mounted sensor would produce in the three placements. The rotation matrix $R_2$ can then be used to calibrate the sensor readings, namely:

$$p_{\text{cal}} = R_2\,p_{\text{raw}}$$

where $p_{\text{raw}}$ represents the raw data of the acceleration sensor and $p_{\text{cal}}$ represents the calibrated output acceleration sensor reading.
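The SVD-based calibration step can be sketched as an orthogonal Procrustes problem: find the rotation that best maps the three measured readings to the three ideal readings. The ideal readings below (gravity along one sensor axis per placement) and the simulated tilt are assumptions for illustration; the patent does not specify these axis conventions.

```python
import numpy as np

G = 9.81

def small_tilt(deg_z=2.0, deg_x=1.5):
    """Simulate a small assembly tilt (rotations about Z then X)."""
    az, ax = np.radians(deg_z), np.radians(deg_x)
    Rz = np.array([[np.cos(az), -np.sin(az), 0.0],
                   [np.sin(az),  np.cos(az), 0.0],
                   [0.0, 0.0, 1.0]])
    Rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, np.cos(ax), -np.sin(ax)],
                   [0.0, np.sin(ax),  np.cos(ax)]])
    return Rz @ Rx

tilt = small_tilt()
# Columns: ideal readings a perfectly mounted sensor would report in the
# three placements described in the text (assumed axis conventions).
Q = np.column_stack([(0.0, -G, 0.0), (-G, 0.0, 0.0), (0.0, 0.0, -G)])
P = tilt @ Q                      # what the tilted sensor actually measures

# Solve R2 @ P ≈ Q via SVD (Kabsch / orthogonal Procrustes), with a
# determinant correction so R2 is a proper rotation, not a reflection.
U, _, Vt = np.linalg.svd(Q @ P.T)
d = np.sign(np.linalg.det(U @ Vt))
R2 = U @ np.diag([1.0, 1.0, d]) @ Vt

calibrated = R2 @ P               # should recover the ideal readings
```

Because the three placements give three linearly independent gravity vectors, the recovered `R2` is unique and exactly undoes the simulated tilt here.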
The left and right earphones can be identified using the calibrated acceleration sensor readings.
The following embodiments take the headset that has the aforementioned hardware structure/software structure and is calibrated according to the aforementioned calibration process as an example, and describe the method for identifying the left and right headsets provided in the embodiments of the present application.
Referring to fig. 7, a schematic step diagram of a method for identifying left and right earphones according to an embodiment of the present application is shown, where the method may specifically include the following steps:
s701, acquiring sensor data acquired by acceleration sensors in two earphones, wherein the sensor data comprises first sensor data acquired by a first acceleration sensor in a first earphone and second sensor data acquired by a second acceleration sensor in a second earphone.
In the embodiment of the present application, the acquisition of the sensor data collected by the acceleration sensors in the two earphones may be performed after the user wears the two earphones on the ears. When a user wears the earphones, the acceleration sensors in the two earphones will produce data simultaneously. Along with the change of the head posture of the user, the data collected by the acceleration sensor will also change.
Therefore, the first sensor data includes a plurality of sets of first data respectively collected by the first acceleration sensor when the head of the user is in a plurality of different postures; accordingly, the second sensor data comprises a plurality of sets of second data respectively acquired by the second acceleration sensor when the head of the user is in a plurality of different postures; when the head of the user is in any posture, the first data acquired by the first acceleration sensor and the second data acquired by the second acceleration sensor have a corresponding relation.
For example, the first sensor data may be represented in aggregate as $P_1 = \left[p_1^{(1)}, p_1^{(2)}, \dots, p_1^{(n)}\right]$, and the second sensor data may be represented in aggregate as $P_2 = \left[p_2^{(1)}, p_2^{(2)}, \dots, p_2^{(n)}\right]$. The corresponding relationship between the first sensor data and the second sensor data is that $p_1^{(i)}$ and $p_2^{(i)}$ are collected simultaneously when the head of the user is in the $i$-th posture.
s702, calculating a rotation matrix according to the first sensor data and the second sensor data.
In the embodiment of the present application, assuming that the motion of the head of the user is rigid motion, it can be known from the knowledge of spatial rotation that:
$$R \times P_1 = P_2$$

wherein R is the rotation matrix.

$R = VU^T$ can be calculated using singular value decomposition (SVD), where $V$ and $U$ are matrices obtained in the decomposition process.
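Step S702 can be sketched with synthetic data: a known relative rotation (hypothetical, used only to generate consistent paired readings) is recovered from the paired samples via the SVD-based $R = VU^T$ formula, with the usual determinant correction to exclude reflections.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth relative rotation between the two sensor
# frames, used only to synthesize consistent paired data for this sketch.
theta = np.radians(25.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])

# Columns of P1: first-sensor readings at several head poses; P2 holds
# the simultaneous second-sensor readings (rigid head motion assumed).
P1 = rng.normal(size=(3, 8))
P2 = R_true @ P1

# R x P1 = P2  =>  SVD of P1 @ P2.T gives U, V with R = V @ U.T
# (Kabsch), corrected by the sign of the determinant.
U, _, Vt = np.linalg.svd(P1 @ P2.T)
V = Vt.T
d = np.sign(np.linalg.det(V @ U.T))
R = V @ np.diag([1.0, 1.0, d]) @ U.T
```

With exact (noise-free) paired data the recovered `R` matches the generating rotation; with real, noisy sensor data the same formula gives the least-squares best-fit rotation.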
And S703, unifying a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system based on the rotation matrix.
In the embodiment of the present application, the first coordinate system may refer to the coordinate system $(X_1 Y_1 Z_1)$ constructed from the three sensing axes of the first acceleration sensor in the first earphone, and the second coordinate system is the coordinate system $(X_2 Y_2 Z_2)$ constructed from the three sensing axes of the second acceleration sensor in the second earphone.
Fig. 8 is a schematic diagram illustrating the angular difference between the first coordinate system and the second coordinate system when sensing the same gravitational acceleration according to an embodiment of the present disclosure. In fig. 8, taking the first target sensing axis of the first acceleration sensor as the $Z_1$ axis and the second target sensing axis of the second acceleration sensor as the $Z_2$ axis as an example, the angular difference between the two when sensing the same gravitational acceleration is the angle $a$ between $Z_1$ and $Z_2$ shown in fig. 8.
When the left and right earphones are identified according to the angle difference, the first coordinate system and the second coordinate system need to be unified.
Specifically, the first coordinate system and the second coordinate system may be unified using the rotation matrix R calculated in S702. That is, the vectors in different coordinate systems are unified into one base coordinate system.
S704, determining a first projection vector of a first target sensing axis of the first acceleration sensor in the base coordinate system, and a second projection vector of a second target sensing axis of the second acceleration sensor in the base coordinate system.
In an embodiment of the present application, a first vector of a first target sensing axis in a first coordinate system may be determined according to the first sensor data, and a second vector of a second target sensing axis in a second coordinate system may be determined according to the second sensor data. Then, based on the rotation matrix calculated in the previous step, a first projection vector of the first vector in the base coordinate system and a second projection vector of the second vector in the base coordinate system are calculated.
In one possible implementation of the embodiment of the present application, the first coordinate system may be used as a base coordinate system, and then the rotation matrix is used to unify the first coordinate system and the second coordinate system. Thus, the first target sensing axis of the first acceleration sensor is in the same direction as one axis in the base coordinate system, and the first projection vector of the first target sensing axis in the base coordinate system may be located on the one axis of the base coordinate system.
Fig. 9 is a schematic diagram of a base coordinate system according to an embodiment of the present application. The base coordinate system shown in fig. 9 is the first coordinate system. Thus, the first projection vector (denoted $\vec{v}_1$) lies on the Z-axis of the base coordinate system. The second projection vector (denoted $\vec{v}_2$) represents the projection of the second target sensing axis of the second acceleration sensor in the base coordinate system. M is the plane formed by the first projection vector $\vec{v}_1$ and the second projection vector $\vec{v}_2$.
S705, calculating a vector product between the first projection vector and the second projection vector.
In the embodiment of the present application, calculating the vector product between the first projection vector and the second projection vector requires first determining the coordinate values of the two projection vectors in the base coordinate system.

Illustratively, the coordinate value of the first projection vector $\vec{v}_1$ in the base coordinate system may be expressed as $(v_{1x}, v_{1y}, v_{1z})^T$, and the coordinate value of the second projection vector $\vec{v}_2$ in the base coordinate system may be expressed as $(v_{2x}, v_{2y}, v_{2z})^T$.

In a possible implementation manner of the embodiment of the present application, the first coordinate system is the base coordinate system, so the first projection vector $\vec{v}_1$ can be replaced by the unit vector $\vec{e}_z$ on the target axis of the base coordinate system, whose coordinate value is $(0, 0, 1)^T$. Therefore, calculating the vector product between the first projection vector and the second projection vector can be simplified to calculating the vector product between the unit vector on the target axis of the base coordinate system and the second projection vector:

$$\vec{n} = \vec{e}_z \times \vec{v}_2$$

wherein $\vec{n}$ is the vector product.
S706, identifying the wearing positions of the first earphone and the second earphone according to the vector product.
In the embodiment of the present application, the wearing positions of the first earphone and the second earphone can be identified according to the vector product $\vec{n}$ obtained above.

Specifically, after the vector product $\vec{n}$ is obtained by calculation, the number product (scalar product) between $\vec{n}$ and the vector $\vec{g}$ corresponding to the gravitational acceleration can be calculated:

$$re = \vec{n} \cdot \vec{g}$$
then, the wearing positions of the first earphone and the second earphone are determined according to the number product.
Specifically, if the above number product is greater than zero, it may be determined that the first earphone is worn on the left ear and the second earphone is worn on the right ear; if the above number product is less than zero, it can be determined that the first earphone is worn on the right ear and the second earphone is worn on the left ear.
After the wearing positions of the first earphone and the second earphone are identified, the audio of the left and right channels can be distributed to the earphones at the corresponding positions. For example, the audio of the left channel is distributed to the earphone worn on the left ear, and the audio of the right channel is distributed to the earphone worn on the right ear. In this way, the user obtains a better stereo effect when using the earphones.
With the method for identifying the left and right earphones provided by the embodiment of the application, the left and right sides of the earphones can be correctly detected in most scenarios, such as sitting, standing and walking. A few rare scenarios in actual use may not be detectable: for example, when the user wears the earphones while lying down or standing upside down, the orientation may be unidentifiable or identified incorrectly. The embodiment of the present application may avoid this by adopting the following methods:
1. The wearing detection function can be combined with identification: after the user puts on the earphones in an upright state and the earphones automatically identify the left and right ears, the left/right state of the earphones is locked until the earphones are taken off. For example, it may be detected whether the first earphone and the second earphone have been removed from the ear canals of the user; as long as the first earphone and/or the second earphone has not been taken out of the user's ear canal, the identified wearing-position information of the first earphone and the second earphone remains locked. Thus, the user can still obtain the correct left and right channels in other scenarios such as lying down or standing upside down.
2. Because the head of the user can hardly remain motionless for a long time, if the left and right ears cannot be judged immediately after the user wears the earphones, the earphones can wait for the user's head to move into a detectable region, make the judgment, and then switch to the correct channels to play stereo. For example, after the user wears the headset, if the headset cannot correctly identify the wearing position within a certain time, it may wait for a period of time and identify again after the acceleration sensors acquire new data; alternatively, if the wearing positions of the earphones cannot be correctly identified, the earphones may re-identify at regular time intervals until the wearing positions of the two earphones are determined.
3. When the user is in an inverted state, the direction of gravitational acceleration is opposite to the user's actual up direction, so the result of recognition in the above manner is opposite to the result for a user in the upright state. Therefore, when the user is in an inverted state, if the above number product is greater than zero, it can be determined that the first earphone is worn on the right ear and the second earphone is worn on the left ear; if the number product is less than zero, it can be determined that the first earphone is worn on the left ear and the second earphone is worn on the right ear. That is, for the same sign of the number product, the recognition result in the inverted state is the opposite of that in the upright state.
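The number-product decision, including the sign flip for the inverted case, can be consolidated into one small function. This is an illustrative sketch: the function name, the `inverted` flag and the sample vectors are hypothetical, and the zero case is treated as "undecidable, retry later" in line with the retry strategy above.

```python
import numpy as np

def identify_sides(cross_product, gravity, inverted=False):
    """Map the sign of (v1 x v2) . g to wearing positions.

    cross_product: vector product of the two projected axis vectors.
    gravity: gravity vector in the same base frame.
    inverted: set True when the user is known to be upside down,
    which reverses the mapping as described in the text.
    Returns (first_earphone_side, second_earphone_side) or None.
    """
    re = np.dot(cross_product, gravity)
    if np.isclose(re, 0.0):
        return None                   # undecidable; retry later
    positive = re > 0
    if inverted:
        positive = not positive
    if positive:
        return ("left", "right")
    return ("right", "left")

print(identify_sides(np.array([-0.9, 0.2, 0.0]),
                     np.array([0.5, -9.5, 2.0])))         # upright user
print(identify_sides(np.array([-0.9, 0.2, 0.0]),
                     np.array([0.5, -9.5, 2.0]), True))   # inverted user
```

Keeping the inverted-state correction in one place avoids duplicating the sign convention across the upright and handstand code paths.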
In one possible implementation of the embodiment of the present application, the inverted state may be determined with the help of another terminal device, such as a mobile phone in communication with the headset. Whether the user is in an inverted state can be judged by locking the screen rotation and checking whether the image on the screen appears upright to the user. When the mobile phone detects that the user is in the inverted state, it can send the detection result to the earphones. Locking the screen rotation is required to ensure that the on-screen image is not automatically re-oriented when the phone is turned upside down in landscape mode.
4. With the method for identifying the left and right earphones provided by the embodiment of the application, the left and right ears cannot be judged when only one earphone is worn. However, this does not affect the practical use of the earphones, since a stereo experience is in any case unavailable with single-ear wearing.
In the following, the method for recognizing the left and right earphones provided by the embodiment of the present application is described with reference to several specific scenarios.
In some scenarios described below, a spatial coordinate system (head coordinate system) is established with the direction of the nose of the head of the user (the user's sight line direction) as the positive X-axis direction and the direction of the vertex of the head as the positive Z-axis direction.
Scene one: the user is in an upright position
Fig. 10 is a schematic diagram of the recognition process of the left and right earphones when the user is in an upright posture. In (a) of fig. 10, the user wears earphones E11 and E12 in an upright posture, and acceleration sensors are arranged in the earphones E11 and E12, respectively. Fig. 10 (b) shows the coordinate systems corresponding to the two acceleration sensors in E11 and E12, where coordinate system C11 is the coordinate system corresponding to the acceleration sensor in earphone E11, coordinate system C12 is the coordinate system corresponding to the acceleration sensor in earphone E12, and 1000 is the simulated earth.
In the upright posture, according to the method for identifying the left and right earphones provided by the embodiment of the application, the coordinate system C11 and the coordinate system C12 are unified into the same base coordinate system. As shown in (c) of fig. 10, the vectors $\vec{v}_{11}$ and $\vec{v}_{12}$ constitute a reference plane, where $\vec{v}_{11}$ is the projection vector of the target sensing axis of the acceleration sensor in earphone E11 in the base coordinate system, and $\vec{v}_{12}$ is the projection vector of the target sensing axis of the acceleration sensor in earphone E12 in the base coordinate system.

The quantity $re = (\vec{v}_{11} \times \vec{v}_{12}) \cdot \vec{g}$ is computed to distinguish the left and right earphones. Here $re < 0$ represents that the earphone E11 corresponding to $\vec{v}_{11}$ is worn on the right ear, and the earphone E12 corresponding to $\vec{v}_{12}$ is worn on the left ear.
Scene two: the user is in a tilted posture
Fig. 11 is a schematic diagram of the recognition process of the left and right earphones when the user is in an inclined posture. According to the orientation of the head of the user, the inclined posture generally includes an upward-inclined posture, a horizontal posture and a downward-inclined posture. (a), (b) and (c) in fig. 11 respectively show the user wearing the earphones E21 and E22 in the upward-inclined posture, the horizontal posture and the downward-inclined posture, and acceleration sensors are arranged in the earphones E21 and E22, respectively.
Take as an example the horizontal posture shown in (b) of fig. 11 (the positive X-axis direction of the head coordinate system is directed upward). Fig. 11 (d) shows the coordinate systems corresponding to the two acceleration sensors in the earphones E21 and E22, where the coordinate system C21 is the coordinate system corresponding to the acceleration sensor in the earphone E21, the coordinate system C22 is the coordinate system corresponding to the acceleration sensor in the earphone E22, and 1100 is the simulated earth.
In the horizontal posture, according to the method for identifying the left and right earphones provided by the embodiment of the application, the coordinate system C21 and the coordinate system C22 are unified into the same base coordinate system. As shown in (e) of fig. 11, the vectors $\vec{v}_{21}$ and $\vec{v}_{22}$ constitute a reference plane, where $\vec{v}_{21}$ is the projection vector of the target sensing axis of the acceleration sensor in earphone E21 in the base coordinate system, and $\vec{v}_{22}$ is the projection vector of the target sensing axis of the acceleration sensor in earphone E22 in the base coordinate system.

The quantity $re = (\vec{v}_{21} \times \vec{v}_{22}) \cdot \vec{g}$ is computed to distinguish the left and right earphones.
According to practical results and theoretical analysis, when the head inclines, the relationship between the reference plane and the direction of gravity is strongly related to the way the earphones are worn and to the inclination angle of the head, and the left/right recognition conditions differ under different combinations. Therefore, there is a risk of mis-recognition in the inclined posture. The user can first adjust to the upright posture; after recognition, the earphones lock the recognition result until the user takes the earphones out.
Scene three: user is in handstand posture
Fig. 12 is a schematic diagram of the recognition process of the left and right earphones when the user is in an inverted posture. In fig. 12 (a), the user wears earphones E31 and E32 in an inverted posture, and acceleration sensors are arranged in the earphones E31 and E32, respectively. Fig. 12 (b) shows the coordinate systems corresponding to the two acceleration sensors in the earphones E31 and E32, where the coordinate system C31 is the coordinate system corresponding to the acceleration sensor in the earphone E31, the coordinate system C32 is the coordinate system corresponding to the acceleration sensor in the earphone E32, and 1200 is the simulated earth.
In the inverted posture, according to the method for identifying the left and right earphones provided by the embodiment of the application, the coordinate system C31 and the coordinate system C32 are unified into the same base coordinate system. As shown in (c) of fig. 12, the vectors $\vec{v}_{31}$ and $\vec{v}_{32}$ constitute a reference plane, where $\vec{v}_{31}$ is the projection vector of the target sensing axis of the acceleration sensor in earphone E31 in the base coordinate system, and $\vec{v}_{32}$ is the projection vector of the target sensing axis of the acceleration sensor in earphone E32 in the base coordinate system.

The quantity $re = (\vec{v}_{31} \times \vec{v}_{32}) \cdot \vec{g}$ is computed to distinguish the left and right earphones. According to the analysis of the preceding scenarios, when the user is in the inverted posture, the recognition results of the left and right earphones are opposite to those in the upright posture. That is, $re > 0$ indicates that the earphone E31 corresponding to $\vec{v}_{31}$ is worn on the right ear, and the earphone E32 corresponding to $\vec{v}_{32}$ is worn on the left ear.
In the embodiment of the present application, the functional modules of the headset may be divided according to the above method example, for example, each functional module may be divided corresponding to each function, or one or more functions may be integrated into one functional module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation. The following description will be given taking as an example that each function module is divided for each function.
Corresponding to the above embodiments, referring to fig. 13, a block diagram of a left-right earphone recognition device provided in the embodiments of the present application is shown, and the device may be applied to earphones in the foregoing embodiments, where the earphones include a first earphone and a second earphone. The first earphone is provided with a first acceleration sensor, and a first target sensing axis of the first acceleration sensor is parallel to a central axis of the first earphone; the second earphone is provided with a second acceleration sensor, a second target sensing axis of the second acceleration sensor is parallel to a central axis of the second earphone, and the device specifically comprises the following modules: a sensor data acquisition module 1301, a coordinate system conversion module 1302, a projection vector determination module 1303, a vector product calculation module 1304, and a wearing position identification module 1305, wherein:
the sensor data acquisition module is used for acquiring first sensor data acquired by the first acceleration sensor and second sensor data acquired by the second acceleration sensor;
the coordinate system conversion module is used for unifying a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system according to the first sensor data and the second sensor data;
the projection vector determination module is configured to determine a first projection vector of the first target sensing axis in the base coordinate system and a second projection vector of the second target sensing axis in the base coordinate system;
the vector product calculation module is used for calculating a vector product between the first projection vector and the second projection vector;
and the wearing position identification module is used for identifying the wearing positions of the first earphone and the second earphone according to the vector product.
In an embodiment of the present application, the first sensor data includes multiple sets of first data respectively acquired by the first acceleration sensor when the head of the user is in multiple different postures, and the second sensor data includes multiple sets of second data respectively acquired by the second acceleration sensor when the head of the user is in multiple different postures; when the head of the user is in any posture, the first data acquired by the first acceleration sensor and the second data acquired by the second acceleration sensor have a corresponding relation.
In an embodiment of the present application, the coordinate system conversion module is specifically configured to: calculating a rotation matrix from the first sensor data and the second sensor data; and unifying a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system based on the rotation matrix.
In an embodiment of the present application, the projection vector determination module is specifically configured to: determine, according to the first sensor data, a first vector of the first target sensing axis in the first coordinate system, and determine, according to the second sensor data, a second vector of the second target sensing axis in the second coordinate system; and calculate, based on the rotation matrix, a first projection vector of the first vector in the base coordinate system and a second projection vector of the second vector in the base coordinate system.
In an embodiment of the present application, the base coordinate system may be the first coordinate system, and the vector product calculation module is specifically configured to: calculate a vector product of a unit vector on a target axis of the base coordinate system and the second projection vector.
In an embodiment of the present application, the wearing position identification module is specifically configured to: calculate a scalar product of the vector product and a vector corresponding to the gravitational acceleration; if the scalar product is greater than zero, determine that the first earphone is worn on the left ear and the second earphone is worn on the right ear; and if the scalar product is less than zero, determine that the first earphone is worn on the right ear and the second earphone is worn on the left ear.
In an embodiment of the present application, the wearing position identification module is further configured to: when the user is in an inverted state, if the scalar product is greater than zero, determine that the first earphone is worn on the right ear and the second earphone is worn on the left ear; and if the scalar product is less than zero, determine that the first earphone is worn on the left ear and the second earphone is worn on the right ear.
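The sign rule of these two embodiments, including the flip for an inverted user, reduces to a small pure function. A minimal sketch, where the `inverted` flag is assumed to come from a separate posture detector that the patent does not detail:

```python
def side_from_sign(number_product, inverted=False):
    """Map the sign of the scalar product to (first_side, second_side).

    `inverted` is a hypothetical input from a separate posture detector;
    it simply flips the decision.
    """
    if number_product == 0:
        return None                      # degenerate: no decision possible
    first_is_left = number_product > 0
    if inverted:                         # upside-down user: rule flips
        first_is_left = not first_is_left
    return ("left", "right") if first_is_left else ("right", "left")
```

Keeping the decision in one place like this makes the inverted-state exception a single boolean toggle rather than a second copy of the comparison logic.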
In an embodiment of the present application, the apparatus further includes a detection module and a locking module, wherein:
the detection module is configured to detect whether the first earphone and the second earphone have been taken out of the user's ear canal;
the locking module is configured to lock the identified wearing-position information of the first earphone and the second earphone if the first earphone and/or the second earphone has not been taken out of the user's ear canal.
It should be noted that, for the details of each step in the above method embodiments, reference may be made to the functional description of the corresponding functional module; they are not repeated here.
An embodiment of the present application further provides an earphone, which may be the earphone in the foregoing embodiments. The earphone includes a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the method for identifying left and right earphones in the foregoing embodiments is implemented.
An embodiment of the present application further provides a computer storage medium storing computer instructions; when the computer instructions run on an earphone, the earphone executes the above related method steps to implement the method for identifying left and right earphones in the foregoing embodiments.
An embodiment of the present application further provides a computer program product; when the computer program product runs on an earphone, the earphone is caused to perform the above related steps to implement the method for identifying left and right earphones in the foregoing embodiments.
An embodiment of the present application further provides a chip. The chip includes a processor, which may be a general-purpose processor or a special-purpose processor. The processor is configured to support the earphone in performing the above related steps to implement the method for identifying left and right earphones in the foregoing embodiments.
Optionally, the chip further includes a transceiver. The transceiver operates under the control of the processor and is configured to support the earphone in performing the above related steps to implement the method for identifying left and right earphones in the foregoing embodiments.
Optionally, the chip may further include a storage medium.
It should be noted that the chip may be implemented by using the following circuits or devices: one or more Field Programmable Gate Arrays (FPGAs), Programmable Logic Devices (PLDs), controllers, state machines, gate logic, discrete hardware components, any other suitable circuitry, or any combination of circuitry capable of performing the various functions described throughout this application.
Finally, it should be noted that the above descriptions are merely embodiments of the present application, and the protection scope of the present application is not limited thereto; any change or substitution within the technical scope disclosed in the present application shall fall within the protection scope of the present application.

Claims (10)

1. A method for identifying left and right earphones, wherein the earphones comprise a first earphone and a second earphone, the first earphone has a first acceleration sensor, a first target sensing axis of the first acceleration sensor is parallel to a central axis of the first earphone, the second earphone has a second acceleration sensor, and a second target sensing axis of the second acceleration sensor is parallel to a central axis of the second earphone, the method comprising: acquiring first sensor data collected by the first acceleration sensor and second sensor data collected by the second acceleration sensor; unifying, according to the first sensor data and the second sensor data, a first coordinate system corresponding to the first acceleration sensor and a second coordinate system corresponding to the second acceleration sensor into a base coordinate system; determining a first projection vector of the first target sensing axis in the base coordinate system and a second projection vector of the second target sensing axis in the base coordinate system; calculating a vector product of the first projection vector and the second projection vector; and identifying wearing positions of the first earphone and the second earphone according to the vector product.
2. The method according to claim 1, wherein the first sensor data comprises multiple sets of first data respectively collected by the first acceleration sensor while the user's head is in multiple different postures, and the second sensor data comprises multiple sets of second data respectively collected by the second acceleration sensor while the user's head is in the multiple different postures; and for any posture of the user's head, the first data collected by the first acceleration sensor corresponds to the second data collected by the second acceleration sensor.
3. The method according to claim 1 or 2, wherein unifying the first coordinate system corresponding to the first acceleration sensor and the second coordinate system corresponding to the second acceleration sensor into the base coordinate system according to the first sensor data and the second sensor data comprises: calculating a rotation matrix according to the first sensor data and the second sensor data; and unifying, based on the rotation matrix, the first coordinate system corresponding to the first acceleration sensor and the second coordinate system corresponding to the second acceleration sensor into the base coordinate system.
4. The method according to claim 3, wherein determining the first projection vector of the first target sensing axis in the base coordinate system and the second projection vector of the second target sensing axis in the base coordinate system comprises: determining, according to the first sensor data, a first vector of the first target sensing axis in the first coordinate system, and determining, according to the second sensor data, a second vector of the second target sensing axis in the second coordinate system; and calculating, based on the rotation matrix, a first projection vector of the first vector in the base coordinate system and a second projection vector of the second vector in the base coordinate system.
5. The method according to any one of claims 1-4, wherein the base coordinate system is the first coordinate system, and calculating the vector product of the first projection vector and the second projection vector comprises: calculating a vector product of a unit vector on a target axis of the base coordinate system and the second projection vector.
6. The method according to claim 5, wherein identifying the wearing positions of the first earphone and the second earphone according to the vector product comprises: calculating a scalar product of the vector product and a vector corresponding to the gravitational acceleration; if the scalar product is greater than zero, determining that the first earphone is worn on the left ear and the second earphone is worn on the right ear; and if the scalar product is less than zero, determining that the first earphone is worn on the right ear and the second earphone is worn on the left ear.
7. The method according to claim 6, further comprising: when the user is in an inverted state, if the scalar product is greater than zero, determining that the first earphone is worn on the right ear and the second earphone is worn on the left ear; and if the scalar product is less than zero, determining that the first earphone is worn on the left ear and the second earphone is worn on the right ear.
8. The method according to any one of claims 1-7, wherein after identifying the wearing positions of the first earphone and the second earphone according to the vector product, the method further comprises: detecting whether the first earphone and the second earphone have been taken out of the user's ear canal; and if the first earphone and/or the second earphone has not been taken out of the user's ear canal, locking the identified wearing-position information of the first earphone and the second earphone.
9. An earphone, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein when the processor executes the computer program, the method for identifying left and right earphones according to any one of claims 1-8 is implemented.
10. A computer storage medium, comprising computer instructions, wherein when the computer instructions run on an electronic device, the electronic device executes the method for identifying left and right earphones according to any one of claims 1-8.
CN202110271354.XA 2021-03-12 2021-03-12 A method and device for identifying left and right earphones, and an earphone Active CN115086817B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110271354.XA CN115086817B (en) 2021-03-12 2021-03-12 A method and device for identifying left and right earphones, and an earphone


Publications (2)

Publication Number Publication Date
CN115086817A true CN115086817A (en) 2022-09-20
CN115086817B CN115086817B (en) 2025-04-25

Family

ID=83240741

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110271354.XA Active CN115086817B (en) 2021-03-12 2021-03-12 A method and device for identifying left and right earphones, and an earphone

Country Status (1)

Country Link
CN (1) CN115086817B (en)


Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120114132A1 (en) * 2010-11-05 2012-05-10 Sony Ericsson Mobile Communications Ab Headset with accelerometers to determine direction and movements of user head and method
US20130279724A1 (en) * 2012-04-19 2013-10-24 Sony Computer Entertainment Inc. Auto detection of headphone orientation
CN104509129A (en) * 2012-04-19 2015-04-08 索尼电脑娱乐公司 Auto detection of headphone orientation
CN105959854A (en) * 2016-06-22 2016-09-21 惠州Tcl移动通信有限公司 Method and system for realizing normal output of left sound and right sound based on headset
CN110012376A (en) * 2019-03-25 2019-07-12 歌尔科技有限公司 A kind of control method, earphone and the storage medium of earphone sound channel

Cited By (4)

Publication number Priority date Publication date Assignee Title
CN118870246A (en) * 2023-04-27 2024-10-29 华为技术有限公司 Wearing position identification method and electronic device
WO2024222069A1 (en) * 2023-04-27 2024-10-31 华为技术有限公司 Wearing position identification method and electronic device
WO2025045236A1 (en) * 2023-09-01 2025-03-06 华为技术有限公司 Earphone control method and wearable device
WO2025201282A1 (en) * 2024-03-28 2025-10-02 歌尔股份有限公司 Left and right channel matching method and apparatus for earbuds, and clip-on earbuds and storage medium

Also Published As

Publication number Publication date
CN115086817B (en) 2025-04-25

Similar Documents

Publication Publication Date Title
US11166104B2 (en) Detecting use of a wearable device
CN113475094B (en) Different head detection in headphones
US10397728B2 (en) Differential headtracking apparatus
CN108257581B (en) Light intensity detection device, mobile terminal and display screen brightness adjustment method
CN115086817B (en) A method and device for identifying left and right earphones, and an earphone
US9916004B2 (en) Display device
CN108966087A (en) A kind of wear condition detection method, device and the wireless headset of wireless headset
CN107182011B (en) Audio playing method and system, mobile terminal and WiFi earphone
CN114205701B (en) Noise reduction method, terminal device and computer readable storage medium
CN110505403A (en) A kind of video record processing method and device
CN109558837B (en) Face key point detection method, device and storage medium
US10488223B1 (en) Methods and systems for calibrating an inertial measurement unit of an electronic device
CN108319445A (en) A kind of audio playing method and mobile terminal
KR20230013385A (en) Electronic apparatus and operating method thereof
KR102730772B1 (en) Hearable device connected electronic device and operating method thereof
CN110349527B (en) Virtual reality display method, device and system and storage medium
CN111246061A (en) Mobile terminal, detection method of shooting mode, and storage medium
CN110209543B (en) Earphone socket detection method and terminal
CN113472943B (en) Audio processing method, device, equipment and storage medium
KR102862765B1 (en) Electronic device for processing voice data and operating methiod thereof
CN114093020B (en) Motion capture method, device, electronic device and storage medium
CN105824597A (en) Terminal audio processing method and terminal
KR102864051B1 (en) Wireless earphone device and control method thereof
CN116582796A (en) Audio processing method, system, device and computer-readable storage medium
KR102745698B1 (en) Electronic device detecting wearing state of electronic device using inertial sensor and method for controlling thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant