
CN110017834B - Usage object determination method, usage object determination apparatus, and storage medium - Google Patents


Info

Publication number
CN110017834B
Authority
CN
China
Prior art keywords
acceleration
intelligent equipment
peak
fluctuation
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910301115.7A
Other languages
Chinese (zh)
Other versions
CN110017834A (en)
Inventor
臧爱伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Goertek Techology Co Ltd
Original Assignee
Goertek Techology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Goertek Techology Co Ltd filed Critical Goertek Techology Co Ltd
Priority to CN201910301115.7A priority Critical patent/CN110017834B/en
Publication of CN110017834A publication Critical patent/CN110017834A/en
Priority to PCT/CN2019/129571 priority patent/WO2020211459A1/en
Application granted granted Critical
Publication of CN110017834B publication Critical patent/CN110017834B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The embodiments of the present application provide a usage object determination method, a device, and a storage medium. In the embodiments of the present application, feature analysis is performed on the acceleration information in a plurality of motion directions generated while the smart device is in use, the acceleration features of the usage object of the smart device are obtained, and the usage object of the smart device is determined according to those acceleration features. In this way, once the usage object of the smart device has been identified, step counting or positioning can be performed on it in a manner matched to that object, which improves the accuracy of step counting or positioning.

Description

Usage object determination method, usage object determination apparatus, and storage medium
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a method, an apparatus, and a storage medium for determining a usage object.
Background
With the continuous development of artificial intelligence technology, smart devices are widely used in people's daily lives; for example, smart phones and wearable devices such as smart watches and smart bracelets have become inseparable from everyday life. Besides providing basic functions, these smart devices can also perform step counting, positioning, and so on. The difference in usage objects affects the step counting or positioning accuracy of a smart device, so the usage object of the smart device needs to be determined.
Disclosure of Invention
Aspects of the present application provide a usage object determining method, a device, and a storage medium, which are used to identify a usage object of a smart device, thereby facilitating improvement of accuracy of subsequent step counting or positioning of the usage object.
The embodiment of the application provides a method for determining a use object, which comprises the following steps:
acquiring acceleration information in a plurality of motion directions generated by the intelligent equipment in the using process;
performing characteristic analysis on the acceleration information in the plurality of motion directions to obtain at least one acceleration characteristic of a use object of the intelligent equipment;
and determining the use object of the intelligent equipment according to at least one acceleration characteristic of the use object.
An embodiment of the present application further provides an intelligent device, including: an acceleration sensor, a memory and a processor; the acceleration sensor is used for acquiring acceleration information in a plurality of motion directions generated by the intelligent equipment in the using process;
the memory for storing a computer program;
the processor is coupled to the memory for executing the computer program for:
performing characteristic analysis on the acceleration information in the plurality of motion directions to obtain at least one acceleration characteristic of a use object of the intelligent equipment;
and determining the use object of the intelligent equipment according to at least one acceleration characteristic of the use object.
Embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the above-described method.
In the embodiments of the present application, feature analysis is performed on the acceleration information in a plurality of motion directions generated while the smart device is in use, the acceleration features of the usage object of the smart device are obtained, and the usage object of the smart device is determined according to those acceleration features. In this way, once the usage object of the smart device has been identified, step counting or positioning can be performed on it in a manner matched to that object, which improves the accuracy of step counting or positioning.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic flowchart of a method for determining an object to be used according to an embodiment of the present application;
FIGS. 2a and 2b are schematic diagrams of the resultant acceleration of a person walking and running, respectively, according to an embodiment of the present disclosure;
fig. 3a and 3b are schematic diagrams of the resultant acceleration of a small dog during walking and running, respectively, according to an embodiment of the present application;
fig. 4a and 4b are schematic diagrams of the resultant acceleration of a medium-sized dog during walking and running, respectively, according to an embodiment of the present application;
fig. 5a and 5b are schematic diagrams of the resultant acceleration of a large dog during walking and running, respectively, according to an embodiment of the present application;
fig. 6a and 6b are schematic diagrams of the wavelet transform corresponding to the combined acceleration when a large dog walks and runs, respectively, according to an embodiment of the present application;
fig. 7a and 7b are schematic diagrams of the wavelet transform corresponding to the combined acceleration when a person walks and runs, respectively, according to an embodiment of the present application;
FIG. 8a is a schematic diagram illustrating a distribution of acceleration ratios of an x axis and a y axis when a person walks according to an embodiment of the present application;
FIG. 8b is a schematic diagram illustrating a distribution of acceleration ratios of a y-axis and a z-axis when a person walks according to an embodiment of the present application;
FIG. 8c is a schematic diagram illustrating a distribution of acceleration ratios of an x-axis and a z-axis when a person walks according to an embodiment of the present application;
FIG. 9a is a schematic diagram of the distribution of acceleration ratios of the x-axis and the y-axis when a large dog runs according to an embodiment of the present application;
FIG. 9b is a schematic diagram of the distribution of acceleration ratios of the y-axis and the z-axis when a large dog runs according to the embodiment of the present application;
FIG. 9c is a schematic diagram of the distribution of the acceleration ratio of the x-axis and the z-axis when a large dog runs according to the embodiment of the present application;
fig. 10 is a schematic structural diagram of an intelligent device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
To address the technical problem of low step-counting or positioning accuracy in existing smart devices, the embodiments of the present application provide a solution whose basic idea is as follows: feature analysis is performed on the acceleration information in a plurality of motion directions generated while the smart device is in use to obtain the acceleration features of the usage object of the smart device, and the usage object of the smart device is determined according to those acceleration features. In this way, once the usage object of the smart device has been identified, step counting or positioning can be performed on it in a manner matched to that object, which improves the accuracy of step counting or positioning.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
Fig. 1 is a schematic flowchart of a method for determining a usage object according to an embodiment of the present application. As shown in fig. 1, the method includes:
101. acceleration information in a plurality of motion directions generated by the intelligent device in the using process is obtained.
102. And performing characteristic analysis on acceleration information in a plurality of motion directions generated in the using process of the intelligent equipment to obtain at least one acceleration characteristic of a using object of the intelligent equipment.
103. And determining the use object of the intelligent equipment according to at least one acceleration characteristic of the use object.
In this embodiment, the smart device is an electronic device having a step counting or positioning function, and may be a terminal device such as a smart phone or a tablet computer, or a wearable device such as a smart watch, a bracelet, smart glasses, a bluetooth headset, a key ring, or an anti-lost device, but is not limited thereto.
In this embodiment, the smart device is provided with an acceleration sensor for acquiring acceleration information in a plurality of movement directions generated by the smart device during use. In the embodiments of the present application, "a plurality" means 2 or more, and the specific number may be determined by the number of sensing axes of the acceleration sensor. For example, if the acceleration sensor is a three-axis acceleration sensor, the plurality of movement directions may be 3 movement directions, i.e. front-back, left-right and up-down, but this is not limited thereto.
Further, since different types of usage objects have different structural features, their behaviour and traveling motion also differ. The behaviour and traveling motion of a usage object affect its acceleration, so different types of usage objects have different acceleration features. Based on this, in step 102, feature analysis is performed on the acquired acceleration information in the plurality of motion directions to obtain at least one acceleration feature of the usage object. Pattern recognition is then performed on the usage object according to the at least one acceleration feature, and the usage object of the smart device is thereby determined. In the embodiments of the present application, determining the usage object of the smart device means determining the category of the usage object, for example determining whether the usage object is a human, a small pet, a medium pet or a large pet.
In the embodiment of the application, different step counting manners can be adopted for different types of used objects, for example, if the used objects are human, the step counting manner conforming to human behavior characteristics can be adopted to count the used objects; for another example, if the object is an animal, the step can be counted by a step counting method according to the behavior characteristics of the animal, which helps to improve the accuracy of step counting or positioning the object.
On the other hand, in the embodiments of the present application, in order to improve the security of the smart device and protect the privacy of the usage object, a corresponding early-warning mechanism may further be provided. For example, a dedicated hidden function may be provided on the smart device. If it is determined that the usage object of the smart device is a human, then when a preset alarm time is reached, a prompt indicating that he or she is carrying the smart device may be sent to the usage object. The preset alarm time may be, but is not limited to, 5 minutes, 10 minutes, and so on.
Alternatively, an early-warning period may be preset in the smart device. If it is determined that the usage object of the smart device is a human, a timer or counter is triggered to time the early-warning period, and whenever an early-warning period elapses, a prompt indicating that he or she is carrying the smart device is sent to the usage object. The specific value of the early-warning period can be set flexibly according to actual requirements, for example 5 minutes or 10 minutes, but is not limited thereto.
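A minimal Python sketch of this early-warning behaviour follows; the notify_user() callback, the 10-minute period and the function names are hypothetical illustrations rather than details taken from the patent.

    import threading

    WARNING_PERIOD_SECONDS = 10 * 60  # hypothetical early-warning period (10 minutes)

    def notify_user():
        # Placeholder for the prompt telling the usage object that he or she
        # is carrying this smart device.
        print("Reminder: you are carrying this smart device.")

    def start_warning_timer(usage_object_is_human):
        # The periodic prompt is armed only once the usage object is identified as human.
        if not usage_object_is_human:
            return None

        def fire():
            notify_user()
            start_warning_timer(True)  # re-arm for the next early-warning period

        timer = threading.Timer(WARNING_PERIOD_SECONDS, fire)
        timer.daemon = True
        timer.start()
        return timer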
Furthermore, the form of the prompt differs depending on the structure of the smart device. For example, the smart device may include an audio component, in which case the prompt may be a voice signal, i.e. a voice prompt indicating that the usage object is carrying the smart device is played to the usage object; as another example, the smart device may include a display screen, in which case the prompt may be text or image information presented to the usage object on the display screen; as yet another example, the smart device may include a buzzer, in which case the prompt may be a buzzer signal emitted to the usage object; and so on.
Furthermore, the usage object can cancel the prompt issued by the smart device. For example, if the usage object confirms that he or she is actively wearing or using the smart device, the prompt may be cancelled; if the usage object finds that the smart device is being used or worn passively, the prompt serves as an early warning to the usage object. In this way, the smart device can be prevented from being used by malicious parties to track another person's whereabouts, the privacy and safety of the usage object are protected, and the security of the smart device is correspondingly improved.
In the embodiments of the present application, various acceleration features can be used to recognize the usage object. Optionally, the peak interval feature, peak fluctuation feature, time-frequency feature or acceleration chaos feature of the combined acceleration of the usage object may be used to recognize the usage object. The resultant acceleration of the usage object of the smart device refers to the vector sum of the acceleration information of the smart device in the plurality of motion directions. For example, if the acceleration sensor is a three-axis acceleration sensor, the resultant acceleration of the usage object is the vector sum of the acceleration information in the front-back, left-right and up-down directions. For convenience of description and distinction, in the embodiments of the present application the front-back direction of the usage object is defined as the x-axis direction, the left-right direction as the y-axis direction, and the up-down direction as the z-axis direction. Based on this, an optional implementation of step 102 is: performing feature analysis on the acceleration information of the usage object in the plurality of motion directions to obtain at least one of the peak interval feature, peak fluctuation feature, time-frequency feature and acceleration chaos feature of the combined acceleration of the usage object of the smart device.
Accordingly, an optional implementation of step 103 is: determining the usage object of the smart device according to at least one of the peak interval feature, peak fluctuation feature, time-frequency feature and acceleration chaos feature of the combined acceleration of the usage object of the smart device.
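As a minimal illustration of the combined acceleration described above, the following Python sketch computes the per-sample magnitude of the three-axis readings; interpreting the "vector sum" as a per-sample magnitude is an assumption, and the sample values are made up.

    import numpy as np

    def resultant_acceleration(ax, ay, az):
        # Per-sample magnitude of the three-axis acceleration vector.
        ax = np.asarray(ax, dtype=float)
        ay = np.asarray(ay, dtype=float)
        az = np.asarray(az, dtype=float)
        return np.sqrt(ax ** 2 + ay ** 2 + az ** 2)

    # Example with made-up three-axis samples:
    ax = [0.1, 0.3, 0.2, 0.0]
    ay = [0.0, 0.1, 0.4, 0.2]
    az = [9.8, 9.6, 9.9, 9.7]
    print(resultant_acceleration(ax, ay, az))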
Further, if the usage object of the smart device is determined according to the peak interval feature of the usage object, the average value of the peak intervals of the resultant acceleration may be calculated; if that average value is less than or equal to a preset interval threshold, the usage object of the smart device may be determined to be a first-class animal. The preset interval threshold is related to the sampling rate at which the acceleration information of the smart device is acquired during use. At a given sampling rate, the value of the preset interval threshold can be set flexibly according to the motion behaviour characteristics of the usage objects actually to be recognized.
For example, at the same sampling rate, the applicant collected the resultant accelerations generated when a human, a small dog, a medium dog and a large dog respectively walk and run. The resultant accelerations generated when the human walks and runs are shown in figs. 2a and 2b; those of the small dog in figs. 3a and 3b; those of the medium dog in figs. 4a and 4b; and those of the large dog in figs. 5a and 5b. In figs. 2a-5b, the horizontal axis represents the sampling points and the vertical axis represents the resultant acceleration. Further, the peak intervals of the resultant accelerations generated when the human, the small dog, the medium dog and the large dog respectively walk and run were analysed, and the average peak intervals of the resultant accelerations of the different types of usage objects are shown in table 1 below:
TABLE 1 comparison table of peak interval average values of resultant accelerations of different types of use objects
[Table 1 is reproduced as an image in the original publication and is not available in this text.]
As can be seen from table 1, if the small dog is to be distinguished from the other types of usage objects, the interval threshold may be set to 5; accordingly, if the average peak interval of the resultant acceleration of the usage object is less than or equal to 5, the usage object of the smart device is determined to be the small dog. The choice of interval threshold is related to the sampling rate at which the acceleration information of the smart device in the plurality of directions is acquired: if the sampling rate increases, the preset interval threshold increases accordingly; otherwise, the preset interval threshold decreases.
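This first check can be sketched as follows in Python, assuming SciPy's find_peaks for peak detection; the function names, the default threshold of 5 samples and the peak-detection settings are illustrative assumptions rather than values prescribed by the patent.

    import numpy as np
    from scipy.signal import find_peaks

    def mean_peak_interval(resultant):
        # Indices of local maxima of the resultant acceleration.
        peaks, _ = find_peaks(resultant)
        if len(peaks) < 2:
            return float("inf")  # not enough peaks to measure an interval
        return float(np.mean(np.diff(peaks)))  # interval measured in samples

    def is_first_class_animal(resultant, interval_threshold=5.0):
        # Small-bodied animals such as the small dog show closely spaced peaks,
        # so a mean interval at or below the threshold indicates a first-class animal.
        return mean_peak_interval(resultant) <= interval_threshold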
Further, in practical applications, considering only the peak interval feature of the resultant acceleration of the usage object may not be enough to accurately determine its category. For example, in the above embodiment, the average peak interval of the resultant acceleration of the usage object cannot tell whether the usage object is a medium dog, a large dog or a human. That is, if the average peak interval of the resultant acceleration of the usage object is larger than the preset interval threshold, it cannot be determined whether the usage object is a medium dog, a large dog or a human.
The applicant has found that the peak fluctuation features of the combined accelerations of different types of usage objects differ, and that the peak fluctuation range of the combined acceleration of an animal with a smaller physique is larger than that of animals with a larger physique and of humans. Based on this, in the embodiments of the present application, if the average peak interval of the combined acceleration of the usage object is greater than the preset interval threshold, the usage object of the smart device is further determined according to the peak fluctuation feature of its combined acceleration. Optionally, the peak fluctuation average of the resultant acceleration of the usage object may be calculated. Further, if the peak fluctuation average of the combined acceleration is greater than or equal to a preset fluctuation threshold, the usage object of the smart device is determined to be a second-class animal. The preset fluctuation threshold can be set flexibly according to the motion behaviour characteristics of the usage objects actually to be recognized and the parameters of the acceleration sensor on the smart device.
For example, the applicant analysed the peak fluctuations of the combined accelerations of the different types of usage objects in figs. 2a to 5b and calculated the peak fluctuation average of the combined acceleration of each type of usage object, as shown in table 2 below:
TABLE 2 comparison table of average value of peak fluctuation of resultant acceleration of different types of use objects
[Table 2 is reproduced as an image in the original publication and is not available in this text.]
As can be seen from table 2, if the medium dog is to be distinguished from the other types of usage objects (large dogs and humans), the fluctuation threshold may be set to 5000; accordingly, if the peak fluctuation average of the resultant acceleration of the usage object is greater than or equal to 5000, the usage object of the smart device is determined to be the medium dog.
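A sketch of this second check follows, assuming "peak fluctuation" means the absolute difference between successive peak amplitudes (one plausible reading) and reusing the example threshold of 5000 in raw sensor units; both the reading and the threshold are assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    def mean_peak_fluctuation(resultant):
        resultant = np.asarray(resultant, dtype=float)
        peaks, _ = find_peaks(resultant)
        if len(peaks) < 2:
            return 0.0
        amplitudes = resultant[peaks]
        # Mean absolute difference between successive peak amplitudes.
        return float(np.mean(np.abs(np.diff(amplitudes))))

    def is_second_class_animal(resultant, fluctuation_threshold=5000.0):
        return mean_peak_fluctuation(resultant) >= fluctuation_threshold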
Further, in practical applications, considering only the peak interval feature and/or the peak fluctuation average of the resultant acceleration of the usage object may still not be enough to accurately determine its category. For example, in the above embodiment, the average peak interval and the peak fluctuation average of the resultant acceleration of the usage object cannot tell whether the usage object is a large dog or a human. That is, if the average peak interval of the resultant acceleration of the usage object is greater than the preset interval threshold and its peak fluctuation average is less than the preset fluctuation threshold, it cannot be determined whether the usage object is a large dog or a human.
It should be noted that the embodiments of the present application do not limit the order in which the peak interval feature and the peak fluctuation feature of the combined acceleration are used to determine the usage object of the smart device; the description above, in which the peak interval feature of the combined acceleration is used first to judge whether the usage object is a first-class animal and, if it is not, the peak fluctuation average of the combined acceleration is then used to judge whether it is a second-class animal, is merely an example. For instance, it may also first be determined whether the peak fluctuation average of the resultant acceleration of the usage object is greater than or equal to the preset fluctuation threshold; if so, the usage object of the smart device is determined to be a second-class animal. Correspondingly, if not, the average peak interval of the resultant acceleration is calculated; further, if that average is less than or equal to the preset interval threshold, the usage object of the smart device is determined to be a first-class animal.
Further, the applicant has also found that, because different types of usage objects differ in their motion behaviour and manner of travel, the time-frequency features of their combined accelerations also differ. Here, the time-frequency feature of the combined acceleration refers to how the frequency content of the combined acceleration varies over time. Based on this, in the embodiments of the present application, if the category of the usage object of the smart device cannot be determined from the peak interval feature and the peak fluctuation feature of its combined acceleration, the usage object may be further determined according to the time-frequency feature of the combined acceleration. Optionally, if the peak fluctuation average of the combined acceleration of the usage object is smaller than the preset fluctuation threshold, wavelet analysis may be performed on the combined acceleration to obtain the time-frequency information of the combined acceleration of the usage object of the smart device. Furthermore, the frequency peak of the combined acceleration can be counted from this time-frequency information; if the frequency peak of the resultant acceleration of the usage object is greater than or equal to a preset frequency peak threshold, the usage object of the smart device is determined to be a human. The frequency peak of the resultant acceleration of the usage object may be the average of the frequency peaks of the resultant acceleration, or some or all of those frequency peaks.
The time-frequency information obtained for the combined acceleration differs depending on the wavelet function selected for the wavelet analysis. For example, the time-frequency information of the resultant acceleration can be obtained by performing wavelet analysis on the resultant acceleration of the usage object using the Haar function, a Daubechies function (written as dbN, where N represents the number of layers of the wavelet basis), the Morlet function, a Symlets function, or the like, but this is not limited thereto. Further, even when the same wavelet function is used, the obtained time-frequency information of the combined acceleration differs if the number of layers of the wavelet basis differs. The wavelet function and the number of layers of the wavelet basis can be selected flexibly according to actual requirements and are not limited herein. For example, db5 may be used to perform wavelet analysis on the resultant acceleration of the usage object. When db5 is used, the original signal of the resultant acceleration can be expressed as: s = a5 + d5 + d4 + d3 + d2 + d1, where s represents the original signal of the resultant acceleration, a5 represents the time-frequency information of the low-frequency signal after the wavelet transform of the resultant acceleration of the usage object with db5, and d1 to d5 respectively represent the time-frequency information of the high-frequency signals at each layer after the wavelet transform of the resultant acceleration of the usage object with db5.
Furthermore, which layer's time-frequency information is used to determine the usage object of the smart device can be set flexibly according to the motion behaviour characteristics of the usage objects to be recognized in practical applications. Optionally, the high-frequency layer d5 may be used to determine the usage object of the smart device: the peak of the high-frequency layer d5 is counted, and if that peak is greater than or equal to the preset frequency peak threshold, the usage object of the smart device is determined to be a human. The preset frequency peak threshold can be set according to the motion behaviour characteristics of the usage objects that actually need to be determined, and is not limited herein.
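A sketch of this wavelet step, assuming the PyWavelets library: the resultant acceleration is decomposed with db5, the d5 detail layer is reconstructed on its own, and its largest absolute value is compared with the preset frequency peak threshold. Treating the maximum of the reconstructed d5 layer as the "frequency peak", and the threshold of 1800, are assumptions for illustration.

    import numpy as np
    import pywt

    def d5_layer(resultant, wavelet="db5", level=5):
        # wavedec returns [a5, d5, d4, d3, d2, d1] for a 5-level decomposition.
        coeffs = pywt.wavedec(np.asarray(resultant, dtype=float), wavelet, level=level)
        # Keep only the d5 coefficients and reconstruct that layer on its own.
        kept = [np.zeros_like(c) for c in coeffs]
        kept[1] = coeffs[1]
        return pywt.waverec(kept, wavelet)

    def is_human_by_time_frequency(resultant, frequency_peak_threshold=1800.0):
        d5 = d5_layer(resultant)
        return float(np.max(np.abs(d5))) >= frequency_peak_threshold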
For example, the applicant performed wavelet analysis with db5 on the combined accelerations generated when a large dog walks and runs and when a person walks and runs; the data corresponding to the wavelet transform of the combined acceleration when the large dog walks are shown in fig. 6a, when the large dog runs in fig. 6b, when the person walks in fig. 7a, and when the person runs in fig. 7b. Further, analysis of the time-frequency information of the high-frequency layer d5 shows that: the frequency peak of the high-frequency layer d5 of the resultant acceleration when the person runs is around 2000, whereas the high-frequency layer d5 of the resultant acceleration generated when the large dog walks is almost near 0; the frequency distributions of the high-frequency layer d5 of the resultant accelerations generated when the person walks and when the large dog runs both lie between 200 and 500. In this example, the frequency peak threshold may be set to any value between 1500 and 2000, for example 1800, 1900 or 2000, but is not limited thereto. Based on this, if the frequency peak of the resultant acceleration of the usage object is greater than or equal to the preset frequency peak threshold, the usage object of the smart device is determined to be a human.
However, in practical applications, if the frequency peak of the resultant acceleration of the usage object is smaller than the preset frequency peak threshold, it cannot be directly concluded that the usage object of the smart device is not a human. For example, in the above embodiments, the frequency distributions of the high-frequency layer d5 of the resultant acceleration when a person walks and when a large dog runs both lie between 200 and 500, so a usage object whose resultant-acceleration frequency peak is smaller than the preset frequency peak threshold may still be a person.
In order to further improve the accuracy of recognizing the usage object of the smart device, in the embodiments of the present application the usage object may be further determined according to the acceleration chaos feature. Optionally, if the frequency peak of the resultant acceleration of the usage object is smaller than the preset frequency peak threshold, the chaos degree of the acceleration of the usage object of the smart device may be calculated. Further, if the chaos degree of the acceleration of the usage object falls within the set chaos range, the usage object of the smart device is determined to be a human.
The embodiments of the present application do not limit how the chaos degree of the acceleration of the usage object is calculated. For example, entropy values of the accelerations of the usage object in the plurality of motion directions may be calculated respectively, and the entropy value of the acceleration in each motion direction is taken as the chaos degree in that motion direction. Further, if the entropy value in every motion direction falls within the entropy range corresponding to that direction, the usage object of the smart device is determined to be a human. Correspondingly, if the entropy value in some motion direction does not fall within the entropy range corresponding to that direction, the usage object of the smart device is determined not to be a human.
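One plausible, heavily simplified reading of this entropy-based measure is sketched below: each axis is discretised into histogram bins and the Shannon entropy of the bin probabilities is computed. The bin count and the per-axis entropy ranges are placeholders, not values from the patent.

    import numpy as np

    def axis_entropy(samples, bins=32):
        # Shannon entropy (in bits) of a histogram of the axis samples.
        hist, _ = np.histogram(np.asarray(samples, dtype=float), bins=bins)
        total = hist.sum()
        if total == 0:
            return 0.0
        p = hist / total
        p = p[p > 0]
        return float(-np.sum(p * np.log2(p)))

    def is_human_by_entropy(ax, ay, az, entropy_ranges=None):
        if entropy_ranges is None:
            # Placeholder per-axis entropy ranges; real ranges would be calibrated.
            entropy_ranges = {"x": (1.0, 4.0), "y": (1.0, 4.0), "z": (1.0, 4.0)}
        values = {"x": axis_entropy(ax), "y": axis_entropy(ay), "z": axis_entropy(az)}
        return all(lo <= values[axis] <= hi for axis, (lo, hi) in entropy_ranges.items())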
Optionally, the chaos degree of the accelerations in each pair of motion directions may also be calculated from the ratio distribution of the accelerations in every two motion directions. Taking the acceleration sensor on the smart device as a three-axis acceleration sensor as an example, that is, the acceleration information of the usage object of the smart device in the plurality of motion directions consists of the accelerations on the x-axis, the y-axis and the z-axis, the ratio distribution of the accelerations on every two coordinate axes can be calculated from the accelerations of the usage object on the x-axis, the y-axis and the z-axis; further, from the ratio distribution of the accelerations on every two coordinate axes, the peak average change rate of the ratio of the accelerations on every two coordinate axes is calculated, and each peak average change rate is taken as the chaos degree of the accelerations on the corresponding two coordinate axes. For example, the ratio distribution of the acceleration on the x-axis to the acceleration on the y-axis may be calculated from the accelerations of the usage object on the x-axis and the y-axis; for convenience of description and distinction, this ratio distribution is simply referred to as the x/y distribution. Further, the peak average change rate of the x/y distribution is calculated from the x/y distribution, and this peak average change rate is the chaos degree of the acceleration in x/y. Here, the peak average change rate of the x/y distribution refers to the average of the change rates between adjacent peaks in the x/y distribution, i.e. the average rate of change of each peak relative to the preceding peak.
For example, in the embodiments of the present application, the ratios between the accelerations on every two of the x-, y- and z-axes generated when a person walks were calculated, as shown in figs. 8a, 8b and 8c respectively; the ratios between the accelerations on every two of the x-, y- and z-axes generated when a large dog runs were calculated, as shown in figs. 9a, 9b and 9c respectively. In figs. 8a-8c and 9a-9c, the horizontal axis represents the sampling points and the vertical axis represents the ratio of the accelerations on the corresponding two coordinate axes. Analysis of figs. 8a-8c and 9a-9c shows that the chaos degree of the acceleration ratio on every two coordinate axes when a large dog runs is larger than the chaos degree of the corresponding acceleration ratio when a person walks. Based on this, the chaos range can be set according to the chaos degree of the accelerations on every two coordinate axes when a person walks. If the chaos degree of the acceleration ratio of the usage object on every two coordinate axes falls within the preset chaos range in the corresponding direction, the usage object of the smart device is determined to be a human.
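A sketch of this ratio-based chaos measure follows: the per-sample ratios x/y, y/z and x/z are formed, peaks are detected in each ratio series, and the mean change rate between successive peaks is taken as the chaos degree of that axis pair. The peak-detection settings and the small epsilon guarding against division by zero are assumptions.

    import numpy as np
    from scipy.signal import find_peaks

    def peak_average_change_rate(ratio_series):
        ratio_series = np.asarray(ratio_series, dtype=float)
        peaks, _ = find_peaks(ratio_series)
        if len(peaks) < 2:
            return 0.0
        amplitudes = ratio_series[peaks]
        previous, current = amplitudes[:-1], amplitudes[1:]
        # Change rate of each peak relative to the peak before it.
        rates = np.abs(current - previous) / np.maximum(np.abs(previous), 1e-9)
        return float(np.mean(rates))

    def ratio_chaos(ax, ay, az, eps=1e-9):
        ax, ay, az = (np.asarray(a, dtype=float) for a in (ax, ay, az))
        ratios = {
            "x/y": ax / (ay + eps),
            "y/z": ay / (az + eps),
            "x/z": ax / (az + eps),
        }
        return {pair: peak_average_change_rate(series) for pair, series in ratios.items()}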
In the embodiments of the present application, humans and animals are distinguished step by step using the acceleration features of the usage object of the smart device, which improves the accuracy of recognizing the usage object of the smart device; in turn, when step counting or positioning is subsequently performed in a manner matched to the usage object, the accuracy of step counting or positioning is improved.
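Tying the checks together, a rough end-to-end decision function might look like the sketch below; it reuses the hypothetical helper functions sketched earlier in this section, and every threshold remains an illustrative assumption rather than a value prescribed by the patent.

    def classify_usage_object(ax, ay, az, chaos_upper_bound=0.5):
        resultant = resultant_acceleration(ax, ay, az)
        if is_first_class_animal(resultant):          # peak interval check
            return "first-class animal (e.g. small dog)"
        if is_second_class_animal(resultant):         # peak fluctuation check
            return "second-class animal (e.g. medium dog)"
        if is_human_by_time_frequency(resultant):     # db5 time-frequency check
            return "human"
        chaos = ratio_chaos(ax, ay, az)               # acceleration chaos check
        if all(value <= chaos_upper_bound for value in chaos.values()):
            return "human"                            # low chaos resembles a walking person
        return "third-class animal (e.g. large dog)"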
It should be noted that the embodiments of the present application are described only by taking the case in which the first-class animal is a small dog, the second-class animal is a medium dog and the third-class animal is a large dog as an example; the peak interval threshold, the peak fluctuation threshold, the frequency peak threshold corresponding to the time-frequency feature and the chaos range of the acceleration of the combined acceleration may be set flexibly according to the types of usage objects that actually need to be distinguished, and are not limited. Optionally, the small dog may be a Teddy (i.e. shoulder height 28-35 cm); the medium dog may be a Corgi (i.e. shoulder height 38-45 cm); the large dog may be a Samoyed (i.e. shoulder height 48-59 cm), etc., but this is not limited thereto.
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may be used as the execution subjects of the methods. For example, the execution subject of steps 101 and 102 may be device a; for another example, the execution subject of step 101 may be device a, and the execution subject of step 102 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 101, 102, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the above-described method.
Fig. 10 is a schematic structural diagram of a smart device according to an embodiment of the present application. As shown in fig. 10, the smart device includes: an acceleration sensor 10a, a memory 10b and a processor 10c.
In this embodiment, the acceleration sensor 10a is used to collect acceleration information in a plurality of motion directions generated by the smart device during use.
In the present embodiment, the memory 10b is used for storing a computer program, and may be configured to store other various data to support operations on the smart device. Wherein the processor 10c may execute a computer program stored in the memory 10b to implement the corresponding control logic. The memory 10b may be implemented by any type or combination of volatile or non-volatile memory devices, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
In an optional embodiment, the processor 10c, when obtaining the at least one acceleration feature of the usage object of the smart device, is specifically configured to: perform feature analysis on the acceleration information in the plurality of motion directions generated during use of the smart device to obtain at least one of the peak interval feature, peak fluctuation feature, time-frequency feature and acceleration chaos feature of the combined acceleration of the usage object of the smart device; the resultant acceleration of the usage object of the smart device is the vector sum of the acceleration information in the plurality of motion directions.
Accordingly, the processor 10c, when determining the usage object of the smart device, is specifically configured to: determine the usage object of the smart device according to at least one of the peak interval feature, peak fluctuation feature, time-frequency feature and acceleration chaos feature of the combined acceleration of the usage object of the smart device.
Further, when determining the usage object of the smart device, the processor 10c is specifically configured to: calculate the average peak interval of the resultant acceleration; and, if the average peak interval of the combined acceleration is less than or equal to a preset interval threshold, determine that the usage object of the smart device is a first-class animal.
Further, if the average peak interval of the combined acceleration is greater than the preset interval threshold, the processor 10c is specifically configured to: calculate the peak fluctuation average of the resultant acceleration; and, if the peak fluctuation average of the combined acceleration is greater than or equal to a preset fluctuation threshold, determine that the usage object of the smart device is a second-class animal.
Further, if the peak fluctuation average of the combined acceleration is smaller than the preset fluctuation threshold, the processor 10c is specifically configured to: perform wavelet analysis on the combined acceleration to obtain the time-frequency information of the combined acceleration of the usage object of the smart device; and, if the frequency peak of the resultant acceleration of the usage object is greater than or equal to a preset frequency peak threshold, determine that the usage object of the smart device is a human.
Further, if the frequency peak of the resultant acceleration of the usage object is smaller than the preset frequency peak threshold, the processor 10c is specifically configured to: calculate the chaos degree of the acceleration of the usage object of the smart device; and, if the chaos degree of the acceleration of the usage object falls within the set chaos range, determine that the usage object of the smart device is a human.
Further, when calculating the chaos degree of the acceleration of the usage object of the smart device, the processor 10c is specifically configured to: calculate the ratio distribution of the accelerations on every two coordinate axes from the accelerations of the usage object of the smart device on the x-axis, the y-axis and the z-axis; and, from the ratio distribution of the accelerations on every two coordinate axes, calculate the peak average change rate of the ratio of the accelerations on every two coordinate axes, and take each peak average change rate as the chaos degree of the accelerations on the corresponding two coordinate axes.
In another optional embodiment, the processor 10c is further configured to: if it is determined that the usage object of the smart device is a human, send to the usage object, when the preset alarm time is reached, a prompt indicating that the usage object is carrying the smart device; or, if it is determined that the usage object of the smart device is a human, send such a prompt to the usage object whenever an early-warning period elapses.
In some embodiments, the smart device further comprises a communication component 10d. The communication component 10d is configured to facilitate wired or wireless communication between the smart device and other devices. The smart device may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC) modules, Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In other embodiments, the smart device further comprises a power supply component 10e. The power supply component 10e is configured to provide power to the various components of the smart device. The power supply component 10e may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power supply component is located.
In some embodiments, the smart device may further include a sound input/output unit 10f, which may be configured to output and/or input audio signals. For example, the sound input/output unit 10f includes a microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operating mode, such as a call mode, a recording mode or a voice recognition mode. The received audio signal may further be stored in the memory or transmitted via the communication component 10d. In some embodiments, the audio component further comprises a speaker for outputting audio signals. For example, for a smart device having a voice interaction function, voice interaction with a user can be realized through the sound input/output unit 10f.
Accordingly, the smart device may further include a sound processing unit 10g for processing the sound signals input or output by the sound input/output unit 10f.
In some embodiments, the smart device further comprises a display 10h. The display 10h may include a Liquid Crystal Display (LCD) and/or a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, slides and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
Accordingly, the smart device may further include an image processing unit 10i for performing signal processing, such as image quality correction, on the image signal output from the processor 10c and converting its resolution into a resolution matching the screen of the display 10h. The display driving unit 10j then sequentially selects each row of pixels of the display 10h and scans them row by row, thereby supplying pixel signals based on the processed image signal.
It should be noted that only some of the components are schematically shown in fig. 10; this does not mean that the smart device must include all of the components shown in fig. 10, nor that the smart device can include only the components shown in fig. 10. In addition to the components shown in fig. 10, the smart device may further include an input operation unit (not shown in fig. 10). The input operation unit includes at least one operation member for performing an input operation, such as a key, a button, a switch or another member having a similar function; it receives user instructions through the operation member and outputs the instructions to the processor 10c.
The smart device provided by this embodiment performs feature analysis on the acceleration information in a plurality of motion directions generated while the smart device is in use, obtains the acceleration features of the usage object of the smart device, and determines the usage object of the smart device according to those acceleration features. In this way, once the usage object of the smart device has been identified, step counting or positioning can be performed on it in a manner matched to that object, which improves the accuracy of step counting or positioning.
It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both non-transitory and non-transitory, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information that can be accessed by a computing device. As defined herein, a computer readable medium does not include a transitory computer readable medium such as a modulated data signal and a carrier wave.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (5)

1. A usage object determining method, comprising:
acquiring acceleration information in a plurality of motion directions generated by the intelligent equipment in the using process;
performing characteristic analysis on the acceleration information in the multiple movement directions to obtain at least one of a peak interval characteristic, a peak fluctuation characteristic, a time-frequency characteristic and an acceleration chaos characteristic of the combined acceleration of the use object of the intelligent equipment; wherein the combined acceleration of the use object of the intelligent equipment refers to the vector sum of the acceleration information in the plurality of motion directions;
calculating the average value of the peak intervals of the combined acceleration;
if the average value of the peak intervals of the combined acceleration is smaller than or equal to a preset interval threshold value, determining that the use object of the intelligent equipment is a first-class animal;
if the average value of the peak intervals of the combined acceleration is larger than the preset interval threshold value, calculating the peak value fluctuation average value of the combined acceleration;
if the peak value fluctuation average value of the combined acceleration is larger than or equal to a preset fluctuation threshold value, determining that the use object of the intelligent equipment is a second-class animal;
if the peak value fluctuation average value of the combined acceleration is smaller than the preset fluctuation threshold value, performing wavelet analysis on the combined acceleration to acquire time-frequency information of the combined acceleration of the use object of the intelligent equipment;
if the peak value of the frequency of the combined acceleration of the use object is larger than or equal to a preset frequency peak value threshold value, determining that the use object of the intelligent equipment is a human;
the method further comprising:
if the use object of the intelligent equipment is determined to be a human, sending, when a preset alarm time is up, prompt information indicating that the use object is carrying the intelligent equipment to the use object; or,
if the use object of the intelligent equipment is determined to be a human, sending prompt information indicating that the use object is carrying the intelligent equipment to the use object when each early warning period is reached.
2. The method according to claim 1, wherein, if the frequency peak value of the resultant acceleration of the use object is smaller than the preset frequency peak threshold, determining the use object of the smart device according to at least one of the peak interval feature, the peak fluctuation feature, the time-frequency feature and the acceleration chaos feature of the resultant acceleration further comprises:
calculating the chaos degree of the acceleration of the use object of the smart device; and
if the chaos degree of the acceleration of the use object is within a set chaos degree range, determining that the use object of the smart device is a human.
3. The method according to claim 2, wherein calculating the chaos degree of the acceleration of the use object of the smart device comprises:
calculating the ratio distribution of the accelerations on each pair of coordinate axes by using the accelerations of the use object of the smart device on the x-axis, the y-axis and the z-axis respectively; and
calculating, from the ratio distribution of the accelerations on each pair of coordinate axes, the peak average change rate of the acceleration ratio for that pair, and taking each peak average change rate as the chaos degree of the accelerations on the corresponding pair of coordinate axes.
4. A smart device, comprising: an acceleration sensor, a memory and a processor; wherein the acceleration sensor is configured to acquire acceleration information in a plurality of motion directions generated by the smart device during use;
the memory is configured to store a computer program; and
the processor is coupled to the memory and configured to execute the computer program to:
perform feature analysis on the acceleration information in the plurality of motion directions to obtain at least one of a peak interval feature, a peak fluctuation feature, a time-frequency feature and an acceleration chaos feature of the resultant acceleration of the use object of the smart device; wherein the resultant acceleration of the use object of the smart device refers to the vector sum of the acceleration information in the plurality of motion directions;
calculate the mean peak interval of the resultant acceleration;
if the mean peak interval of the resultant acceleration is smaller than or equal to a preset interval threshold, determine that the use object of the smart device is a first-class animal;
if the mean peak interval of the resultant acceleration is larger than the preset interval threshold, calculate the mean peak fluctuation of the resultant acceleration;
if the mean peak fluctuation of the resultant acceleration is larger than or equal to a preset fluctuation threshold, determine that the use object of the smart device is a second-class animal;
if the mean peak fluctuation of the resultant acceleration is smaller than the preset fluctuation threshold, perform wavelet analysis on the resultant acceleration to acquire time-frequency information of the resultant acceleration of the use object of the smart device;
if the frequency peak value of the resultant acceleration of the use object is larger than or equal to a preset frequency peak threshold, determine that the use object of the smart device is a human;
wherein the processor is further configured to:
if the use object of the smart device is determined to be a human, send prompt information to the use object carrying the smart device when a preset alarm time arrives; or,
if the use object of the smart device is determined to be a human, send prompt information to the use object carrying the smart device when each early-warning period is reached.
5. A computer-readable storage medium having stored thereon computer instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the method according to any one of claims 1 to 3.
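
Illustrative note (not part of the claims): the staged decision of claims 1 and 4 — form the resultant (vector-sum) acceleration, test the mean peak interval, then the mean peak fluctuation, then the frequency peak — can be sketched in Python as below. Everything named here is an assumption made for illustration only: the function classify_user, the sampling rate FS and all threshold constants are hypothetical, "peak fluctuation" is read as the mean absolute difference between successive peak amplitudes, and an FFT-based dominant-frequency estimate stands in for the wavelet analysis recited in the claims.

# A minimal sketch of the claim-1 decision cascade, under the assumptions stated above.
import numpy as np
from scipy.signal import find_peaks

FS = 50.0                     # assumed sampling rate, Hz
INTERVAL_THRESHOLD = 0.25     # hypothetical preset interval threshold, s
FLUCTUATION_THRESHOLD = 4.0   # hypothetical preset fluctuation threshold, m/s^2
FREQ_PEAK_THRESHOLD = 1.5     # hypothetical preset frequency peak threshold, Hz

def classify_user(ax, ay, az):
    """Classify the use object from one window of per-axis acceleration samples."""
    # Resultant acceleration: magnitude of the vector sum of the three axis readings.
    resultant = np.sqrt(np.asarray(ax) ** 2 + np.asarray(ay) ** 2 + np.asarray(az) ** 2)

    peaks, _ = find_peaks(resultant, distance=int(0.1 * FS))
    if len(peaks) < 2:
        return "undetermined"

    # Mean peak interval, in seconds between successive peaks.
    mean_interval = np.mean(np.diff(peaks)) / FS
    if mean_interval <= INTERVAL_THRESHOLD:
        return "first-class animal"

    # Mean peak fluctuation, read here as the mean absolute difference
    # between successive peak amplitudes.
    peak_values = resultant[peaks]
    mean_fluctuation = np.mean(np.abs(np.diff(peak_values)))
    if mean_fluctuation >= FLUCTUATION_THRESHOLD:
        return "second-class animal"

    # The claims recite wavelet analysis; an FFT dominant-frequency estimate
    # is used here only to keep the sketch short.
    spectrum = np.abs(np.fft.rfft(resultant - resultant.mean()))
    freqs = np.fft.rfftfreq(len(resultant), d=1.0 / FS)
    dominant_freq = freqs[np.argmax(spectrum)]
    if dominant_freq >= FREQ_PEAK_THRESHOLD:
        return "human"
    return "undetermined"

In use, classify_user would be fed windows of samples taken from the acceleration sensor of claim 4, and a human result could then trigger the prompt information of claims 1 and 4.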
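
Illustrative note (not part of the claims): the chaos degree of claim 3 can likewise be sketched under one possible reading: for each pair of axes, form the ratio of the two axis accelerations over the sampling window, locate the peaks of that ratio series, and take the mean relative change between successive peak values as the peak average change rate for that pair. The helper name chaos_degrees, the epsilon guard against division by zero, and this reading of "peak average change rate" are all assumptions.

# A minimal sketch of the claim-3 chaos degree, under the assumptions stated above.
import numpy as np
from scipy.signal import find_peaks

def chaos_degrees(ax, ay, az, eps=1e-6):
    """Return the peak average change rate of the acceleration ratio for each axis pair."""
    axes = {"x": np.asarray(ax, dtype=float),
            "y": np.asarray(ay, dtype=float),
            "z": np.asarray(az, dtype=float)}
    result = {}
    for a, b in (("x", "y"), ("x", "z"), ("y", "z")):
        # Ratio distribution of the accelerations on the two coordinate axes.
        ratio = axes[a] / (axes[b] + eps)
        peaks, _ = find_peaks(ratio)
        if len(peaks) < 2:
            result[f"{a}/{b}"] = 0.0
            continue
        peak_vals = ratio[peaks]
        # Peak average change rate: mean relative change between successive peaks.
        result[f"{a}/{b}"] = float(np.mean(
            np.abs(np.diff(peak_vals)) / (np.abs(peak_vals[:-1]) + eps)))
    return result

Per claim 2, the use object would then be determined to be a human when these pairwise chaos degrees fall within the set chaos degree range.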
CN201910301115.7A 2019-04-15 2019-04-15 Usage object determination method, usage object determination apparatus, and storage medium Active CN110017834B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201910301115.7A CN110017834B (en) 2019-04-15 2019-04-15 Usage object determination method, usage object determination apparatus, and storage medium
PCT/CN2019/129571 WO2020211459A1 (en) 2019-04-15 2019-12-28 Use object determining method, device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910301115.7A CN110017834B (en) 2019-04-15 2019-04-15 Usage object determination method, usage object determination apparatus, and storage medium

Publications (2)

Publication Number Publication Date
CN110017834A (en) 2019-07-16
CN110017834B (en) 2021-12-24

Family

ID=67191368

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910301115.7A Active CN110017834B (en) 2019-04-15 2019-04-15 Usage object determination method, usage object determination apparatus, and storage medium

Country Status (2)

Country Link
CN (1) CN110017834B (en)
WO (1) WO2020211459A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110017834B (en) * 2019-04-15 2021-12-24 歌尔科技有限公司 Usage object determination method, usage object determination apparatus, and storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE10164534A1 (en) * 2001-12-31 2003-07-10 Dirk Parchmann Device and method for determining parameters of the movement of a body
US8954251B2 (en) * 2004-10-05 2015-02-10 Vision Works Ip Corporation Absolute acceleration sensor for use within moving vehicles
CN103445792B (en) * 2011-12-31 2017-05-03 北京超思电子技术有限责任公司 Step metering method
JP2016080620A (en) * 2014-10-21 2016-05-16 アズビル株式会社 Human detection system and method
CN105806359A (en) * 2016-05-17 2016-07-27 深圳市纬科联通讯有限公司 Step counting method and pedometer
CN106570479B (en) * 2016-10-28 2019-06-18 华南理工大学 A pet motion recognition method for embedded platform
CN106643722A (en) * 2016-10-28 2017-05-10 华南理工大学 Method for pet movement identification based on triaxial accelerometer
US10075846B1 (en) * 2017-08-10 2018-09-11 The Florida International University Board Of Trustees Method for continuous user authentication with wearables
CN107831907A (en) * 2017-12-07 2018-03-23 深圳先进技术研究院 Identity identifying method and device based on Gait Recognition
CN108182004B (en) * 2018-01-19 2019-07-23 百度在线网络技术(北京)有限公司 The method and apparatus of the behavior pattern of the carrier of mobile terminal are carried for identification
CN108847941B (en) * 2018-05-31 2021-08-06 上海众人网络安全技术有限公司 Identity authentication method, device, terminal and storage medium
CN108932504A (en) * 2018-07-24 2018-12-04 中国科学院深圳先进技术研究院 Identity identifying method, device, electronic equipment and storage medium
CN109166275B (en) * 2018-09-25 2020-08-11 山东科技大学 A human fall detection method based on acceleration sensor
CN110017834B (en) * 2019-04-15 2021-12-24 歌尔科技有限公司 Usage object determination method, usage object determination apparatus, and storage medium

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102193633A (en) * 2011-05-25 2011-09-21 广州畅途软件有限公司 dynamic sign language recognition method for data glove
CN103310192A (en) * 2013-06-06 2013-09-18 南京邮电大学 Movement behavior recognition method based on axial acceleration sensor
CN103458051A (en) * 2013-09-16 2013-12-18 南京大学 Somatosensory network and household behavior perception method based on same
CN104021295A (en) * 2014-06-12 2014-09-03 广州三星通信技术研究有限公司 Clustering feature fusion method and device for motion recognition
CN104200234A (en) * 2014-07-11 2014-12-10 杭州微纳科技有限公司 Human body action modeling and recognizing method
CN104111733A (en) * 2014-07-29 2014-10-22 上海交通大学 Gesture recognition system and method
CN104991644A (en) * 2015-06-24 2015-10-21 小米科技有限责任公司 Method and device for determining mobile terminal usage objects
CN106339071A (en) * 2015-07-08 2017-01-18 中兴通讯股份有限公司 Method and device for identifying behaviors
CN106618584A (en) * 2015-11-10 2017-05-10 北京纳通科技集团有限公司 Method for monitoring lower limb movement of user
CN105589383A (en) * 2016-01-05 2016-05-18 金陵科技学院 Tracking type intelligent equipment for pet wearing and control method thereof
CN107016384A (en) * 2017-06-05 2017-08-04 深圳天珑无线科技有限公司 Step-recording method, mobile terminal and the storage medium of recognizable type of sports

Also Published As

Publication number Publication date
WO2020211459A1 (en) 2020-10-22
CN110017834A (en) 2019-07-16

Similar Documents

Publication Publication Date Title
US11412441B2 (en) Worker safety system
US20180107793A1 (en) Health activity monitoring and work scheduling
US11188961B2 (en) Service execution method and device
US10866950B2 (en) Method and system for modifying a search request corresponding to a person, object, or entity (POE) of interest
US11853805B2 (en) Device and method of assigning a digital-assistant task to a mobile computing device in response to an incident
US20180060500A1 (en) Smart health activity scheduling
AU2015296833B2 (en) Providing notifications based on user activity data
CN104457955B (en) Body weight information acquisition method, apparatus and system
CN106461789B (en) For crossing over the system and technology that are controlled based on geography fence
CN110032670A (en) Method for detecting abnormality, device, equipment and the storage medium of time series data
US11862004B2 (en) Methods and systems for disabling sleep alarm based on automated wake detection
CN110321790A (en) The detection method and electronic equipment of a kind of pair of resisting sample
CA3062317A1 (en) Wearable electronic belt device
KR102860386B1 (en) Method and system for classifying time series data
US20150172441A1 (en) Communication management for periods of inconvenience on wearable devices
US20180220973A1 (en) Smart devices that capture images and sensed signals
KR20160079664A (en) Device and control method for controlling a wearable device
CN110017834B (en) Usage object determination method, usage object determination apparatus, and storage medium
CN105277193B (en) Prompt information output method, apparatus and system
CN112826514A (en) A classification method, device, terminal and storage medium for atrial fibrillation signals
CN109889858A (en) Information processing method, device and the computer readable storage medium of virtual objects
CN214854810U (en) Intelligent bracelet and electronic equipment
CN104991644A (en) Method and device for determining mobile terminal usage objects
Xenakis et al. Towards personality detection and prediction using smartphone sensor data
WO2022161003A1 (en) Skin care check-in method and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant