WO2016035464A1 - Procédé d'analyse, système et dispositif d'analyse - Google Patents
- Publication number: WO2016035464A1 (PCT application PCT/JP2015/070732)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- series data
- time
- sensor
- section
- behavior
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B69/00—Training appliances or apparatus for special sports
- A63B69/38—Training appliances or apparatus for special sports for tennis
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B71/00—Games or sports accessories not covered in groups A63B1/00 - A63B69/00
- A63B71/06—Indicating or scoring devices for games or players, or for other sports activities
Definitions
- the present disclosure relates to an analysis method, a system, and an analysis apparatus.
- Patent Document 1 proposes a swing analyzer that calculates swing feature information of a swing based on output data of a sensor unit including a motion sensor that detects the swing of an exercise device.
- The time-series data provided by a sensor is affected by the spatial positional relationship between the sensor and the object. That is, if the spatial positional relationship between the sensor and the object differs, the detected values of the provided time-series data differ, and the analysis result of the time-series data, for example the detection result of the behavior of the object, may also differ.
- According to the present disclosure, there is provided an analysis method including: acquiring first time-series data provided by a first sensor attached to a first object in a first spatial positional relationship, and second time-series data provided by a second sensor attached, in a second spatial positional relationship, to a second object of the same type as the first object; extracting a first section of the first time-series data corresponding to a first type of behavior generated in the first object, and a second section of the second time-series data corresponding to the first type of behavior generated in the second object; and calculating a correction parameter between the first time-series data and the second time-series data by comparing the first section and the second section.
- Further, according to the present disclosure, there is provided a system including: the first sensor attached to the first object in the first spatial positional relationship; the second sensor attached, in the second spatial positional relationship, to the second object of the same type as the first object; and an analysis device including a time-series data acquisition unit that acquires the first time-series data and the second time-series data, a section extraction unit that extracts a first section of the first time-series data corresponding to a first type of behavior generated in the first object and a second section of the second time-series data corresponding to the first type of behavior generated in the second object, and a correction parameter calculation unit that calculates a correction parameter between the first time-series data and the second time-series data by comparing the first section and the second section.
- Further, according to the present disclosure, there is provided an analysis device including: a time-series data acquisition unit that acquires first time-series data provided by a first sensor attached to a first object in a first spatial positional relationship, and second time-series data provided by a second sensor attached, in a second spatial positional relationship, to a second object of the same type as the first object; a section extraction unit that extracts a first section of the first time-series data corresponding to a first type of behavior generated in the first object, and a second section of the second time-series data corresponding to the first type of behavior generated in the second object; and a correction parameter calculation unit that calculates a correction parameter between the first time-series data and the second time-series data by comparing the first section and the second section.
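The flow recited above — extract matching sections, compare them, derive a correction parameter — can be sketched as follows. The linear gain/offset model and the least-squares fit are illustrative assumptions for the sketch; the disclosure does not fix a particular form for the correction parameter.

```python
import numpy as np

def extract_section(series, start, end):
    """Cut out the section of time-series data corresponding to a known
    behavior (e.g. a forehand-stroke play event); the indices are assumed
    to have been identified beforehand by reference analysis or tagging."""
    return series[start:end]

def correction_parameters(first_section, second_section):
    """Compare the two sections and fit a gain/offset pair that maps the
    second section onto the first by least squares. This linear model is
    an illustrative assumption, not the patent's specific formulation."""
    design = np.vstack([second_section, np.ones_like(second_section)]).T
    gain, offset = np.linalg.lstsq(design, first_section, rcond=None)[0]
    return gain, offset
```

Applying `gain * second + offset` would then bring the second sensor's data into the first sensor's frame before play-event detection.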
- an appropriate analysis result can be obtained by correcting time-series data provided by a plurality of sensors whose spatial positional relationship with an object may be different.
- FIG. 2 is a block diagram schematically showing the device configuration of the system according to the first embodiment of the present disclosure. FIG. 3 is a diagram for explaining an example of how the sensor device is attached to the racket in the first embodiment of the present disclosure. FIGS. 4A to 4C are diagrams for explaining other examples of how the sensor device is attached to the racket in the first embodiment of the present disclosure.
- FIG. 3 is a block diagram schematically illustrating a functional configuration of a server according to the first embodiment of the present disclosure.
- A flowchart illustrating an example of processing at the time of calculating a correction parameter according to the first embodiment of the present disclosure.
- A flowchart illustrating an example of a play event estimation process using a correction parameter according to the first embodiment of the present disclosure.
- A flowchart illustrating another example of a play event estimation process using a correction parameter according to the first embodiment of the present disclosure. A block diagram schematically showing the device configuration of a system according to a second embodiment of the present disclosure. A diagram showing a first example of a usage pattern in the second embodiment of the present disclosure. A diagram showing a second example of a usage pattern in the second embodiment of the present disclosure. A block diagram schematically showing the functional configuration of a server according to a third embodiment of the present disclosure. A flowchart illustrating an example of processing according to the third embodiment of the present disclosure. A block diagram showing a hardware configuration example of the analysis device according to an embodiment of the present disclosure.
- FIG. 1 is a diagram illustrating an example of a system configuration according to the first embodiment of the present disclosure.
- the system 10 includes a sensor device 100, a smartphone 200, and a server 300.
- the sensor device 100 is mounted on a tennis racket R.
- the sensor device 100 includes, for example, a motion sensor (for example, an acceleration sensor, a gyro sensor, a geomagnetic sensor, etc.). Based on the motion detection result provided by the sensor device 100, it is possible to define a play event that has occurred in the sport.
- the sensor device 100 may include a vibration sensor.
- the data detected by the vibration sensor can easily identify the section corresponding to the play event (for example, the section before and after the impact of the ball).
- the data detected by the vibration sensor may also be used for play event analysis in the same manner as the data detected by the motion sensor.
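As a rough sketch of how vibration data could isolate such a section, one might threshold the vibration magnitude and take a window around the first exceedance. The threshold and margin values here are arbitrary illustrations, not taken from the disclosure.

```python
import numpy as np

def impact_section(vibration, threshold=5.0, margin=50):
    """Locate the section before and after a ball impact by finding the
    first vibration sample whose magnitude exceeds a threshold, then
    taking a fixed margin of samples on either side."""
    idx = np.argmax(np.abs(vibration) > threshold)
    if np.abs(vibration[idx]) <= threshold:
        return None  # no sample exceeded the threshold: no impact found
    start = max(0, idx - margin)
    end = min(len(vibration), idx + margin)
    return start, end
```

The returned index pair could then delimit the motion-sensor samples handed to the play-event analysis.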
- the sensor device 100 may further include a sensor for acquiring environment information of a user who plays sports, such as temperature, humidity, brightness, and position. Data detected by various sensors included in the sensor device 100 is preprocessed as necessary, and then transmitted to the smartphone 200 by wireless communication such as Bluetooth (registered trademark).
- the smartphone 200 is disposed in the vicinity of a user who is playing sports, for example.
- The smartphone 200 receives data transmitted from the sensor device 100 through wireless communication such as Bluetooth (registered trademark), temporarily stores and processes the data as necessary, and then transmits the data to the server 300 through network communication.
- the smartphone 200 may receive the result of the analysis performed by the server 300 based on the transmitted data and output it to the user via a display, a speaker, or the like. Note that the user does not have to play sports when the analysis result is output.
- the output of the analysis result may be executed by an information processing terminal used by the user, for example, various personal computers or tablet terminals, a game machine, a television, or the like other than the smartphone 200.
- the smartphone 200 does not necessarily have to be arranged in the vicinity of the user who is playing sports.
- the sensor device 100 accumulates the detected data in an internal storage area (memory or external storage device).
- data may be transmitted from the sensor device 100 to the smartphone 200 by wireless communication such as Bluetooth (registered trademark).
- data may be transmitted when the sensor device 100 and the smartphone 200 are connected by wire via USB or the like.
- a removable recording medium may be used for data transfer from the sensor device 100 to the smartphone 200.
- the server 300 communicates with the smartphone 200 via the network, and receives data detected by various sensors included in the sensor device 100.
- the server 300 executes an analysis process using the received data, and generates various types of information regarding sports play.
- the server 300 defines a play event based on data obtained by a motion sensor and indicating a motion of a user playing a sport.
- a play event corresponds to one shot using a racket R, for example.
- By using play events, for example, a user's play represented by motion data can be grasped as a series of meaningful plays such as serve, stroke, and volley.
- the information generated by the analysis processing of the server 300 is transmitted to the smartphone 200, for example, and output to the user via the display or speaker of the smartphone 200.
- the server 300 may transmit information to an information processing terminal other than the smartphone 200 and output the information to the user.
- In addition, the server 300 may perform analysis processing based on data received for each of a plurality of users, generate information based on the result of comparing the play event patterns generated for each user, and transmit the information to the information processing terminal of each user.
- FIG. 2 is a block diagram schematically showing a device configuration of the system according to the first embodiment of the present disclosure.
- the sensor device 100 includes a sensor 110, a processing unit 120, and a transmission unit 130.
- the smartphone 200 includes a reception unit 210, a processing unit 220, a storage unit 230, a transmission unit 240, an imaging unit 250, an input unit 260, and an output unit 270.
- Server 300 includes a reception unit 310, a processing unit 320, a storage unit 330, and a transmission unit 340.
- a hardware configuration example (a hardware configuration example of a sensor device and an analysis device) for realizing each device will be described later.
- the processing unit 120 processes the data acquired by the sensor 110, and the transmission unit 130 transmits the processed data to the smartphone 200.
- the sensor 110 includes, for example, a motion sensor, and detects a motion of a user who plays sports.
- the sensor 110 may further include a vibration sensor, a sensor for acquiring user environment information, and the like.
- the processing unit 120 is realized by a processor that operates according to a program, and preprocesses data acquired by the sensor 110 as necessary.
- the preprocessing can include, for example, sampling and noise removal. Note that the preprocessing does not necessarily have to be executed.
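The sampling and noise-removal preprocessing mentioned above could, for instance, be a simple moving-average filter. The window size is an illustrative choice; the disclosure does not specify any particular filter.

```python
import numpy as np

def preprocess(raw_samples, window=5):
    """Smooth raw sensor samples with a moving average as a simple form
    of noise removal. Downsampling or other filters could equally apply."""
    kernel = np.ones(window) / window
    return np.convolve(raw_samples, kernel, mode="same")
```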
- the transmission unit 130 is realized by a communication device, and transmits data to the smartphone 200 using wireless communication such as Bluetooth (registered trademark).
- the sensor device 100 may include a storage unit for temporarily storing data.
- the reception unit 210 receives data transmitted by the sensor device 100, and the transmission unit 240 transmits data to the server 300.
- the reception unit 210 and the transmission unit 240 are realized by a communication device that performs wireless communication such as Bluetooth (registered trademark) and wired or wireless network communication.
- the received data is transmitted after being temporarily stored in the storage unit 230 by the processing unit 220, for example. Further, the processing unit 220 may perform preprocessing on the received data.
- the processing unit 220 is realized by a processor that operates according to a program, and the storage unit 230 is realized by a memory or a storage.
- the receiving unit 210 may further receive information transmitted from the server 300.
- the received information is output from the output unit 270 to the user according to the control of the processing unit 220, for example.
- the output unit 270 includes a display and a speaker, for example.
- an image may be acquired by the imaging unit 250.
- the imaging unit 250 is realized by, for example, a camera module that combines an imaging device with an optical system such as a lens.
- the image may include a user who plays sports as a subject.
- the image acquired by the imaging unit 250 is transmitted from the transmission unit 240 to the server 300 together with the data received by the reception unit 210, for example.
- the image may be used in the analysis process in the server 300 together with the data acquired by the sensor device 100, for example, or may be incorporated in information generated by the analysis process.
- the input unit 260 includes, for example, a touch panel, hardware buttons, and / or a microphone and a camera for receiving voice input and gesture input.
- the processing unit 220 may request information from the server 300 via the transmission unit 240 according to a user operation acquired via the input unit 260.
- the server 300 includes a reception unit 310, a processing unit 320, a storage unit 330, and a transmission unit 340.
- the receiving unit 310 is realized by a communication device, and receives data transmitted from the smartphone 200 using network communication such as the Internet.
- the processing unit 320 is realized by a processor such as a CPU, and processes received data. For example, the processing unit 320 may execute an analysis process on the received data, accumulate the analyzed data in the storage unit 330, or output the data via the transmission unit 340. Alternatively, the processing unit 320 may only execute accumulation and output control of data already analyzed in the smartphone 200 or the like.
- In the present embodiment, the analysis process using the data acquired by the sensor device 100 is executed by the processing unit 320 of the server 300, but the analysis process may instead be executed by the processing unit 220 of the smartphone 200 or by the processing unit 120 of the sensor device 100.
- Although the system 10 is described as including the sensor device 100, the smartphone 200, and the server 300, when the analysis process is executed by the processing unit 220 of the smartphone 200, for example, the server 300 does not have to be included in the system 10.
- the server 300 may store the information obtained by the analysis process and provide a service shared between users.
- In this case, the smartphone 200 and the server 300 may not be included in the system 10.
- the sensor device 100 may be a dedicated sensor device attached to a user or a tool, for example, or a sensor module mounted on a portable information processing terminal may function as the sensor device 100. Therefore, the sensor device 100 can be the same device as the smartphone 200.
- FIG. 3 is a diagram for describing an example of how to attach the sensor device to the racket according to the first embodiment of the present disclosure.
- the sensor device 100a is attached to the grip end GE of the racket R.
- the grip end GE is cylindrical, and the lower part of the sensor device 100a is fitted inside the grip end GE.
- For example, a claw or a convex portion is formed on the outer peripheral surface of the lower part of the sensor device 100a, and the sensor device 100a may be fixed to the grip end GE by engaging the claw or convex portion with a groove formed on the inner peripheral surface of the grip end GE.
- Such a hooking structure may be formed at a plurality of locations for convenience when the user attaches the sensor device 100a to the grip end GE. For example, when two claws or convex portions and grooves are formed at intervals of 180 degrees on the outer peripheral surface of the lower part of the sensor device 100a and on the inner peripheral surface of the grip end GE, the user can insert the lower part of the sensor device 100a inside the grip end GE by rotating the sensor device 100a at most 180 degrees. Likewise, when four claws or convex portions and grooves are formed at intervals of 90 degrees, the user can insert the lower part of the sensor device 100a inside the grip end GE by rotating the sensor device 100a at most 90 degrees.
- In such cases, a difference in the rotation direction ROT may occur in the spatial positional relationship between the sensor 110 included in the sensor device 100a attached to the grip end GE and the racket R. That is, the relationship between the coordinate system based on the sensor device 100a and the coordinate system based on the racket R is not necessarily uniquely determined, and may differ by 90 degrees, 180 degrees, or the like in the rotation direction ROT. Such a difference also affects the detection values of the sensor 110.
- For example, the detection values provided by the motion sensor of each sensor device 100a may be inverted along certain directions of the three axes (x-axis, y-axis, z-axis).
- FIGS. 4A to 4C are diagrams for explaining another example of how to attach the sensor device to the racket according to the first embodiment of the present disclosure.
- The sensor device 100a is attached to the grip end GE of the racket R as in the example of FIG. 3.
- The sensor device 100b is attached to the tip portion of the grip G of the racket R.
- the sensor device 100b is fixed by being wound around the tip portion of the grip G.
- The sensor device 100c is attached to the shaft S of the racket R.
- the sensor device 100c is fixed by being wound around the shaft S or sandwiching the shaft S.
- In these examples, there is a difference in the spatial positional relationship between the sensor 110 included in each sensor device and the object to which it is attached, that is, the racket R.
- This difference is not limited to the difference in angle in the rotation direction ROT around the grip end GE as in the example of FIG. 3 above; it also includes, for example, differences in rotation angle in other directions and differences in the distance to each part of the racket R.
- As a result, the detection values included in the time-series data provided by the motion sensor of each sensor device differ according to these spatial differences in position and rotation direction.
- FIG. 5 is a diagram for explaining examples of differences that may arise in the time-series data provided by the sensor 110 included in the sensor device 100 due to differences in the spatial positional relationship of the sensor device 100 with respect to the racket R, as described with reference to FIGS. 3 and 4A to 4C.
- FIG. 5 shows a waveform D0 detected by a certain sensor device 100, and waveforms D1 to D3 detected by other sensor devices 100 whose spatial positional relationships with respect to the racket R differ from that of the sensor device 100.
- the detected value of the waveform D1 is inverted with respect to the waveform D0.
- the waveform D2 has a smaller amplitude scale than the waveform D0.
- the waveform D3 is stretched in the time axis (t) direction compared to the waveform D0.
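The three kinds of discrepancy described for FIG. 5 can be reproduced synthetically; here `d0` is a stand-in reference waveform whose exact shape is invented for illustration.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 200)
# Reference waveform D0: a short oscillation centered around t = 0.5.
d0 = np.exp(-((t - 0.5) ** 2) / 0.005) * np.sin(40.0 * t)

d1 = -d0        # D1: detection values inverted relative to D0
d2 = 0.4 * d0   # D2: smaller amplitude scale than D0

# D3: stretched along the time axis by a factor of 1.5 relative to D0.
t3 = np.linspace(0.0, 1.5, 300)
d3 = np.interp(t3 / 1.5, t, d0)
```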
- the server 300 detects a play event that has occurred in the motion of the user who plays sports based on the detection result of the user's motion provided by the sensor device 100.
- A play event is detected, for example, by associating a pattern of motion sensor time-series data with the play event in advance, and determining that the play event has occurred when the pattern appears in the motion sensor time-series data provided from the sensor device 100.
- However, the pattern associated with a play event is defined for the time-series data (for example, the waveform D0) of the motion sensor included in a sensor device 100 attached to the racket R in a certain spatial positional relationship. Such a pattern may not be applicable to time-series data provided in a different spatial positional relationship (for example, the waveforms D1 to D3).
- It is difficult to quantitatively specify the relationship between differences in time-series data and the spatial positional relationship. Similarly, it is not realistic to define, for each possible spatial positional relationship of the sensor device 100 with respect to the racket R, a separate pattern of motion sensor time-series data corresponding to a play event.
- Therefore, in the present embodiment, the processing unit 320 acquires the first time-series data provided by the first sensor attached to a racket R in the first spatial positional relationship and the second time-series data provided by the second sensor attached to a racket R in the second spatial positional relationship, and calculates the correction parameter between the first time-series data and the second time-series data by comparing the first section of the first time-series data corresponding to the first type of behavior generated in the racket R with the second section of the second time-series data corresponding to the same behavior. By applying the calculated correction parameter, the occurrence of a play event can be detected in the same manner from the time-series data provided by a plurality of sensor devices 100 whose spatial positional relationships with the racket R differ.
- FIG. 6 is a block diagram schematically illustrating a functional configuration of the server according to the first embodiment of the present disclosure.
- the processing unit 320 of the server 300 includes a time-series data acquisition unit 321, a section extraction unit 322, a correction parameter calculation unit 323, a similarity calculation unit 324, and a class estimation unit 325. All of these functional configurations are realized by software, for example, by a processor that implements the processing unit 320.
- the storage unit 330 of the server 300 stores the reference data 331, and the correction parameter database 332 can be constructed.
- the illustrated functional configuration will be further described below.
- the time series data acquisition unit 321 acquires sensor data provided by the sensor device 100 via the reception unit 310.
- the sensor data includes time series data acquired by the sensor 110 included in the sensor device 100.
- the time series data may be preprocessed.
- the time series data acquisition unit 321 may acquire data from the reference data 331 as necessary.
- the reference data 331 is data that is generated based on, for example, one or more sensor data acquired in advance and is used as a reference for detecting a play event.
- the time-series data acquisition unit 321 acquires at least two time-series data.
- the time-series data acquisition unit 321 may acquire sensor data provided from two or more sensor devices 100.
- the two or more sensor devices 100 may include the sensor devices 100 mounted on the racket R in different spatial positional relationships.
- the time-series data acquisition unit 321 may acquire sensor data provided from at least one sensor device 100 and at least one reference data 331.
- Since the reference data 331 is generated based on previously acquired sensor data, it reflects the spatial positional relationship with the racket R of the sensor device 100 that provided the original sensor data. This positional relationship may differ from the spatial positional relationship with the racket R of the sensor device 100 that newly provides sensor data.
- As described above, the time-series data acquisition unit 321 acquires the first time-series data provided by the first sensor (the sensor 110 included in a sensor device 100) attached to the first object (the racket R) in the first spatial positional relationship, and the second time-series data provided by the second sensor (which may likewise be the sensor 110 of a sensor device 100, or may be reflected in the reference data 331) attached to the second object in the second spatial positional relationship.
- the first object and the second object may be the same type but different individuals or the same individual (when the sensor device 100 is replaced with the same racket R).
- the first spatial positional relationship and the second spatial positional relationship may be different as in the example of the sensor devices 100a to 100c described above, but may be the same by chance.
- the section extraction unit 322 extracts a section necessary for the processing of the correction parameter calculation unit 323 from the time series data acquired by the time series data acquisition unit 321. Similarly, the section extraction unit 322 may extract a section necessary for the processing of the similarity calculation unit 324 and the class estimation unit 325. More specifically, the section necessary for the processing of the correction parameter calculation unit 323 is a section corresponding to the same type of play event in the time series data acquired by the time series data acquisition unit 321. That is, for example, the section extraction unit 322 extracts a section corresponding to the play event of “forehand stroke” from the time-series data provided by the sensor device 100 mounted on each of the plurality of rackets R.
- That is, the section extraction unit 322 extracts the first section corresponding to the first type of behavior (the forehand stroke) generated in the first object (a racket R), and the second section corresponding to the same type of behavior generated in the second object. It is assumed here that each time-series data acquired by the time-series data acquisition unit 321 includes at least one section corresponding to the same type of play event (the forehand stroke in the above example).
- Such a section is already specified based on the result of the analysis performed in advance for the reference data 331, for example.
- For time-series data newly acquired from the sensor device 100 via the reception unit 310, a section corresponding to a play event is specified based on information such as a tag input by the user while referring to the play image, either in real time during play or afterwards.
- the correction parameter calculation unit 323 calculates a correction parameter between time series data by comparing the sections extracted by the section extraction unit 322.
- the correction parameter is used to correct at least one of the first time-series data and the second time-series data acquired by the time-series data acquisition unit 321 so as to approach each other.
- For example, the correction parameter is used, with any one of the plurality of time-series data acquired by the time-series data acquisition unit 321 serving as reference time-series data, to correct differences in the other time-series data caused by the spatial positional relationship between the sensor device 100 and the racket R. More specifically, when the time-series data acquisition unit 321 acquires data from the reference data 331, the correction parameter is used to correct the difference between the time-series data newly provided from the sensor device 100 via the reception unit 310 and the time-series data included in the reference data 331.
- In this case, after the time-series data newly provided from the sensor device 100 is corrected, a section similar to a play event occurrence section already specified in the time-series data included in the reference data 331 can be identified as a play event occurrence section in the newly provided time-series data.
- When time-series data provided from each of a plurality of sensor devices 100 are compared, the correction parameter is used to correct the differences between the respective time-series data. As described above, even for newly acquired time-series data, a section corresponding to a play event can be specified based on information such as a tag input by the user while referring to the play image, in real time during play or afterwards. If the time-series data provided from each of the plurality of sensor devices 100 can be compared after the differences due to the spatial positional relationship between the sensor device 100 and the racket R are corrected, then, between the first time-series data and the second time-series data acquired by the time-series data acquisition unit 321, it is possible both to estimate that a play event specified in a certain section of the first time-series data also occurred in a similar section of the second time-series data and, conversely, to estimate that a play event specified in a certain section of the second time-series data also occurred in a similar section of the first time-series data. Such processing can be said to share a play event between sections having a high degree of similarity between time-series data.
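One hedged sketch of this play-event sharing, assuming correction parameters of a simple gain/offset form and correlation as the similarity measure (neither is mandated by the disclosure):

```python
import numpy as np

def correct(series, gain, offset):
    """Remove the mounting-dependent difference using previously
    calculated correction parameters (hypothetical linear model)."""
    return gain * series + offset

def share_play_events(labeled_sections, labels, unlabeled_sections, threshold=0.9):
    """For each unlabeled section, copy the label of the most similar
    labeled section when the correlation exceeds the threshold."""
    shared = []
    for candidate in unlabeled_sections:
        best_r, best_label = 0.0, None
        for section, label in zip(labeled_sections, labels):
            n = min(len(section), len(candidate))
            r = np.corrcoef(section[:n], candidate[:n])[0, 1]
            if r > best_r:
                best_r, best_label = r, label
        # Sections below the similarity threshold receive no label.
        shared.append(best_label if best_r >= threshold else None)
    return shared
```

The sharing works in both directions: whichever time series carries the user-specified label, similar corrected sections of the other series inherit it.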
- the correction parameters calculated by the correction parameter calculation unit 323 as described above are stored in the correction parameter database 332.
- the correction parameter is recorded in association with the ID of the sensor device 100.
- This makes it possible to apply the already calculated correction parameter to time-series data subsequently provided from the same sensor device 100.
- Note that the association between the ID of the sensor device 100 and the correction parameter in the correction parameter database 332 may be reset after a predetermined time elapses or when setting information related to the sensor device 100 (for example, the type of the racket R) changes.
- the calculated correction parameter is used by the section extraction unit 322, for example.
- the section extraction unit 322 reads the correction parameter from the correction parameter database 332 for the sensor device 100 for which the correction parameter has already been calculated, and corrects the time-series data when extracting the section.
- For example, after removing, from the time-series data newly provided from the sensor device 100, the influence of the spatial positional relationship between the racket R and the sensor device 100, the section extraction unit 322 can compare the corrected data with the time-series data included in the reference data 331 and extract sections in which similar patterns appear, for processing by the similarity calculation unit 324 and the class estimation unit 325.
- Alternatively, the section extraction unit 322 may remove the influence of the spatial positional relationship between the racket R and the sensor device 100 from the time-series data provided from each of the plurality of sensor devices 100, and extract sections in which mutually similar patterns appear.
- The similarity calculation unit 324 calculates the similarity between time-series data for the sections extracted by the section extraction unit 322 as described above. For a section extracted from a plurality of time-series data whose similarity exceeds a predetermined threshold, if the play event corresponding to that section has been identified in any one of the time-series data, the class estimation unit 325 estimates that the play event also occurred in that section of the other time-series data. In this case, the play event can be said to be a class corresponding to a section of the time-series data.
- when the section extraction unit 322 extracts, from the time-series data of the sensor device 100, a section in which a pattern common to the reference data 331 appears, and the similarity of that section calculated by the similarity calculation unit 324 is high,
- the class estimation unit 325 estimates that a play event already identified as having occurred in the corresponding section of the reference data 331 has also occurred in the corresponding section of the time-series data of the sensor device 100.
- when the section extraction unit 322 extracts a section in which a common pattern appears among the time-series data acquired from each of the plurality of sensor devices 100, and the similarity of that section calculated by the similarity calculation unit 324 is high,
- the class estimation unit 325 estimates that a play event specified by user-input information for any one of the time-series data has also occurred in the corresponding section of the other time-series data.
- each functional configuration is realized in the server 300, but the embodiment of the present disclosure is not limited to such an example.
- the analysis processing in the system 10 may be executed by the processing unit 220 of the smartphone 200 or may be executed by the processing unit 120 of the sensor device 100. Therefore, the functional configuration described above may also be realized by the smartphone 200 or the sensor device 100 instead of the server 300. In addition, the functional configuration described above may be realized by being distributed to the sensor device 100, the smartphone 200, and / or the server 300.
- the correction parameter calculation unit 323 realized in the processing unit 320 of the server 300 calculates correction parameters between time series data.
- as a method for calculating the distance between time series, for example, the DTW (Dynamic Time Warping) distance can be used.
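- as an illustrative sketch (not part of the claimed method; function and variable names are ours), the DTW distance between two one-dimensional time series can be computed with the standard dynamic-programming recurrence:

```python
import math

def dtw_distance(x, y):
    """Compute the DTW distance between two 1-D time series.

    Cell d[i][j] holds the minimum cumulative cost of aligning
    x[:i] with y[:j]; complexity is O(N*M) in time and space.
    """
    n, m = len(x), len(y)
    d = [[math.inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])  # local distance between samples
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

- for identical series the distance is 0, and a time-shifted but otherwise similar pattern also yields a small distance, which is why DTW suits the comparison of sections whose timing differs between sensors.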
- the correction parameter calculation unit 323 calculates the correction parameter θ so as to minimize the DTW distance between time series calculated by the method described above, for example.
- a loss function including the parameter θ is used.
- T θ represents a rotation transformation applied to the vector.
- in addition, the operator corresponding to T θ may apply a scalar multiplication to the vector.
- the range of θ may be set according to the nature of the sensor data. For example, a range may be set for θ so that the sign of the vector is not inverted.
- by differentiating DTW(θ) with respect to θ and optimizing it, a distance measure that brings the pair of time-series data x_i and y_i closest can be learned.
- the differentiation here can be calculated, for example, as in the following formulas 5 and 6.
- the part below the summation sign in Equation 6 can be calculated with a computational complexity of O(NM).
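- the gradient-based optimization of DTW(θ) is specified here only at the level of Equations 5 and 6; as a hedged stand-in, the following sketch fits a 2-D rotation parameter θ by a simple grid search over candidate angles (the 2-D restriction, the grid resolution, and all names are our assumptions, not the claimed procedure):

```python
import math

def rotate(series, theta):
    """Apply the rotation operator T_theta to each 2-D sample."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * vx - s * vy, s * vx + c * vy) for vx, vy in series]

def dtw2(x, y):
    """DTW distance for 2-D vector series (Euclidean local cost)."""
    n, m = len(x), len(y)
    d = [[math.inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(x[i - 1], y[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def fit_theta(x, y, steps=360):
    """Return the theta in [-pi, pi) minimizing DTW(x, rotate(y, theta))."""
    thetas = [-math.pi + 2 * math.pi * k / steps for k in range(steps)]
    return min(thetas, key=lambda t: dtw2(x, rotate(y, t)))
```

- a derivative-based optimizer, as the text describes, would replace the grid search but minimize the same objective.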
- the correction parameter calculation method described above is merely an example; any method may be used as long as it can calculate a correction parameter capable of correcting, between a plurality of time-series data, the difference caused by the spatial positional relationship between the sensor device 100 and the racket R.
- FIG. 7 is a flowchart illustrating an example of processing at the time of calculating a correction parameter according to the first embodiment of the present disclosure.
- the server 300 performs processing for calculating correction parameters for time series data by comparing the time series data newly provided from the sensor device 100 with the reference data 331.
- the time-series data acquisition unit 321 receives the sensor data provided by the sensor device 100 (S101).
- the time series data acquisition unit 321 reads data from the reference data 331 (S103).
- the section extraction unit 322 determines whether there is a section associated with the same play event between the received sensor data and the reference data (S105).
- a play event that has occurred in each section is already specified based on the result of an analysis performed in advance.
- a play event is specified for at least one section based on information such as a tag input by the user.
- if such a section exists, the section extraction unit 322 extracts the section (S107). Further, the correction parameter calculation unit 323 calculates a correction parameter between the time-series data by comparing the extracted sections (S109).
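- the flow of Fig. 7 (S101-S109) can be sketched as follows; this toy version reduces the correction parameter to a single amplitude-scale value and represents each section as an index range keyed by a play-event label, both of which are our simplifying assumptions:

```python
def calculate_correction_parameter(sensor, reference):
    """Sketch of the Fig. 7 flow (S101-S109), simplified to a scalar
    amplitude-scale parameter; a real correction would also cover
    rotation and timing.  'events' maps a play-event label to the
    (start, end) sample indices of its section.
    """
    # S105: look for a section associated with the same play event
    shared = set(sensor["events"]) & set(reference["events"])
    if not shared:
        return None  # no common section, so no parameter (S105: NO)
    event = sorted(shared)[0]
    # S107: extract the section from both series
    s0, s1 = sensor["events"][event]
    r0, r1 = reference["events"][event]
    sensor_sec = sensor["series"][s0:s1]
    ref_sec = reference["series"][r0:r1]
    # S109: compare sections; here, the ratio of mean absolute amplitude
    scale = (sum(abs(v) for v in ref_sec) / len(ref_sec)) / \
            (sum(abs(v) for v in sensor_sec) / len(sensor_sec))
    return {"event": event, "scale": scale}
```

- multiplying the new sensor data by the returned scale would bring its amplitude close to the reference data, in the spirit of the correction described above.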
- FIG. 8 is a flowchart illustrating an example of a play event estimation process using the correction parameter according to the first embodiment of the present disclosure.
- the server 300 executes processing to estimate the play event indicated by the sensor data, by correcting the sensor data newly provided from the sensor device 100 using the correction parameter and comparing it with the reference data 331.
- the time-series data acquisition unit 321 receives the sensor data (S101) and reads the reference data 331 (S103).
- the section extraction unit 322 corrects the sensor data based on the correction parameter (S121), and determines whether there is a section in the sensor data for which the occurrence of a play event can be newly estimated (S123).
- a section in which occurrence of a play event can be newly estimated is a section of sensor data in which the corrected pattern is similar to the reference data 331 but is not yet associated with the play event.
- if such a section exists, a play event is estimated for it (S125). More specifically, the similarity calculation unit 324 calculates the similarity between the reference data 331 and the sensor data in the section, and if the similarity exceeds a threshold, the class estimation unit 325 estimates that the same type of play event as in the reference data 331 has occurred in the received sensor data.
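- the similarity test and class estimation of S123-S125 can be sketched as follows, assuming equal-length, already corrected sections and a cosine similarity with an illustrative threshold (both the measure and the threshold are our assumptions, not the claimed ones):

```python
import math

def cosine_similarity(a, b):
    """Similarity between two equal-length sections (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def estimate_play_event(section, reference_sections, threshold=0.9):
    """Sketch of S123-S125: return the reference play-event label whose
    section is most similar to the (already corrected) sensor section,
    or None if no similarity exceeds the threshold."""
    best_label, best_sim = None, threshold
    for label, ref in reference_sections.items():
        sim = cosine_similarity(section, ref)
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label
```

- a DTW-based similarity, as discussed earlier, could be substituted for the cosine measure without changing the thresholding logic.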
- FIG. 9 is a flowchart illustrating another example of the play event estimation process using the correction parameter according to the first embodiment of the present disclosure.
- the server 300 corrects the sensor data provided from each of the plurality of sensor devices 100 so that the differences between them are reduced, and compares them with each other, thereby executing a process that estimates that a play event specified in certain sensor data has also occurred in the other sensor data.
- the time-series data acquisition unit 321 receives sensor data from each of the plurality of sensor devices 100 (S101).
- the sensor data may be transmitted from the plurality of sensor devices 100 in parallel in real time, or may be transmitted with a time difference.
- the time-series data acquisition unit 321 may receive the first sensor data, temporarily store it in the memory, and then receive the second sensor data.
- the section extraction unit 322 corrects each sensor data based on the correction parameter (S143), and determines whether there is a similar section between the sensor data (S145).
- a play event in a section having a high similarity is shared between the sensor data (S147).
- more specifically, the similarity calculation unit 324 calculates the similarity between the sensor data in the section, and if the similarity exceeds a threshold, the class estimation unit 325 estimates that the play event specified in one sensor data has also occurred in the other sensor data.
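- the event sharing of S145-S147 can be sketched as a label-propagation step; the stream representation and the injected similarity function below are illustrative assumptions:

```python
def share_play_events(streams, similarity, threshold=0.9):
    """Sketch of S145-S147: propagate a play-event label identified in one
    sensor stream to the corresponding section of the others.  Each stream
    is {'section': [...], 'event': label or None}; 'similarity' is any
    pairwise measure, e.g. a cosine or DTW-based similarity."""
    labeled = [s for s in streams if s["event"] is not None]
    for src in labeled:
        for dst in streams:
            if dst["event"] is None and \
               similarity(src["section"], dst["section"]) > threshold:
                dst["event"] = src["event"]  # S147: share the play event
    return streams
```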
- the object to which the sensor is attached is the racket R that is a tool used by a user who plays sports.
- the embodiment of the present disclosure is not limited to such an example, and the sensor may be attached to a tool other than the racket R, such as a golf club, a baseball bat, shoes, or wear. Further, for example, the sensor may be attached to the user himself / herself instead of the tool.
- the behavior of the object detected in the first embodiment is associated with a play event that occurs in sports.
- the play event is not limited to the shot event in the above example of tennis, and any play event may be used as long as it is a series of motions that are grasped with some meaning during sports play.
- in the second embodiment, the sensor is mounted on a mobile device such as a smartphone instead of a dedicated sensor device, and is used for purposes different from the first embodiment, such as user behavior detection and navigation.
- alternatively, a sensor device such as a wearable device may be used (that is, with the same system configuration as the first embodiment), and functions described in the present embodiment, such as user behavior detection and navigation, may be provided.
- a wearable device having an input / output function may be used instead of a mobile device such as a smartphone described in the present embodiment.
- FIG. 10 is a block diagram schematically illustrating a device configuration of a system according to the second embodiment of the present disclosure.
- system 20 includes a smartphone 400 and a server 500.
- Smartphone 400 includes a sensor 405, a reception unit 410, a processing unit 420, a storage unit 430, a transmission unit 440, an imaging unit 450, an input unit 460, and an output unit 470.
- Server 500 includes a reception unit 510, a processing unit 520, a storage unit 530, and a transmission unit 540.
- the processing unit 420 processes the data acquired by the sensor 405, and the transmission unit 440 transmits the processed data to the server 500.
- the sensor 405 includes a motion sensor, for example, similarly to the sensor 110 of the sensor device 100 in the first embodiment.
- the sensor 405 may further include a vibration sensor, a sensor for acquiring environmental information, and the like. An example of data acquired by the sensor 405 in this embodiment will be described later.
- the processing unit 420 is realized by a processor such as a CPU that operates according to a program, and preprocesses data acquired by the sensor 405 as necessary.
- the preprocessing can include, for example, sampling and noise removal. Note that the preprocessing is not necessarily executed.
- the smartphone 400 can also realize the same functions as the smartphone 200 in the first embodiment. That is, the reception unit 410, the processing unit 420, the storage unit 430, the transmission unit 440, the imaging unit 450, the input unit 460, and the output unit 470 are configured by the same hardware as the reception unit 210, the processing unit 220, the storage unit 230, the transmission unit 240, the imaging unit 250, the input unit 260, and the output unit 270 of the smartphone 200 according to the first embodiment, and can realize the same functions.
- the receiving unit 410 does not receive data from the sensor device, but mainly receives data from the server 500.
- the server 500 includes a reception unit 510, a processing unit 520, a storage unit 530, and a transmission unit 540.
- the receiving unit 510 is realized by a communication device, and receives data transmitted from the smartphone 400 using network communication such as the Internet.
- the processing unit 520 is realized by a processor such as a CPU, for example, and processes received data. For example, the processing unit 520 may execute an analysis process on the received data, accumulate the analyzed data in the storage unit 530, or output the data via the transmission unit 540. Alternatively, the processing unit 520 may only execute accumulation and output control of data already analyzed in the smartphone 400.
- the processing unit 520 can realize a functional configuration similar to that of the server 300 in the first embodiment. That is, the processing unit 520 can include a time-series data acquisition unit 321, a section extraction unit 322, a correction parameter calculation unit 323, a similarity calculation unit 324, and a class estimation unit 325. Further, the storage unit 530 of the server 500 may store the reference data 331, and the correction parameter database 332 may be constructed.
- the class estimated by the class estimation unit 325 may be a user action or a vehicle state used in navigation, not a play event.
- FIG. 11 is a diagram illustrating a first example of a usage pattern in the second embodiment of the present disclosure.
- the smartphones 400a and 400b are carried by the user in different states. More specifically, the smartphone 400a is stored in a chest pocket of a shirt worn by the user. Moreover, the smart phone 400b is accommodated in the bottom pocket of the pants which the user is wearing. Between these smartphones 400a and 400b, there is a difference in the spatial positional relationship between the sensor 405 included in each smartphone and the object to which the sensor 405 is attached, that is, the user's body.
- between the time-series data provided by the motion sensors included in each sensor 405, differences in the direction of detected values (spatial rotation direction), differences in time, differences in the amplitude of detected values, and the like may occur.
- the processing unit 520 calculates a correction parameter between the first time-series data, provided by a first sensor (for example, the sensor 405 of the smartphone 400a) mounted on the user's body in a first spatial positional relationship, and the second time-series data, provided by a second sensor (for example, the sensor 405 of the smartphone 400b) mounted on the user's body in a second spatial positional relationship, by comparing a first section of the first time-series data with the corresponding second section of the second time-series data.
- by detecting the user's behavior after correcting the motion-sensor time-series data using this correction parameter, the user's behavior can be similarly detected from the time-series data provided by a plurality of smartphones 400 having different spatial positional relationships with the user's body.
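- as a toy illustration of this first example (our assumptions: 2-D samples, a rotation-correction angle θ already calculated as above, and a naive posture heuristic based on which corrected axis carries the gravity component; labels and thresholds are illustrative):

```python
import math

def correct_series(samples, theta):
    """Apply the rotation correction parameter theta to 2-D samples."""
    c, s = math.cos(theta), math.sin(theta)
    return [(c * x - s * y, s * x + c * y) for x, y in samples]

def detect_behavior(samples, theta):
    """After correction, compare the mean per-axis values: a dominant
    'vertical' (second) component is labeled 'standing', a dominant
    'horizontal' (first) component 'lying'.  Labels are illustrative."""
    corrected = correct_series(samples, theta)
    mean_x = sum(v[0] for v in corrected) / len(corrected)
    mean_y = sum(v[1] for v in corrected) / len(corrected)
    return "standing" if abs(mean_y) > abs(mean_x) else "lying"
```

- the same accelerometer samples thus yield the same behavior label regardless of how the device is carried, once the per-device correction angle is applied.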
- the first example is not necessarily limited to the case of the smartphone 400, and is the same even when a sensor is mounted on the wearable device instead of the smartphone 400, for example.
- depending on whether the wearable device is worn on the user's head (eyewear), worn on the wrist, or attached to clothes, there is a difference in the spatial positional relationship between the sensor mounted on the wearable device and the user's body. Even in such a case, the user's behavior can be similarly detected from the time-series data provided by each device.
- the detection of the user's behavior in the first example may be implemented by comparing the reference data 331 with the sensor data and applying a behavior recognized in the reference data to the sensor data.
- alternatively, the detection of the behavior in the above example may be implemented by comparing the sensor data provided by different smartphones 400 and applying a behavior recognized in one sensor data to the other sensor data in which the behavior is not recognized.
- further, the behavior recognized from the sensor data having lower reliability may be corrected using the behavior recognized from the sensor data having higher reliability.
- FIG. 12 is a diagram illustrating a second example of a usage pattern according to the second embodiment of the present disclosure.
- the smartphone 400 is installed in a holder attached to a dashboard of an automobile.
- the smartphone 400 is used to provide navigation for a user driving a car.
- the attachment angle and height of the holder can be adjusted so that the user in the driver's seat can easily see the display.
- the structure of the dashboard to which the smartphone 400 is attached via the holder varies depending on the vehicle type. Therefore, the spatial positional relationship between the sensor 405 included in the smartphone and the object to which the sensor 405 is attached, that is, the automobile, varies depending on the type of the automobile, the user's physique and preferences, and the like.
- further, since the smartphone 400 is removed when the user gets off, the spatial positional relationship between the sensor 405 and the automobile may differ for each ride, even for the same automobile and the same user.
- when providing navigation for a user who drives a car, the smartphone 400 generally detects the current location using a GPS (Global Positioning System) receiver.
- using the sensor data provided by the sensor 405, the smartphone 400 can autonomously detect a vehicle state such as running, stopping, or turning.
- however, if there are differences in the direction of detected values (spatial rotation direction), temporal differences, or differences in the amplitude of detected values in the time-series data, it becomes difficult to detect the vehicle state.
- in the server 500, the processing unit 520 compares a first section of the first time-series data, provided by a first sensor 405 mounted on the automobile in a first spatial positional relationship and corresponding to a first type of behavior occurring in the automobile (running, stopping, turning, etc.), with a second section of the second time-series data, provided by a second sensor 405 mounted on the automobile in a second spatial positional relationship and corresponding to the same behavior, thereby calculating a correction parameter between the two time-series data. By detecting the vehicle state after correcting the motion-sensor time-series data using this correction parameter, the vehicle state can be similarly detected from the time-series data provided by a plurality of smartphones 400 having different spatial positional relationships with the automobile.
- the above second example is not necessarily limited to the case of the smartphone 400, and is the same when, for example, a dedicated navigation device is installed instead of the smartphone 400. Also in this case, if the attachment angle and height of the main body of the navigation device are adjusted, the same state as the example of the smartphone 400 is generated. However, the navigation device does not have to be removed when getting off. Further, the detection of the vehicle state in the second example described above is performed by comparing the reference data 331 and the sensor data, or between the sensor data, similarly to the example described in the first example and the first embodiment. It can be implemented by comparison.
- a sensor mounted on a smartphone is used for calculating an integrated amount based on sensor data.
- This embodiment may be combined with the second embodiment, and in that case, the integrated amount may be used in user behavior detection or navigation.
- a wearable device, a navigation device, or the like may be used instead of the smartphone or together with the smartphone, as in the second embodiment.
- FIG. 13 is a block diagram schematically illustrating a functional configuration of a server according to the third embodiment of the present disclosure.
- the processing unit 520 of the server 500 includes a time series data acquisition unit 521, a section extraction unit 522, a correction parameter calculation unit 523, and an integrated amount calculation unit 524. All of these functional configurations are realized in software by a processor that implements the processing unit 520, for example. Further, the reference data 531 can be stored in the storage unit 530 of the server 500. As described above, this embodiment may be combined with the second embodiment. Therefore, in the processing unit 520, the functional configuration of the processing unit 320 of the server 300 illustrated in FIG. 6 may be realized in addition to the functional configuration illustrated in FIG. The illustrated functional configuration will be further described below.
- the time series data acquisition unit 521 acquires the data provided by the smartphone 400 via the reception unit 510.
- the data includes time series data acquired by the sensor 405 included in the smartphone 400.
- the time series data may be preprocessed.
- the time series data acquisition unit 521 may acquire data from the reference data 531 as necessary.
- the reference data 531 is data that is generated based on, for example, one or more sensor data acquired in advance and is used as a reference for calculating the integrated amount.
- the time-series data acquisition unit 521 acquires at least two time-series data.
- the time-series data acquisition unit 521 acquires sensor data provided from at least one smartphone 400 and at least one reference data 531.
- since the reference data 531 is generated based on sensor data acquired in advance, it reflects the spatial positional relationship with the user's body or the automobile of the smartphone 400 that provided the original sensor data. This positional relationship may differ from the spatial positional relationship with the user's body or the automobile of the smartphone 400 that has newly provided sensor data.
- the time-series data acquisition unit 521 may acquire sensor data provided from two or more smartphones 400.
- the two or more smartphones 400 may include the smartphones 400 that are attached to the user's body or automobile in different spatial positional relationships.
- as described above, the time-series data acquisition unit 521 acquires the first time-series data provided by a first sensor (the sensor 405 included in a smartphone 400) attached to a first object (the user's body or an automobile) in a first spatial positional relationship, and the second time-series data provided by a different sensor 405 attached to a second object in a second spatial positional relationship.
- the first object and the second object may be different individuals of the same type, or may be the same individual (for example, when the same user changes the way the smartphone 400 is carried, or when the smartphone 400 is once taken out of the car and reinstalled).
- the first spatial positional relationship and the second spatial positional relationship may be different, but may be the same by chance.
- the section extraction unit 522 extracts a section necessary for the processing of the correction parameter calculation unit 523 from the time series data acquired by the time series data acquisition unit 521. More specifically, the section necessary for the processing of the correction parameter calculation unit 523 is a section where the same kind of behavior of the object has occurred in the time series data acquired by the time series data acquisition unit 521. That is, for example, the section extraction unit 522 extracts a section in which an action such as walking, running, or sitting is detected from time-series data provided by the smartphone 400 carried by the user. In addition, for example, the section extraction unit 522 extracts a section in which a vehicle state such as travel, stop, or turn is detected from time-series data provided by the smartphone 400 installed in the automobile.
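- the section extraction described above can be sketched, for a one-dimensional signal, as finding contiguous runs where the magnitude stays below a threshold (for example, candidate "stopped" sections); the threshold and the stop heuristic are illustrative assumptions of ours:

```python
def extract_behavior_sections(series, threshold=0.1):
    """Sketch of the section extraction unit 522: return (start, end)
    index pairs of contiguous runs where the signal magnitude stays
    below the threshold (candidate 'stopped' sections)."""
    sections, start = [], None
    for i, v in enumerate(series):
        if abs(v) < threshold:
            if start is None:
                start = i  # a candidate section begins here
        elif start is not None:
            sections.append((start, i))  # section ends at index i (exclusive)
            start = None
    if start is not None:
        sections.append((start, len(series)))  # section runs to the end
    return sections
```

- a real implementation would use richer behavior detectors (walking, running, turning), but the output shape, a list of index ranges per behavior, is the same.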
- in other words, it can be said that the section extraction unit 522 extracts the first section of the first time-series data corresponding to a first type of behavior (the user's behavior or a vehicle state) occurring in the first object (the user's body or the automobile), and the second section of the second time-series data corresponding to the same type of behavior occurring in the second object (the user's body or the automobile).
- here, it is assumed that the time-series data acquired by the time-series data acquisition unit 521 includes at least one set of sections corresponding to behaviors of the same type (the user's behavior or a vehicle state in the above example). Such a section is already specified, for example for the reference data 531, based on the result of an analysis performed in advance.
- a section corresponding to the behavior is specified by the user's manual input or by data analysis. For example, when high-precision position detection means such as GPS is available on the smartphone 400, the movement state of the user or the vehicle may be estimated from the history of position information, which makes it easy to specify the section in which the behavior occurs.
- the correction parameter calculation unit 523 calculates a correction parameter between time series data by comparing the sections extracted by the section extraction unit 522.
- the correction parameter is used, for example, by taking any one of the plurality of time-series data acquired by the time-series data acquisition unit 521 as reference time-series data, to correct, in the other time-series data, the differences caused by the spatial positional relationship between the smartphone 400 and the user's body or the automobile, and to bring the other time-series data closer to the reference time-series data.
- the correction parameters are used to correct differences between the time-series data newly provided from the smartphone 400 via the reception unit 510 and the time-series data included in the reference data 531. Since the integrated amount calculated by the integrated amount calculation unit 524 described later is calibrated to yield a correct value with respect to the reference data 531, an integrated amount closer to the correct value can be calculated by first bringing the time-series data newly provided from the smartphone 400 closer to the reference data 531 by correction and then calculating the integrated amount.
- the correction parameter is used to correct a difference between the respective time-series data.
- the time-series data acquired from the plurality of smartphones 400 may include data provided by the smartphones 400 having various spatial positional relationships with objects (user's body or automobile).
- if some time-series data contains a section in which the integrated amount can be verified using highly accurate position detection means such as GPS, the parameters used by the integrated amount calculation unit 524 can be adjusted so that the integrated amount is correctly calculated for that time-series data. Then, if the integrated amount is calculated after the other time-series data is brought close to that time-series data by correction, an integrated amount closer to the correct value can be calculated.
- the integrated amount calculation unit 524 calculates an integrated amount based on time series data.
- the integrated amount represents, for example, a spatial movement amount of the user or the automobile.
- the integrated amount calculation unit 524 calculates the integrated amount after correcting the time-series data in accordance with the correction parameter calculated by the correction parameter calculation unit 523. For example, when the first time-series data is the reference data, the integrated amount calculation unit 524 corrects the second time-series data so that it approaches the first time-series data, and then calculates the integrated amount based on the corrected second time-series data.
- in this case, the integrated amount calculation unit 524 does not have to calculate the integrated amount for the first time-series data.
- alternatively, the integrated amount calculation unit 524 may adjust the parameters for calculating the integrated amount and calculate the integrated amount for each time-series data.
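- a minimal sketch of the integrated amount calculation, assuming a scalar amplitude correction parameter and a speed-like series integrated by the trapezoidal rule (both simplifications are ours; for a speed series the result corresponds to the distance travelled):

```python
def integrated_amount(series, dt, scale=1.0):
    """Sketch of the integrated amount calculation unit 524: correct the
    time series with a scalar amplitude parameter ('scale', a simplified
    stand-in for the correction parameter), then integrate by the
    trapezoidal rule over samples spaced dt seconds apart."""
    corrected = [scale * v for v in series]
    total = 0.0
    for a, b in zip(corrected, corrected[1:]):
        total += 0.5 * (a + b) * dt  # area of one trapezoid
    return total
```

- calibrating 'scale' against a GPS-verified section, as described above, is what lets the same routine return a near-correct value for data from differently mounted devices.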
- FIG. 14 is a flowchart illustrating an example of processing according to the third embodiment of the present disclosure.
- in the server 500, the time-series data newly provided from the smartphone 400 is corrected based on the correction parameter, brought close to the reference data 531, and then the process of calculating the integrated amount is executed.
- the time-series data acquisition unit 521 receives time-series data provided by the smartphone 400 (S201).
- the time series data acquisition unit 521 reads data from the reference data 531 (S203).
- the integrated amount calculation unit 524 corrects the time-series data based on the correction parameters calculated by the section extraction unit 522 and the correction parameter calculation unit 523 (S205), and calculates the integrated amount from the corrected time-series data (S207). Note that the process for calculating the correction parameter and the process for correcting the time-series data based on the calculated parameter can be executed in the same manner as in the first or second embodiment, and therefore will not be described in detail here.
- FIG. 15 is a block diagram illustrating a hardware configuration example of the analysis apparatus according to the embodiment of the present disclosure.
- the illustrated analysis device 900 can realize, for example, a server, a terminal device such as a smartphone, or a sensor device in the above embodiment.
- the analysis apparatus 900 includes a CPU (Central Processing Unit) 901, a ROM (Read Only Memory) 903, and a RAM (Random Access Memory) 905.
- the analysis device 900 may include a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
- the analysis device 900 may include an imaging device 933 and a sensor 935 as necessary.
- the analysis apparatus 900 may include a processing circuit such as a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), or an FPGA (Field-Programmable Gate Array) instead of or in addition to the CPU 901.
- the CPU 901 functions as an arithmetic processing device and a control device, and controls all or a part of the operation in the analysis device 900 according to various programs recorded in the ROM 903, the RAM 905, the storage device 919, or the removable recording medium 927.
- the ROM 903 stores programs and calculation parameters used by the CPU 901.
- the RAM 905 primarily stores programs used in the execution of the CPU 901, parameters that change as appropriate during the execution, and the like.
- the CPU 901, the ROM 903, and the RAM 905 are connected to each other by a host bus 907 configured by an internal bus such as a CPU bus. Further, the host bus 907 is connected to an external bus 911 such as a PCI (Peripheral Component Interconnect / Interface) bus via a bridge 909.
- the input device 915 is a device operated by the user, such as a mouse, a keyboard, a touch panel, a button, a switch, and a lever.
- the input device 915 may be, for example, a remote control device using infrared rays or other radio waves, or may be an external connection device 929 such as a mobile phone that supports the operation of the analysis device 900.
- the input device 915 includes an input control circuit that generates an input signal based on information input by the user and outputs the input signal to the CPU 901. The user operates the input device 915 to input various data and instruct processing operations to the analysis device 900.
- the output device 917 is configured by a device capable of notifying the acquired information to the user using a sense such as vision, hearing, or touch.
- the output device 917 can be, for example, a display device such as an LCD (Liquid Crystal Display) or an organic EL (Electro-Luminescence) display, an audio output device such as a speaker or headphones, or a vibrator.
- the output device 917 outputs the result obtained by the processing of the analysis device 900 as video such as text or image, sound such as sound or sound, or vibration.
- the storage device 919 is a data storage device configured as an example of a storage unit of the analysis device 900.
- the storage device 919 includes, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device.
- the storage device 919 stores, for example, programs executed by the CPU 901 and various data, and various data acquired from the outside.
- the drive 921 is a reader / writer for a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and is built in or attached to the analysis apparatus 900.
- the drive 921 reads information recorded on the attached removable recording medium 927 and outputs the information to the RAM 905.
- The drive 921 also writes records to the attached removable recording medium 927.
- the connection port 923 is a port for connecting a device to the analysis apparatus 900.
- the connection port 923 can be, for example, a USB (Universal Serial Bus) port, an IEEE 1394 port, a SCSI (Small Computer System Interface) port, or the like.
- the connection port 923 may be an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, or the like.
- the communication device 925 is a communication interface configured with, for example, a communication device for connecting to the communication network 931.
- the communication device 925 can be, for example, a communication card for LAN (Local Area Network), Bluetooth (registered trademark), Wi-Fi, or WUSB (Wireless USB).
- The communication device 925 may also be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), or a modem for various types of communication.
- The communication device 925 transmits and receives signals to and from the Internet and other communication devices using a predetermined protocol such as TCP/IP.
- the communication network 931 connected to the communication device 925 is a network connected by wire or wireless, and may include, for example, the Internet, a home LAN, infrared communication, radio wave communication, satellite communication, or the like.
- The imaging device 933 is a device that images real space and generates a captured image, using various members such as an image sensor, for example a CMOS (Complementary Metal Oxide Semiconductor) or CCD (Charge Coupled Device) sensor, and a lens for controlling the formation of a subject image on the image sensor.
- the imaging device 933 may capture a still image or may capture a moving image.
- The sensor 935 may include various sensors such as an acceleration sensor, angular velocity sensor, geomagnetic sensor, illuminance sensor, temperature sensor, atmospheric pressure sensor, or sound sensor (microphone). For example, the sensor 935 acquires information about the state of the analysis apparatus 900 itself, such as the attitude of its casing, and information about its surrounding environment, such as the brightness and noise around the analysis apparatus 900.
- the sensor 935 may include a GPS receiver that receives a GPS (Global Positioning System) signal and measures the latitude, longitude, and altitude of the device.
- Each component described above may be configured using a general-purpose member, or may be configured by hardware specialized for the function of each component. Such a configuration can be appropriately changed according to the technical level at the time of implementation.
- Embodiments of the present disclosure may include, for example, an analysis device as described above (a terminal device such as a server or smartphone, or a sensor device), a system, an analysis method executed by the analysis device or the system, a program for causing the analysis device to function, and a non-transitory tangible medium on which the program is recorded.
- (1) An analysis method including: acquiring first time-series data provided by a first sensor mounted on a first object in a first spatial positional relationship, and second time-series data provided by a second sensor mounted on a second object of the same type as the first object in a second spatial positional relationship; extracting a first section of the first time-series data corresponding to a first type of behavior occurring in the first object and a second section of the second time-series data corresponding to the first type of behavior occurring in the second object; and calculating, by a processor, a correction parameter between the first time-series data and the second time-series data by comparing the first section with the second section.
- (2) The analysis method according to (1), further including: correcting at least one of the first time-series data and the second time-series data using the correction parameter so that the first time-series data and the second time-series data approach each other; extracting a fourth section of the second time-series data that is similar to a third section of the first time-series data corresponding to a second type of behavior occurring in the first object; and estimating that the second type of behavior has occurred in the second object in the fourth section.
- (3) The analysis method according to (2), further including calculating a similarity between the third section and the fourth section, wherein, when the similarity exceeds a threshold, it is estimated that the second type of behavior has occurred in the second object in the fourth section.
- (4) The analysis method according to (2) or (3), wherein the first object and the second object are each a user playing a sport or a tool used by the user, the first sensor and the second sensor are attached to the user or the tool, and the first type of behavior and the second type of behavior are associated with a play event occurring in the sport.
- (5) The analysis method according to (2) or (3), wherein the first object and the second object are each a user's body, the first sensor and the second sensor are carried or worn by the user, and the first type of behavior and the second type of behavior are associated with behavior of the user.
- (6) The analysis method according to (2) or (3), wherein the first object and the second object are vehicles, the first sensor and the second sensor are installed in the vehicles, and the first type of behavior and the second type of behavior are associated with states of the vehicles.
- (7) The analysis method according to any one of (2) to (6), wherein the first time-series data includes reference data in which the behavior occurring in the first object is specified in advance, and the fourth section includes a section in which the behavior occurring in the second object is not specified.
- (8) The analysis method according to any one of (2) to (6), further including: extracting a sixth section of the first time-series data that is similar to a fifth section of the second time-series data corresponding to a third type of behavior occurring in the second object; and estimating that the third type of behavior has occurred in the first object in the sixth section.
- (9) The analysis method according to (1), further including: correcting at least one of the first time-series data and the second time-series data using the correction parameter so that the second time-series data approaches the first time-series data; and calculating an integrated amount based on the corrected second time-series data.
- (11) The analysis method according to (9) or (10), wherein the first object and the second object are each a user's body or a vehicle, the first sensor and the second sensor are carried or worn by the user or installed in the vehicle, and the integrated amount represents an amount of spatial movement of the user or the vehicle.
- (12) The correction parameter is an optimum parameter that is incorporated into the distance function between the first time-series data and the second time-series data such that the distance given by the distance function is minimized.
- (13) The correction parameter includes a parameter that rotates vectors included in the second time-series data.
- (14) The correction parameter includes a parameter that multiplies the second time-series data by a scalar.
- (15) An analysis device including: a time-series data acquisition unit that acquires first time-series data provided by the first sensor and second time-series data provided by the second sensor; a section extraction unit that extracts a first section of the first time-series data corresponding to a first type of behavior occurring in the first object and a second section of the second time-series data corresponding to the first type of behavior occurring in the second object; and a correction parameter calculation unit that calculates a correction parameter between the first time-series data and the second time-series data by comparing the first section with the second section.
- (16) An analysis apparatus including: a time-series data acquisition unit that acquires first time-series data provided by a first sensor mounted on a first object in a first spatial positional relationship and second time-series data provided by a second sensor mounted on a second object of the same type as the first object in a second spatial positional relationship; a section extraction unit that extracts a first section of the first time-series data corresponding to a first type of behavior occurring in the first object and a second section of the second time-series data corresponding to the first type of behavior occurring in the second object; and a correction parameter calculation unit that calculates a correction parameter between the first time-series data and the second time-series data by comparing the first section with the second section.
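The core method above pairs two sections known to contain the same type of behavior and derives a correction parameter from their comparison. As an illustrative sketch only (the publication prescribes no particular algorithm), a least-squares rotation-and-scale fit in the style of the Kabsch algorithm could play the role of the correction-parameter calculation; the function name `estimate_correction`, the use of NumPy, and the (N, 3) section shape are all assumptions:

```python
import numpy as np

def estimate_correction(section1, section2):
    """Estimate a rotation R and scale s such that s * section2 @ R.T
    approximates section1 (Kabsch-style least-squares fit).

    Both sections are (N, 3) arrays of sensor vectors covering the
    same type of behavior (e.g. the same swing motion)."""
    # Cross-covariance between the two sections
    H = section2.T @ section1
    U, S, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    # Scale that minimizes the residual after rotation
    s = np.trace(np.diag(S) @ D) / np.sum(section2 ** 2)
    return R, s
```

Applying the returned rotation and scale to the entire second time-series would then bring it toward the first sensor's frame, as the subsequent dependent claims describe.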
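For the section-matching and threshold-based estimation described above (extracting a section of the second time-series data similar to a known-behavior section of the first, and estimating that the behavior occurred where the similarity exceeds a threshold), one plausible realization is a sliding-window cosine-similarity search. The function name, the cosine measure, and the default threshold are assumptions, not taken from the publication:

```python
import numpy as np

def find_similar_section(template, stream, threshold=0.8):
    """Slide `template` (a section with a known behavior type) over
    `stream` (the corrected second time-series data) and return the
    start index of the most similar window if its cosine similarity
    exceeds `threshold`, else None."""
    n = len(template)
    t = template.ravel()
    t = t / np.linalg.norm(t)
    best_idx, best_sim = None, threshold
    for i in range(len(stream) - n + 1):
        w = stream[i:i + n].ravel()
        nw = np.linalg.norm(w)
        if nw == 0:
            continue  # skip silent windows
        sim = float(t @ w) / nw
        if sim > best_sim:
            best_idx, best_sim = i, sim
    return best_idx
```

A returned index would mark the estimated section; a `None` result corresponds to the similarity never exceeding the threshold.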
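The "integrated amount" representing a spatial movement amount could, under one reading, be obtained by integrating corrected acceleration data twice and measuring the resulting path length. This is a hypothetical sketch; the publication does not specify the integration scheme:

```python
import numpy as np

def movement_amount(accel, dt):
    """Cumulatively integrate a corrected 3-axis acceleration series
    twice (trapezoidal rule) and return the total path length of the
    resulting positions -- one possible 'integrated amount'."""
    # Acceleration -> velocity (trapezoidal rule), starting from rest
    vel = np.cumsum((accel[1:] + accel[:-1]) / 2.0 * dt, axis=0)
    vel = np.vstack([np.zeros(3), vel])
    # Velocity -> position, starting from the origin
    pos = np.cumsum((vel[1:] + vel[:-1]) / 2.0 * dt, axis=0)
    pos = np.vstack([np.zeros(3), pos])
    # Sum of step distances = spatial movement amount
    return float(np.sum(np.linalg.norm(np.diff(pos, axis=0), axis=1)))
```

Because double integration accumulates sensor bias, a practical system would presumably apply the correction parameter first, which is exactly why the method corrects the data before computing the integrated amount.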
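The description of the correction parameter as an optimum parameter incorporated into a distance function, including a rotation of vectors and a scalar multiplier, suggests a parameterized distance minimized over those parameters. Below is a brute-force sketch; restricting the rotation to the z-axis and using a grid search are purely illustrative assumptions:

```python
import numpy as np

def parameterized_distance(sec1, sec2, theta, scale):
    """Distance between two sections with the correction parameters
    (a z-axis rotation angle and a scalar gain) incorporated."""
    c, s = np.cos(theta), np.sin(theta)
    Rz = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return float(np.linalg.norm(sec1 - scale * sec2 @ Rz.T))

def optimal_parameters(sec1, sec2):
    """Grid-search the (theta, scale) pair minimizing the distance."""
    thetas = np.linspace(-np.pi, np.pi, 361)   # 1-degree steps
    scales = np.linspace(0.5, 2.0, 76)         # 0.02 steps
    best = min(((parameterized_distance(sec1, sec2, t, k), t, k)
                for t in thetas for k in scales))
    return best[1], best[2]
```

A real implementation would more likely use a closed-form fit or a gradient-based optimizer, and could swap the Euclidean distance for an elastic measure such as dynamic time warping; the grid search merely makes the "parameter incorporated into the distance function" idea concrete.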
Landscapes
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Physical Education & Sports Medicine (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
The problem is to obtain appropriate analysis results by correcting time-series data provided by multiple sensors whose spatial positional relationship to an object may vary. The solution according to the invention is an analysis method including: acquiring first time-series data provided by a first sensor mounted on a first object in a first spatial positional relationship and second time-series data provided by a second sensor mounted on a second object in a second spatial positional relationship; extracting a first section of the first time-series data corresponding to a first type of behavior generated by the first object and a second section of the second time-series data corresponding to the first type of behavior generated by the second object; and a processor calculating a correction parameter between the first time-series data and the second time-series data by comparing the first section with the second section.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2014180257 | 2014-09-04 | ||
| JP2014-180257 | 2014-09-04 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2016035464A1 true WO2016035464A1 (fr) | 2016-03-10 |
Family
ID=55439534
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2015/070732 Ceased WO2016035464A1 (fr) | 2014-09-04 | 2015-07-21 | Procédé d'analyse, système et dispositif d'analyse |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2016035464A1 (fr) |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH08224330A (ja) * | 1994-12-21 | 1996-09-03 | Hitachi Ltd | Signal recording device and training device |
| JP2008529727A (ja) * | 2005-02-15 | 2008-08-07 | Magneto Inertial Sensing Technology, Inc. | Single-axis/multi-axis six-degree-of-freedom (DOF) inertial motion capture system with inertial orientation determination capability |
| JP2012110359A (ja) * | 2010-11-19 | 2012-06-14 | Seiko Epson Corp | Motion analysis device |
| JP2012120579A (ja) * | 2010-12-06 | 2012-06-28 | Seiko Epson Corp | Motion analysis device |
| JP2012130414A (ja) * | 2010-12-20 | 2012-07-12 | Seiko Epson Corp | Swing analysis device |
2015
- 2015-07-21 WO PCT/JP2015/070732 patent/WO2016035464A1/fr not_active Ceased
Cited By (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2018109882A (ja) * | 2017-01-05 | 2018-07-12 | Toshiba Corp | Motion analysis apparatus, motion analysis method, and program |
| CN108279767A (zh) * | 2017-01-05 | 2018-07-13 | Toshiba Corp | Motion analysis device and motion analysis method |
| US11030564B2 (en) | 2017-01-05 | 2021-06-08 | Kabushiki Kaisha Toshiba | Motion analysis apparatus, motion analysis method, and computer program product |
| JP2020155100A (ja) * | 2019-03-18 | 2020-09-24 | Delta Electronics, Inc. | Remote calibration system and remote calibration method for its sensors |
| CN113807388A (zh) * | 2021-08-02 | 2021-12-17 | Chuzhou University | Method for discovering behavior patterns of elderly people living alone based on multivariate sensor data |
| CN113807388B (zh) * | 2021-08-02 | 2022-10-28 | Chuzhou University | Method for discovering behavior patterns of elderly people living alone based on multivariate sensor data |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11113515B2 (en) | Information processing device and information processing method | |
| KR102252269B1 (ko) | Swimming analysis system and method | |
| WO2019203189A1 (fr) | Program, information processing device, and information processing method | |
| EP3060317B1 (fr) | Information processing device, recording medium, and information processing method | |
| KR102241414B1 (ko) | Electronic device for providing feedback on a specific movement using a machine learning model, and operating method thereof | |
| US20250053228A1 (en) | Information processing apparatus, method for processing information, and program | |
| US20140229135A1 (en) | Motion analysis apparatus and motion analysis method | |
| EP3186599B1 (fr) | Feedback provision system | |
| WO2018090538A1 (fr) | Method and device for recognizing the motion of a tennis racket | |
| US11181376B2 (en) | Information processing device and information processing method | |
| CN108079547B (zh) | Image processing device, analysis system, image processing method, and recording medium | |
| JP2013009917A (ja) | Motion analysis system, motion analysis program, and recording medium recording the motion analysis program | |
| WO2016035464A1 (fr) | Analysis method, system, and analysis device | |
| CN111093781A (zh) | Aligning sensor data with video | |
| JP6147446B1 (ja) | Inertial sensor initialization using soft constraints and penalty functions | |
| JP2021165763A (ja) | Information processing device, information processing method, and program | |
| US11040263B2 (en) | Sensing system, sensor device, and sensor fixture | |
| JP2018099416A (ja) | Motion analysis device, motion analysis method, and program | |
| KR102671307B1 (ko) | Golf swing analysis device, golf swing analysis method, and program stored in a recording medium | |
| US20180140925A1 (en) | Movement analysis device, movement analysis method and recording medium | |
| JP7188422B2 (ja) | Image processing device, analysis system, image processing method, and program | |
| TW202427395A (zh) | Posture stability detection system and detection method applying the same | |
| CN117315780A (zh) | Badminton swing motion recognition and three-dimensional trajectory reconstruction method and system | |
| JP2015166018A (ja) | Swing analysis device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15838230 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 15838230 Country of ref document: EP Kind code of ref document: A1 |
|
| NENP | Non-entry into the national phase |
Ref country code: JP |