WO2015037089A1 - Brain dysfunction evaluation method, brain dysfunction evaluation apparatus, and program therefor - Google Patents
Brain dysfunction evaluation method, brain dysfunction evaluation apparatus, and program therefor
- Publication number
- WO2015037089A1 (PCT/JP2013/074582)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- subject
- data
- physical
- instruction
- accuracy
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4058—Detecting, measuring or recording for evaluating the nervous system for evaluating the central nervous system
- A61B5/4064—Evaluating the brain
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Measuring devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor or mobility of a limb
- A61B5/1124—Determining motor skills
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/12—Audiometering
- A61B5/121—Audiometering evaluating hearing capacity
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4088—Diagnosing of monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/40—Detecting, measuring or recording for evaluating the nervous system
- A61B5/4076—Diagnosing or monitoring particular conditions of the nervous system
- A61B5/4094—Diagnosing or monitoring seizure diseases, e.g. epilepsy
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6887—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient mounted on external non-worn devices, e.g. non-medical devices
- A61B5/6898—Portable consumer electronic devices, e.g. music players, telephones, tablet computers
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/7475—User input or interface means, e.g. keyboard, pointing device, joystick
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2560/00—Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
- A61B2560/02—Operational features
- A61B2560/0223—Operational features of calibration, e.g. protocols for calibrating sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7271—Specific aspects of physiological measurement analysis
- A61B5/7278—Artificial waveform generation or derivation, e.g. synthesizing signals from measured signals
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/7405—Details of notification to user or communication with user or patient; User input means using sound
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; User input means
- A61B5/742—Details of notification to user or communication with user or patient; User input means using visual displays
Definitions
- the present invention relates to a technique for evaluating the degree of brain dysfunction including cognitive decline.
- Cognitive decline occurs not only in Alzheimer-type dementia, cerebrovascular dementia, and Lewy body dementia, but also in motor disorders such as Parkinson's disease and mental disorders such as depression and schizophrenia. If the type of dementia can be identified by diagnosis, appropriate treatment such as pharmacotherapy can be given. By detecting any type of dementia at an early stage and administering an appropriate drug, its progression can be held to a mild state. For these reasons, screening tests for early detection of dementia are needed for healthy elderly people, who are at elevated risk of dementia.
- the mainstream of dementia diagnosis is the examination of cognitive functions such as memory and judgment, using instruments such as the Hasegawa scale and the MMSE (Mini Mental State Examination).
- these diagnostic methods are performed by doctors over several minutes to several tens of minutes and cannot be said to be suitable for screening tests for a large number of subjects.
- brain imaging methods are also used, such as examining the presence or absence of brain atrophy with CT (Computed Tomography) or MRI (Magnetic Resonance Imaging), or examining the accumulation status of amyloid beta with SPECT (Single Photon Emission Computed Tomography) or PET (Positron Emission Tomography).
- these brain imaging examinations are expensive and time-consuming, and thus cannot be said to be suitable for screening a large number of subjects.
- Patent Document 1, Patent Document 2, Non-Patent Document 1, and the like disclose examples of evaluation systems that use a tablet computer with a touch panel sensor to evaluate a subject's cognitive function simply, without relying on a doctor.
- the simple cognitive function evaluation systems disclosed in Patent Document 1, Patent Document 2, Non-Patent Document 1, etc. test to what degree the subject remembers the name, shape, number, and so on of objects that were displayed on the display screen and then erased. That is, conventional cognitive function evaluation systems are biased toward evaluating a subject's memory ability or memory-related judgment ability. In particular, the subject's physical motor ability has not been evaluated.
- an object of the present invention is to provide a brain dysfunction evaluation method, a brain dysfunction evaluation apparatus, and a program therefor that can easily evaluate the degree of brain dysfunction, including cognitive decline.
- the brain dysfunction evaluation method uses a body motion presentation device that presents to a subject the body motion to be performed, and a body motion detection sensor that detects body motion data of the motion the subject performs in response to the presentation.
- a data processing device connected to the sensor executes: a body motion instruction step that generates body motion instruction data to be executed by the subject and presents the corresponding body motion to the subject via the body motion presentation device, instructing its execution;
- a body motion data acquisition step that acquires, in time series via the body motion detection sensor, the body motion data of the motion performed by the subject;
- a body motion accuracy calculation step that calculates the position accuracy and time-series accuracy of the subject's body motion based on the body motion instruction data and the body motion data;
- and a cognitive impairment level evaluation step that evaluates the subject's degree of cognitive impairment.
- this provides a brain dysfunction evaluation method capable of easily evaluating the degree of brain dysfunction, including cognitive decline.
- brief description of drawings: a figure showing an example of the physical exercise task selection screen displayed by the physical exercise task selection unit; a figure showing an example of the reaching task instruction screen presented on the body motion presentation device; and a figure showing an example of the time-transition graph of the distance L(t) between two fingers recorded when the opening-and-closing finger tap task is performed.
- brain dysfunction here generally refers to conditions that produce so-called cognitive decline (Alzheimer-type dementia, cerebrovascular dementia, Lewy body dementia, Parkinson's disease, hydrocephalus, depression, schizophrenia, etc.), but also includes movement disorders such as stroke.
- the brain dysfunction may be simply referred to as dementia.
- FIG. 1 is a diagram showing an example of the overall configuration of a brain dysfunction evaluation apparatus 100 according to an embodiment of the present invention.
- the brain dysfunction evaluation apparatus 100 combines a data processing device 1 including a CPU (Central Processing Unit) and a memory (not shown) with a body motion presentation device 2, a body motion detection sensor 3, an operation input device 4, an output device 5, a storage device 6, and the like.
- the physical exercise presentation device 2 is configured by, for example, a liquid crystal display device, a voice output device (speaker), or the like.
- the body movement detection sensor 3 is configured by a touch panel sensor (screen contact sensor) attached to the liquid crystal display device.
- the operation input device 4 is configured by, for example, a keyboard or a mouse, but may be configured by a touch panel sensor; in this case, the operation input device 4 may also serve as the body motion detection sensor 3.
- the output device 5 may be constituted by, for example, a liquid crystal display device or a printer, and may also serve as the physical exercise presentation device 2.
- the storage device 6 is configured by a hard disk device, an SSD (Solid State Disk), or the like, and stores data and programs that are determined to be stored in advance.
- the data processing device 1 is embodied by a CPU (not shown) executing a program stored in a memory (not shown), and includes functional blocks such as a body motion instruction unit 10, a body motion data acquisition unit 20, a body motion accuracy calculation unit 30, and a cognitive impairment evaluation unit 40.
- the memory referred to here is composed of semiconductor memory such as RAM (Random Access Memory). Programs to be executed are read from the storage device 6 and loaded into the memory as necessary, and data under arithmetic processing is stored there.
- the physical exercise instruction unit 10 includes a physical exercise task selection unit 11, an instruction data generation unit 12, an instruction data presentation unit 13 and the like as lower-level functional blocks.
- the physical exercise task selection unit 11 displays a list of physical exercise tasks prepared in advance on the output device 5 (liquid crystal display device or the like) (see FIG. 3, described later), and selects the physical exercise task to be executed based on an input operation on the operation input device 4 by the subject or an assistant.
- the instruction data generation unit 12 generates time-series physical exercise instruction data to be presented to the subject according to the selected physical exercise task.
- the instruction data presentation unit 13 displays the generated time-series physical exercise instruction data, that is, the contents of physical exercise to be performed by the subject via the physical exercise presentation device 2 (liquid crystal display device, voice output device, etc.). To present.
- the body movement data acquisition unit 20 includes a detection data acquisition unit 21, a calibration unit 22, and the like as lower functional blocks.
- the detection data acquisition unit 21 acquires, via the body motion detection sensor 3 or the like and at a predetermined time interval (for example, 10 milliseconds), data on the physical movement performed by the subject according to the content presented on the physical movement presentation device 2 (for example, the position of a specific part of the body, its moving speed, acceleration, and detection time). That is, the detection data acquisition unit 21 acquires time-series data of the subject's physical movement.
- the calibration unit 22 acquires data on the auditory, visual, and motor abilities inherent to each subject that do not depend on cognitive impairment, calculates calibration data for each subject from the acquired data, and stores it in the storage device 6.
- the body motion accuracy calculation unit 30 includes an instruction / detection data comparison unit 31, a position accuracy calculation unit 32, a time series accuracy calculation unit 33, and the like as lower functional blocks.
- the instruction / detection data comparison unit 31 compares the data on the physical motion the subject was instructed to perform, presented on the body motion presentation device 2 (liquid crystal display device or the like), with the body movement data of the subject obtained via the body motion detection sensor 3.
- the position accuracy calculation unit 32 calculates the position accuracy of the body movement of the subject based on the difference data between the instruction data regarding the position obtained by the instruction / detection data comparison unit 31 and the detection data of the body movement of the subject.
- the time series accuracy calculation unit 33 calculates the time-series accuracy of the subject's body movement based on the difference data between the instruction timing in the instruction data obtained by the instruction / detection data comparison unit 31 and the detection timing of the detection data.
- the cognitive impairment level evaluation unit 40 includes a cognitive impairment level calculation unit 41, a cognitive impairment level output unit 42, and the like as lower functional blocks.
- the cognitive impairment degree calculation unit 41 calculates the subject's degree of cognitive impairment using the position accuracy and time-series accuracy calculated by the body movement accuracy calculation unit 30, the calibration data acquired by the calibration unit 22, and the like.
- the cognitive impairment level output unit 42 displays the cognitive impairment degree calculated by the cognitive impairment level calculation unit 41, or its change over time, on the output device 5 (liquid crystal display device or the like) (see FIG. 9, described later).
- the subject or his / her assistant can know the degree of cognitive impairment of the subject and its change over time.
- the cerebral dysfunction evaluation apparatus 100 having the above-described configuration can be realized by a tablet computer with a touch panel sensor, a so-called tablet terminal having almost the same function and performance, a smartphone, and the like.
- a touch panel sensor is mainly used to detect a finger motion, but a body motion other than the finger may be detected.
- an acceleration sensor, a magnetic sensor, a gyro device, a motion capture device, a video camera, or the like can be used as the body motion detection sensor 3.
- the brain dysfunction evaluation apparatus 100 is configured by a tablet computer or a tablet terminal with a touch panel sensor, and a brain dysfunction evaluation program is registered as an application program in the data processing apparatus 1.
- the cerebral dysfunction evaluation program is composed of programs that respectively embody the body movement instruction unit 10, the body movement data acquisition unit 20, the body movement accuracy calculation unit 30, and the cognitive impairment level evaluation unit 40 of the data processing device 1.
- FIG. 2 is a diagram showing an example of the subject registration information display screen 110 displayed when the cerebral dysfunction evaluation program is started.
- when the program is started, the subject registration information display screen 110 shown in FIG. 2 is displayed on the output device 5 (liquid crystal display device).
- the subject or his / her assistant inputs the subject ID in the subject ID column.
- the data processing device 1 then prompts the user to input data such as the subject name, sex, age, and remarks (for example, by outputting a message prompting input).
- the data processing apparatus 1 associates the input data such as the subject name, sex, age, and remarks with the subject ID.
- the remarks column is a column in which the user can enter text freely. For example, the disease name of the subject, the degree of cognitive impairment diagnosed by a doctor or the like at the time of registration, and the like are described.
- if the subject ID is already registered, the data processing device 1 reads out the subject name, gender, age, remarks, and other data stored in the storage device 6 in association with the subject ID, and enters each item in the corresponding column of the subject registration information display screen 110 for display.
- FIG. 3 is a diagram showing an example of the physical exercise task selection screen 120 displayed by the physical exercise task selection unit 11.
- the physical exercise task selection screen 120 displays a list of physical exercise tasks and a selection instruction field for instructing whether or not to select each physical exercise task.
- the user selects a physical exercise task to be performed from now on by putting a check mark 121 in this selection instruction column.
- the reaching task is selected.
- various parameters used in each physical exercise task can be set and displayed.
- the parameters here include, for example, the time-out time allowed for the subject's reaction after the body movement instruction data is presented, the presentation time interval until the next body movement instruction data is presented after the subject reacts, and a difficulty level determined according to the time-out time, presentation interval, and the like.
- FIG. 4 is a diagram showing an example of the reaching task instruction screen 210 presented to the body movement presentation device 2 by the instruction data presentation unit 13.
- according to each body motion task, the instruction data presentation unit 13 of the data processing device 1 presents (displays) figures or characters on the display screen of the liquid crystal display device (body motion presentation device 2) with a touch panel sensor (body motion detection sensor 3), thereby instructing the subject on the position and timing to be touched on the display screen.
- the reaching task is a task in which a specific figure is presented at a random position on the display screen and the subject touches it as quickly as possible.
- the instruction data presentation unit 13 first displays a black circle figure 211 representing the initial position on the reaching task instruction screen 210. The subject places a finger on the black circle figure 211 and stands by. Next, the instruction data presentation unit 13 presents (displays) the cross-shaped figure 212 on the reaching task instruction screen 210. The subject then releases the finger from the black circle figure 211 where it has been waiting and touches the cross-shaped figure 212 as quickly as possible.
- Di represents the distance (straight line distance) between the center position of the cross-shaped figure 212 and the position on the reaching task instruction screen 210 touched by the subject.
- when the instruction data presentation unit 13 determines that the touch position of the subject acquired by the detection data acquisition unit 21 has touched the cross-shaped figure 212, it changes the display color of the cross-shaped figure 212 to inform the subject that it was touched correctly. In this case, a touch within a predetermined distance (for example, within 5 mm) of the intersection of the cross of the cross-shaped figure 212 is determined to be correct. At this time, the subject may also be informed whether the cross-shaped figure 212 was touched correctly by sound output from a sound output device such as a speaker.
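The hit test described above can be sketched in a few lines. This is an illustrative reading of the text, not code from the patent; coordinates are assumed to already be in millimetres, and all names are hypothetical.

```python
import math

# Hypothetical hit test: a touch is judged correct when it lands within a
# predetermined distance (5 mm in the example above) of the intersection of
# the cross-shaped figure 212.

def touched_correctly(touch_xy, cross_center_xy, tolerance_mm=5.0):
    dx = touch_xy[0] - cross_center_xy[0]
    dy = touch_xy[1] - cross_center_xy[1]
    return math.hypot(dx, dy) <= tolerance_mm

touched_correctly((12.0, 33.0), (10.0, 30.0))  # 3.6 mm away -> True
touched_correctly((20.0, 30.0), (10.0, 30.0))  # 10 mm away -> False
```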
- when the subject correctly touches the cross-shaped figure 212, or when the subject does not touch it and a time-out error occurs, the instruction data presentation unit 13 erases the cross-shaped figure 212 that had been presented (displayed) and at the same time presents (displays) a new cross-shaped figure 212 at another position. The subject again touches the newly presented (displayed) cross-shaped figure 212 as quickly as possible. This presentation of the cross-shaped figure 212 and touch by the subject are repeated a predetermined number of times.
- the detection data acquisition unit 21 acquires the coordinates at which a part of the subject's body, for example a finger, touches the display screen, that is, the contact surface with the touch panel sensor. At this time, the detection data acquisition unit 21 acquires, for example, the coordinates of the center of gravity of the contact-surface figure as the coordinates of the finger contact position.
- the coordinates of each point on the display screen are often defined with the origin set at a left corner of the display screen, the horizontal direction as the x-axis direction, and the vertical direction as the y-axis direction.
- the origin position may be set at another corner of the display screen, may be set at the center of the display screen, or may be set at an arbitrary position inside or outside the display screen.
- the x-axis direction and the y-axis direction are not limited to the horizontal direction and the vertical direction.
- the detection data acquisition unit 21 acquires, as its basic function, the coordinates (x, y) of the position where the subject's finger makes contact, together with the time corresponding to each finger contact position (for example, (x1, y1)).
- the detection data acquisition unit 21 acquires time-series coordinates (x (t), y (t)) of the finger contact position every predetermined timing period (for example, 10 milliseconds).
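The acquisition just described can be sketched as follows. This is a hypothetical illustration, not the patented implementation: the touch panel is assumed to report a set of contacted points per sampling tick (10 ms here), and the centroid of that contact surface is taken as the finger position, yielding the time series (x(t), y(t)).

```python
# All names and data are illustrative only.

def centroid(contact_points):
    # contact_points: list of (x, y) points covered by the finger contact.
    n = len(contact_points)
    return (sum(x for x, _ in contact_points) / n,
            sum(y for _, y in contact_points) / n)

def contact_series(frames, period_ms=10):
    # frames: one list of contact points per sampling tick.
    # Returns [(t_ms, x, y), ...] sampled every period_ms milliseconds.
    return [(i * period_ms, *centroid(pts)) for i, pts in enumerate(frames)]

series = contact_series([[(0.0, 0.0), (2.0, 0.0), (1.0, 3.0)],
                         [(4.0, 4.0), (6.0, 6.0)]])
# series == [(0, 1.0, 1.0), (10, 5.0, 5.0)]
```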
- FIG. 5 is a view showing a modified example of the reaching task instruction screen 210 presented to the body motion presentation device 2 by the instruction data presentation unit 13.
- a circular figure 213 is presented instead of the cross-shaped figure 212.
- the processing of the instruction data presentation unit 13 and the operation to be performed by the subject are basically the same as those in the reaching task instruction screen 210 of FIG.
- the instruction data presentation unit 13 may change the size (radius) at random each time the circular figure 213 is presented.
- it is known as Fitts's law that there is a certain relationship between the time required for reaching and the radius of the circular figure 213. Therefore, if the reaching task instruction screen 210a is used with a varying radius of the circular figure 213, it can be examined from the result whether the Fitts's-law relationship is maintained even when brain functions such as cognitive function deteriorate.
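In its common Shannon formulation, Fitts's law predicts movement time from the distance D to the target and its width W (twice the radius of the circular figure). The coefficients a and b below are placeholders for illustration, not values taken from the document.

```python
import math

# Fitts's law sketch: movement time grows linearly with the index of
# difficulty ID = log2(2D / W). MT = a + b * ID, with a and b fitted
# empirically per subject; the defaults here are invented.

def index_of_difficulty(distance, width):
    return math.log2(2.0 * distance / width)

def predicted_movement_time(distance, width, a=0.2, b=0.15):
    return a + b * index_of_difficulty(distance, width)

index_of_difficulty(80.0, 10.0)  # log2(16) = 4.0
```

A sharp departure of measured movement times from this linear relationship, with the radius varied across trials, is what the modified reaching task would look for.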
- the reach task can be further modified.
- in the reaching task above, the subject touches the figure presented on the display screen unconditionally; alternatively, the subject may be given a certain judgment condition and touch according to its result.
- for example, the instruction data presentation unit 13 presents a reference circle and a reaching-target circle on the display screen and has the subject compare their sizes; the subject touches when the two sizes are the same and does not touch when they differ.
- the touch condition may also be determined by the color or shape of the presented figure instead of its size, or by the kind of character presented: for example, touch when hiragana is displayed, and do not touch when katakana or alphabet characters are presented.
- the instruction data presentation unit 13 may also present a colored figure and a color name on the display screen, so that the subject touches when the figure's color matches the presented color name and does not touch when they differ.
- the detection data acquisition unit 21 acquires time-series data about the coordinates (X, Y) of the position touched by the subject and time t.
- the time-series data of the coordinates (X, Y) and time t is expressed as (Xi(ti), Yi(ti)), or simply (X(t), Y(t)).
- here i = 1, 2, ..., N, where N is the number of reaching repetitions.
- for each presented reaching-target figure, the instruction / detection data comparison unit 31 calculates the distance between the center position of the figure (XCi(τi), YCi(τi)) and the position touched by the subject (Xi(ti), Yi(ti)) as the touch position error Di (see FIG. 4). The instruction / detection data comparison unit 31 further calculates, for each presented reaching-target figure, the difference between the time τi at which the figure was displayed and the time ti at which the subject touched it, as the touch delay time Ti.
- the position accuracy calculation unit 32 calculates the position accuracy md, for example as the average of the touch position errors Di, and the time series accuracy calculation unit 33 calculates the time-series accuracy mT, for example as the average of the touch delay times Ti.
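The two accuracy measures can be sketched as below, assuming the interpretation given later in the text (position accuracy as the mean touch position error, time-series accuracy as the mean touch delay time). All names and numeric data are hypothetical.

```python
import math

def accuracies(targets, touches):
    # targets: list of (XC_i, YC_i, tau_i) - figure centre and display time.
    # touches: list of (X_i, Y_i, t_i)    - touch position and touch time.
    d = [math.hypot(x - xc, y - yc)
         for (xc, yc, _), (x, y, _) in zip(targets, touches)]
    t = [ti - tau for (_, _, tau), (_, _, ti) in zip(targets, touches)]
    m_d = sum(d) / len(d)   # position accuracy: mean touch position error
    m_T = sum(t) / len(t)   # time-series accuracy: mean touch delay time
    return m_d, m_T

targets = [(10.0, 10.0, 0.0), (40.0, 30.0, 2.0)]
touches = [(13.0, 14.0, 0.5), (40.0, 33.0, 2.7)]
m_d, m_T = accuracies(targets, touches)  # m_d = 4.0, m_T ≈ 0.6
```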
- the cognitive impairment degree S is calculated as a value that integrates the position accuracy m_d calculated by the position accuracy calculation unit 32 and the time-series accuracy m_T calculated by the time series accuracy calculation unit 33. There are various calculation methods, described below, any of which may be used.
- suppose that, from the N_C subjects of a healthy group, the average M_C(m_dj) and standard deviation σ_C(m_dj) of the position accuracy, and the average M_C(m_Tj) and standard deviation σ_C(m_Tj) of the time-series accuracy, have been obtained in advance.
- the cognitive impairment degree calculation unit 41 first calculates the normalized position accuracy m_d_n and the normalized time-series accuracy m_T_n according to equations (1) and (2): m_d_n = (m_d − M_C(m_dj)) / σ_C(m_dj) ... (1), and m_T_n = (m_T − M_C(m_Tj)) / σ_C(m_Tj) ... (2).
- the cognitive impairment degree calculation unit 41 then calculates the cognitive impairment degree S according to equation (3), that is, as the simple sum of the normalized position accuracy and normalized time-series accuracy: S = m_d_n + m_T_n ... (3).
- alternatively, the cognitive impairment degree S may be calculated by weighting the position accuracy m_d and the time-series accuracy m_T according to their importance and adding them. Representing the respective weights by I_d and I_T, the cognitive impairment degree S can be calculated by equation (4): S = I_d · m_d_n + I_T · m_T_n ... (4).
- the weights I_d and I_T can be calculated by equations (5) and (6): I_d = |M_C(m_dj) − M_P(m_dk)| / √(σ_C(m_dj)²/N_C + σ_P(m_dk)²/N_P) ... (5), and analogously for I_T from the time-series accuracy statistics ... (6).
- suppose further that, from the N_P subjects of a dementia group, the average M_P(m_dk) and standard deviation σ_P(m_dk) of the position accuracy, and the average M_P(m_Tk) and standard deviation σ_P(m_Tk) of the time-series accuracy, have also been obtained and are stored in the storage device 6 together with the healthy-group statistics M_C(m_dj), σ_C(m_dj), M_C(m_Tj), and σ_C(m_Tj).
- the weights I_d and I_T are indices that evaluate, taking into account the variation within the healthy group and the dementia group, how large the difference between the mean values of the two groups is; in this case they are determined with reference to the statistic used in Welch's test for whether the means of two groups differ.
- according to equation (5), the larger the weight I_d, the greater the difference in position accuracy m_d between the two groups, and the easier it is to determine to which of the two groups a subject's position accuracy belongs. That is, the larger I_d is, the more important the position accuracy m_d is as an index for detecting the dementia group. Similarly, by equation (6), the larger I_T is, the more important the time-series accuracy m_T is for detecting the dementia group.
- the calculation of the weights I_d and I_T is not limited to equations (5) and (6); any other statistic that can evaluate the degree of separation between the healthy group and the dementia group may be used.
- the cognitive impairment degree calculation unit 41 evaluates equations (5) and (6), and then equation (4), whereby the cognitive impairment degree S is calculated.
- instead of calculating the weights I_d and I_T each time, they are preferably stored in the storage device 6 in advance.
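The weighting scheme above can be sketched as follows. The formulas are reconstructed from the surrounding text (z-normalization against the healthy group; Welch-type statistics for the weights), and every numeric value is fabricated for illustration.

```python
import math

def normalize(m, mean_c, sd_c):
    # Eqs. (1) and (2): accuracy normalized by healthy-group mean and s.d.
    return (m - mean_c) / sd_c

def welch_weight(mean_c, sd_c, n_c, mean_p, sd_p, n_p):
    # Eqs. (5) and (6): between-group mean difference relative to the
    # within-group spread of the healthy (C) and dementia (P) groups.
    return abs(mean_c - mean_p) / math.sqrt(sd_c ** 2 / n_c + sd_p ** 2 / n_p)

def impairment_degree(m_d_n, m_T_n, i_d=1.0, i_t=1.0):
    # Eq. (4); reduces to the simple sum of eq. (3) when both weights are 1.
    return i_d * m_d_n + i_t * m_T_n

m_d_n = normalize(7.0, mean_c=5.0, sd_c=2.0)   # -> 1.0
m_T_n = normalize(0.9, mean_c=0.5, sd_c=0.2)   # -> 2.0
s = impairment_degree(m_d_n, m_T_n)            # -> 3.0
```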
- FIG. 6 is a diagram schematically illustrating an example in which the cognitive impairment degree S is evaluated using multivariate analysis. The graph of FIG. 6 is a scatter diagram with the time-series accuracy m_T (average touch delay time) on the horizontal axis and the position accuracy m_d (average touch position error) on the vertical axis, plotting the time-series accuracy and position accuracy of each subject.
- black dots represent the time-series accuracy m_T and position accuracy m_d of the subjects belonging to the healthy group, and triangles represent those of the subjects belonging to the dementia group.
- by projecting each subject's data onto the axis 301, the cognitive impairment degree S can be calculated.
- a straight line 302 (indicated by a broken line in FIG. 6) perpendicular to the axis 301 at a certain threshold S_th can be used as a boundary separating the healthy group from the dementia group. That is, a subject whose cognitive impairment degree S obtained by equation (7) is larger than S_th can be judged to have dementia, and a subject whose S is not larger can be judged not to. In this determination, setting the threshold S_th to a smaller value, closer to the healthy group, detects dementia more sensitively, while setting it to a larger value, closer to the dementia group, avoids false detection of dementia.
- The cognitive impairment degree calculation unit 41 calculates the coefficients C d1 and C T1 in advance by linear discriminant analysis or the like, and then calculates the cognitive impairment degree S of the subject from the subject's time-series accuracy m T and position accuracy m d according to Equation (7).
- the degree of cognitive impairment S may be calculated using another statistical method such as a support vector machine.
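As a minimal sketch of the discriminant scoring described above (assuming Equation (7) has the linear form S = C d1·m d + C T1·m T; the coefficient and threshold values below are illustrative placeholders, not values from this description):

```python
# Sketch of Equation (7): S = C_d1 * m_d + C_T1 * m_T, compared against a
# threshold S_th. Coefficient and threshold values are made-up placeholders;
# in practice they would come from linear discriminant analysis.
C_D1, C_T1 = 0.05, 0.01   # placeholder discriminant coefficients
S_TH = 1.5                # placeholder decision threshold on the axis 301

def cognitive_impairment_degree(m_d, m_T):
    """Project a subject's (m_d, m_T) onto the discriminant axis."""
    return C_D1 * m_d + C_T1 * m_T

def is_dementia_suspected(m_d, m_T):
    """A value of S larger than the threshold falls on the dementia side."""
    return cognitive_impairment_degree(m_d, m_T) > S_TH

# Example: position error 20 mm, average touch delay 120 ms
S = cognitive_impairment_degree(20.0, 120.0)   # 0.05*20 + 0.01*120 = 2.2
```

Lowering `S_TH` makes the classifier more sensitive to dementia; raising it reduces false detections, mirroring the trade-off described for the threshold S th.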
- FIG. 7 is a diagram schematically showing a modification of the example in which the cognitive impairment degree S shown in FIG. 6 is evaluated.
- The black circles and triangles in FIG. 7 represent the data of each subject, to which an MMSE (Mini Mental State Examination) score, obtained by a doctor who evaluated the degree of dementia in advance through a medical examination, is attached.
- An MMSE score of 30 is the highest and indicates a healthy state with no cognitive decline; the lower the score, the more severe the dementia.
- When a multiple regression analysis, which is a kind of multivariate analysis, is applied to the position accuracy m d, time-series accuracy m T, and MMSE score of the subjects, an axis 305 representing the cognitive impairment degree S is obtained.
- the cognitive impairment degree S is expressed by the following equation (8) using coefficients C d2 and C T2 obtained by multiple regression analysis.
- the equation (8) can be regarded as an equation for estimating the MMSE score evaluated by the doctor from the time series accuracy m T and the position accuracy m d .
- The cognitive impairment degree calculation unit 41 calculates the coefficients C d2 and C T2 in advance by multiple regression analysis or the like, and then calculates the cognitive impairment degree S of the subject from the subject's time-series accuracy m T and position accuracy m d according to Equation (8).
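As a sketch of how the coefficients C d2 and C T2 could be fitted by ordinary least squares against MMSE scores (assuming Equation (8) has the intercept-free linear form S = C d2·m d + C T2·m T; the function name and all data values are illustrative):

```python
# Sketch: fitting the two coefficients of Equation (8) by least squares.
# The 2x2 normal equations are solved by hand; no intercept is assumed.
def fit_two_coefficients(md, mT, mmse):
    """Return (C_d2, C_T2) minimizing the squared error against MMSE."""
    a = sum(x * x for x in md)                 # sum of md*md
    b = sum(x * y for x, y in zip(md, mT))     # sum of md*mT
    c = sum(y * y for y in mT)                 # sum of mT*mT
    p = sum(x * s for x, s in zip(md, mmse))   # sum of md*MMSE
    q = sum(y * s for y, s in zip(mT, mmse))   # sum of mT*MMSE
    det = a * c - b * b
    return (c * p - b * q) / det, (a * q - b * p) / det

# Hypothetical subjects: (position error mm, touch delay ms, doctor's MMSE)
md   = [5.0, 10.0, 20.0, 30.0]
mT   = [50.0, 80.0, 150.0, 220.0]
mmse = [29.0, 27.0, 20.0, 14.0]
C_d2, C_T2 = fit_two_coefficients(md, mT, mmse)
estimate = C_d2 * 12.0 + C_T2 * 100.0   # estimated MMSE for a new subject
```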
- The calibration unit 22 evaluates the subject's abilities such as hearing, vision, and motor ability in advance, and subtracts their influence from the task used to evaluate the cognitive impairment degree S (the reaching task in this embodiment). In that case, all of the effects of hearing, vision, and motor ability may be subtracted, or only some of them.
- FIG. 8 is a diagram illustrating examples of calibration instruction screens presented on the body motion presentation device 2 by the calibration unit 22: (a) is an example of an instruction screen for visual calibration, (b) for auditory calibration, and (c) for motor-ability calibration.
- When evaluating the visual ability of the subject, the calibration unit 22 displays a calibration instruction message 221 and a cross-shaped figure 222 on the calibration instruction screen 220, for example, as shown in FIG. 8(a). Next, the calibration unit 22 has the subject touch the center (intersection) of the cross-shaped figure 222 without a time limit, and acquires the distance D i between the center position of the cross-shaped figure 222 and the position the subject touches.
- For example, if the position calibration value c d of the subject is 3 mm and the position accuracy m d obtained in a task such as the reaching task is 10 mm, the position accuracy m dc after calibration is 7 mm.
- When evaluating the hearing ability of the subject, the calibration unit 22 displays a calibration instruction message 224 and a circular figure 225 on the calibration instruction screen 223, for example, as shown in FIG. 8(b). Next, the calibration unit 22 outputs a predetermined sound from a speaker or the like, has the subject touch the circular figure 225 when the subject hears the sound, and acquires the touch delay time t i from when the sound is output until the circular figure 225 is touched.
- For example, if the time-series calibration value c T based on the subject's hearing ability is 60 milliseconds and the time-series accuracy m T obtained in a task related to hearing ability, such as the rhythm touch task described later, is 100 milliseconds, the time-series accuracy m Tc after calibration is 40 milliseconds.
- When evaluating the motor ability of the subject, the calibration unit 22 displays a calibration instruction message 227 and two circular figures 228 on the calibration instruction screen 226, for example, as shown in FIG. 8(c). Next, the calibration unit 22 has the subject touch the two circular figures 228 displayed at predetermined positions alternately as fast as possible, and acquires the time interval t i at which the two circular figures 228 are touched alternately.
- For example, if the time-series calibration value e T based on the motor ability of the subject is 80 milliseconds and the time-series accuracy m T obtained in the reaching task is 100 milliseconds, the time-series accuracy m Tc after calibration is 20 milliseconds. Note that 20 milliseconds in this case represents the time taken for the subject to recognize the position of the figure to be reached (the cross-shaped figure 212 in the example of FIG. 4).
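The subtraction rule used in the worked examples above (10 mm − 3 mm = 7 mm; 100 ms − 60 ms = 40 ms; 100 ms − 80 ms = 20 ms) can be sketched as follows; the function names are illustrative, not from this description:

```python
# Sketch of the calibration subtraction: remove the ability-dependent
# calibration values from the measured accuracies.
def calibrate_position(m_d_mm, c_d_mm):
    """Remove the visual-ability component c_d from the position accuracy."""
    return m_d_mm - c_d_mm

def calibrate_timing(m_T_ms, *calibration_values_ms):
    """Remove one or more time-series calibration values (hearing c_T,
    motor ability e_T, ...) from the time-series accuracy."""
    return m_T_ms - sum(calibration_values_ms)

m_dc = calibrate_position(10.0, 3.0)          # 7.0 mm, vision example
m_Tc_hearing = calibrate_timing(100.0, 60.0)  # 40.0 ms, hearing example
m_Tc_motor = calibrate_timing(100.0, 80.0)    # 20.0 ms, motor example
```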
- As described above, the calibration unit 22 evaluates the subject's hearing, visual, and motor abilities in advance, so that the position accuracy m d and time-series accuracy m T obtained in a task for evaluating the cognitive impairment degree S can be corrected according to those abilities. The post-calibration position accuracy m dc and time-series accuracies m Tc, m Te obtained by the calibration unit 22 are then used by the subsequent cognitive impairment degree calculation unit 41 as the values of the position accuracy m d and the time-series accuracy m T when calculating the cognitive impairment degree S.
- Accordingly, the calibration unit 22 makes it possible to obtain a position accuracy m d and a time-series accuracy m T of high precision that take the subject's hearing, visual, and motor abilities into account, so that the cognitive impairment degree S can be calculated more accurately.
- the cognitive impairment level output unit 42 stores the cognitive impairment level S calculated by the cognitive impairment level calculation unit 41 in the storage device 6.
- At that time, not only the cognitive impairment degree S but also the measurement date and time, the subject ID, the subject's age, sex, and the like may be stored together.
- Further, scores such as the MMSE, FAB, and Hasegawa scale obtained by a doctor's examination may be stored together.
- The cognitive impairment level output unit 42 outputs the cognitive impairment degree S calculated by the cognitive impairment degree calculation unit 41 to the output device 5 such as a liquid crystal display device or a printer. The subject or his/her assistant can thereby know the subject's cognitive impairment degree S obtained in the performed cognitive impairment evaluation task such as the reaching task.
- Further, the cognitive impairment degree output unit 42 displays a graph of the change over time representing the relationship between the subject's measurement dates and the cognitive impairment degrees S stored in the storage device.
- FIG. 9 is a diagram illustrating an example of a graph showing a change over time in the degree of cognitive impairment S of a subject.
- the horizontal axis represents the date and time when the subject performed the reaching task or the like, that is, the date and time when the cognitive impairment degree S of the subject was measured.
- the vertical axis represents the degree of cognitive impairment S.
- a black circle mark represents the cognitive impairment degree S obtained at each measurement date and time for a certain subject.
- the cognitive impairment degree output unit 42 further obtains an index S d of the temporal change of the cognitive impairment degree S for a certain subject.
- The index S d of the change over time in the cognitive impairment degree S is expressed, for example, as the slope of a straight line 310 drawn in the graph of FIG. 9, and the slope of the straight line 310 can be determined by regression analysis, the least squares method, or the like.
- the doctor, the subject, and the assistant can know the effects of treatment and rehabilitation performed on the subject from the graph of the change over time in the degree of cognitive impairment S as shown in FIG.
- Note that the index S d of the change over time in the cognitive impairment degree S is not limited to the slope of the straight line 310. Any other index may be used as long as it can evaluate the change in the cognitive impairment degree S between the present and the past.
- the standard deviation of the degree of cognitive impairment S for the past several times may be used as the index S d of the temporal change in the degree of cognitive impairment S. In this case, the stability of the cognitive function is evaluated.
- By repeatedly testing the degree of cognitive function deterioration (cognitive impairment degree S) of the subject in this way, it is possible to know the effects of treatment and rehabilitation.
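The slope of the straight line 310 via the least squares method, mentioned above, can be sketched as follows; the measurement values are made up for illustration:

```python
# Sketch: the index S_d as the least-squares slope of the cognitive
# impairment degree S over measurement dates (straight line 310 in FIG. 9).
def least_squares_slope(days, scores):
    """Slope of the best-fit line of S against measurement time."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, scores))
    den = sum((x - mean_x) ** 2 for x in days)
    return num / den

days   = [0, 30, 60, 90]        # days since the first measurement
scores = [2.0, 1.8, 1.5, 1.1]   # cognitive impairment degree S at each visit
S_d = least_squares_slope(days, scores)   # negative slope: S is decreasing
```

A negative S d here would indicate improvement, since a smaller cognitive impairment degree S is better.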
- In the embodiment described above, the cognitive impairment degree S is calculated based on the result of the subject performing the reaching task, but it may also be calculated based on the result of performing another physical exercise task.
- In the following, examples of other physical exercise tasks for obtaining the cognitive impairment degree S will be described, and where the method of calculating the cognitive impairment degree S differs from that of the reaching task embodiment described above, the difference will also be described.
- FIG. 10 is a diagram showing an example of a one-handed rhythm touch task instruction screen 230 by auditory stimulation presented on the body motion presentation device 2.
- The instruction data presentation unit 13 of the physical exercise instruction unit 10 (see FIG. 1) displays a one-handed rhythm touch task instruction screen 230 as shown in FIG. 10 on the body motion presentation device 2.
- On the one-handed rhythm touch task instruction screen 230, a circular figure 231 to be touched by the subject is displayed.
- The instruction data presentation unit 13 repeatedly outputs, from the speaker, a touch instruction sound indicating the timing at which the subject should touch, at specific or random time intervals.
- The subject touches the circular figure 231 with, for example, the thumb, in synchronization with the output touch instruction sound as closely as possible. Since the subjects are often elderly, it is preferable that the touch instruction sound for auditory stimulation be loud and avoid the high-frequency range (the same applies to the description of FIGS. 11 and 12 below).
- the degree of cognitive impairment S can be calculated as described later.
- When the touch instruction sound is output at equal time intervals in this one-handed rhythm touch task, the cognitive impairment degree S based on the subject's ability to predict the auditory stimulus can be evaluated. When the touch instruction sound is output at random time intervals, the cognitive impairment degree S based on the subject's reaction speed to the auditory stimulus can be evaluated.
- This effect of the one-handed rhythm touch task is shared by the other rhythm touch tasks described below.
- In FIG. 10, the circular figure 231 to be touched is displayed on the left side of the one-handed rhythm touch task instruction screen 230 and touched with the subject's left thumb, but the circular figure 231 may instead be displayed on the right side of the screen and touched with the subject's right thumb.
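The time-series accuracy m T of a rhythm touch task, described as the average touch delay time, could be computed as sketched below (assuming one touch per instruction sound and pairing them in order; all timestamps are illustrative):

```python
# Sketch: m_T as the mean delay between each touch instruction sound and
# the subject's corresponding touch.
def mean_touch_delay(stimulus_times_ms, touch_times_ms):
    """Average delay of each touch relative to its instruction sound."""
    delays = [t - s for s, t in zip(stimulus_times_ms, touch_times_ms)]
    return sum(delays) / len(delays)

stimuli = [0, 1000, 2000, 3000]       # instruction sounds every second
touches = [120, 1090, 2150, 3080]     # the subject's touch timestamps
m_T = mean_touch_delay(stimuli, touches)   # (120+90+150+80)/4 = 110.0 ms
```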
- FIG. 11 is a diagram showing an example of a two-handed rhythm touch task instruction screen 230a by auditory stimulation presented on the body motion presentation device 2.
- When performing the two-handed rhythm touch task, the instruction data presentation unit 13 displays the two-handed rhythm touch task instruction screen 230a as shown in FIG. 11 on the physical exercise presentation device 2.
- On the two-handed rhythm touch task instruction screen 230a, two circular figures 231 to be touched by the subject with the right hand and the left hand, respectively, are displayed.
- The instruction data presentation unit 13 repeatedly outputs, from the speaker, a touch instruction sound indicating the timing at which the subject should touch, at specific or random time intervals.
- The subject touches the two circular figures 231 at the same time with, for example, the left thumb and the right thumb, in synchronization with the output touch instruction sound as closely as possible.
- FIG. 12 is a diagram showing an example of a two-hand alternating rhythm touch task instruction screen 230b by auditory stimulation presented on the body motion presentation device 2.
- When performing the two-hand alternating rhythm touch task, the instruction data presentation unit 13 displays the two-hand alternating rhythm touch task instruction screen 230b as shown in FIG. 12 on the physical exercise presentation device 2.
- On the two-hand alternating rhythm touch task instruction screen 230b, two circular figures 231 to be touched by the subject with the right hand and the left hand, respectively, are displayed.
- The instruction data presentation unit 13 repeatedly outputs, from the speaker, a touch instruction sound indicating the timing at which the subject should touch, at specific or random time intervals.
- The subject touches the two circular figures 231 alternately with the left thumb and the right thumb, in synchronization with the output touch instruction sound as closely as possible.
- the touch instruction sound may be different depending on the left and right circular figures 231 to be touched.
- the pitch of the touch instruction sound may be changed on the left and right, or the sound of different instruments may be used.
- FIG. 13 is a diagram showing an example of a one-handed rhythm touch task instruction screen 240 by visual stimulation presented on the body motion presentation device 2.
- When performing the one-handed rhythm touch task by visual stimulation, the instruction data presentation unit 13 displays a one-handed rhythm touch task instruction screen 240 as shown in FIG. 13 on the physical exercise presentation device 2.
- On the one-handed rhythm touch task instruction screen 240, a circular figure 241 to be touched by the subject is displayed.
- The instruction data presentation unit 13 repeatedly displays, at specific or random time intervals, a touch instruction figure 242 indicating the timing at which the subject should touch (erasing it immediately after each display).
- The subject touches the circular figure 241 in synchronization with the timing at which the touch instruction figure 242 is displayed, as closely as possible.
- Note that the touch instruction figure 242 for visual stimulation is not limited to a black circle. Since the subjects are often elderly, a conspicuous shape with a vivid primary color is preferable (the same applies to the description of FIGS. 14 and 15).
- FIG. 14 is a diagram illustrating an example of a two-handed rhythm touch task instruction screen 240a by visual stimulation presented on the body motion presentation device 2.
- When performing the two-handed rhythm touch task by visual stimulation, the instruction data presentation unit 13 displays the two-handed rhythm touch task instruction screen 240a as shown in FIG. 14 on the physical exercise presentation device 2.
- On the two-handed rhythm touch task instruction screen 240a, two circular figures 241 to be touched by the subject with the right hand and the left hand, respectively, are displayed.
- The instruction data presentation unit 13 repeatedly displays, at specific or random time intervals, the touch instruction figure 242 indicating the timing at which the subject should touch (erasing it immediately after each display).
- The subject touches the two circular figures 241 simultaneously with, for example, the left thumb and the right thumb, in synchronization with the timing at which the touch instruction figure 242 is displayed, as closely as possible.
- FIG. 15 is a diagram illustrating an example of a two-hand alternating rhythm touch task instruction screen 240b by visual stimulation presented on the body motion presentation device 2.
- When performing the two-hand alternating rhythm touch task by visual stimulation, the instruction data presentation unit 13 displays the two-hand alternating rhythm touch task instruction screen 240b as shown in FIG. 15 on the body motion presentation device 2.
- On the two-hand alternating rhythm touch task instruction screen 240b, two circular figures 241 to be touched by the subject with the right hand and the left hand, respectively, are displayed.
- The instruction data presentation unit 13 repeatedly displays, at specific or random time intervals, the touch instruction figure 242 indicating the timing at which the subject should touch (erasing it immediately after each display).
- The subject touches the two circular figures 241 alternately with the left thumb and the right thumb, in synchronization with the timing at which the touch instruction figure 242 is displayed, as closely as possible.
- FIG. 16 is a diagram showing an example of a metronome-type rhythm touch task instruction screen 250 presented on the body motion presentation device 2.
- When performing the metronome-type rhythm touch task, the instruction data presentation unit 13 displays the metronome-type rhythm touch task instruction screen 250 as shown in FIG. 16 on the physical exercise presentation device 2.
- On the metronome-type rhythm touch task instruction screen 250, two circular figures 251 to be touched by the subject with the right hand and the left hand, respectively, are displayed, together with a metronome-like pendulum 252 and a fan-shaped figure 253 representing its amplitude range.
- The pendulum 252 swings within the range of the fan-shaped figure 253 at a constant period. The subject therefore touches the right circular figure 251 with the right thumb at the timing when the pendulum 252 reaches the right end of the fan-shaped figure 253, and touches the left circular figure 251 with the left thumb at the timing when it reaches the left end.
- the metronome-type rhythm touch task can be said to be a body movement task that is substantially the same as the two-hand alternating rhythm touch task for visual stimulation shown in FIG.
- However, since the movement of the pendulum 252 allows the subject to predict the touch timing, the cognitive impairment degree S including the subject's predictive ability can be evaluated.
- Note that the metronome-type rhythm touch task can be changed so that the subject touches the circular figure 251 with one hand, or with both hands at the same time, instead of alternately with both hands, making it substantially similar to the corresponding rhythm touch tasks described above.
- In each rhythm touch task described above, the instruction/detection data comparison unit 31 first determines whether or not the touch position coordinates (X, Y) of the subject acquired by the detection data acquisition unit 21 are included in the circular figures 231, 241, 251. The ratio of touches that fall outside the figures, that is, the touch failure rate, is then used as the position accuracy m d; if, for example, a quarter of the touches fail, the position accuracy m d is 0.25.
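The hit test and failure-rate computation could be sketched as follows (the circle geometry and sample touches are illustrative; a smaller m d means higher accuracy):

```python
# Sketch: deciding whether each touch lands inside the circular figure and
# using the touch failure rate as the position accuracy m_d.
import math

def touch_hit(x, y, cx, cy, r):
    """True if the touch position (x, y) lies inside the circle at (cx, cy)."""
    return math.hypot(x - cx, y - cy) <= r

def position_accuracy(touches, cx, cy, r):
    """Touch failure rate over a sequence of (x, y) touch coordinates."""
    misses = sum(0 if touch_hit(x, y, cx, cy, r) else 1 for x, y in touches)
    return misses / len(touches)

touches = [(100, 100), (103, 98), (140, 100), (101, 105)]  # one clear miss
m_d = position_accuracy(touches, cx=100, cy=100, r=20)     # 1/4 = 0.25
```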
- The cognitive impairment degree calculation unit 41 can then calculate the cognitive impairment degree S by the same method as that shown in the embodiment.
- Further, the cognitive impairment degree output unit 42 can output the calculated cognitive impairment degree S and a graph of its change over time (see FIG. 9) to the output device 5 such as a liquid crystal display device or a printer.
- the subject or his / her assistant can easily evaluate the degree of cognitive function deterioration (cognitive impairment degree S) including brain function impairment of the subject by performing the rhythm touch task.
- FIG. 17 is a diagram showing an example of a one-hand opening / closing finger tap task instruction screen 260 presented on the body motion presentation device 2.
- The instruction data presentation unit 13 of the physical exercise instruction unit 10 displays a one-hand open/close finger tap task instruction screen 260 as shown in FIG. 17 on the physical exercise presentation device 2.
- The one-hand open/close finger tap task refers to a physical exercise task in which the subject repeatedly opens and closes two fingers (for example, the thumb and index finger).
- a fan-shaped opening / closing movement designation area 261 for designating an area in which the subject performs an opening / closing movement of two fingers is displayed.
- the opening / closing movement designation area 261 is not limited to a fan shape, and may be any shape as long as it is an area where two fingers can easily open and close.
- The subject, with the thumb and index finger in contact with the fan-shaped opening/closing movement designation area 261, repeats the movement of opening and closing the two fingers as widely and as fast as possible.
- the detection data acquisition unit 21 detects the coordinate value of the position touched by the thumb and the coordinate value of the position touched by the index finger, and calculates the distance between the touch positions of the two fingers as the distance L between the two fingers.
- double-ended arrow 262 indicates the opening / closing amplitude of the index finger
- double-ended arrow 263 indicates the opening / closing amplitude of the thumb.
- The open/close finger tap task includes a two-hand open/close finger tap task and a two-hand alternate open/close finger tap task in addition to the example of FIG. 17.
- FIG. 18 is a diagram illustrating an example of a two-hand open / close finger tap task instruction screen 260a presented on the physical exercise presenting apparatus 2
- FIG. 19 is a diagram illustrating an example of a two-hand alternating open / close finger tap task instruction screen 260b.
- On the two-hand open/close finger tap task instruction screen 260a and the two-hand alternate open/close finger tap task instruction screen 260b, two opening/closing movement designation areas 261 are displayed so that the subject can perform the opening/closing movement of two fingers with both hands.
- The two-hand open/close finger tap task and the two-hand alternate open/close finger tap task are the same in that the subject opens and closes two fingers of each hand, but differ in that in the former the two fingers of both hands open and close at the same time, whereas in the latter the two hands open and close alternately.
- FIG. 20 is a diagram illustrating an example of a time transition graph of the distance L (t) between two fingers created when the open / close finger tap task is performed.
- The instruction/detection data comparison unit 31 (see FIG. 1) creates a time transition graph of the distance L(t) between the two fingers as shown in FIG. 20, using the distance between the two fingers acquired by the detection data acquisition unit 21, for example, every 10 milliseconds.
- the horizontal axis represents the elapsed time from the start of the open / close finger tap task
- the vertical axis represents the distance L (t) between the two fingers at the elapsed time t.
- A missing portion 311 may occur in the acquired distance L(t) between the two fingers, but in many cases the missing portion 311 can be interpolated using a general interpolation method such as spline interpolation.
- The position accuracy calculation unit 32 obtains, from the time transition graph of the distance L(t) between the two fingers in FIG. 20, the amplitude A i of each opening/closing of the two fingers (the difference between the maximum value and the minimum value of L(t) for each opening/closing operation), and calculates its standard deviation. The position accuracy calculation unit 32 then defines the calculated standard deviation of the amplitudes A i as the position accuracy m d.
- Similarly, the time-series accuracy calculation unit 33 obtains, from the time transition graph of the distance L(t) between the two fingers in FIG. 20, the time interval t i of each opening/closing of the two fingers (the time from one maximum or minimum value of L(t) to the next), and calculates its standard deviation. The calculated standard deviation of the time intervals t i is then defined as the time-series accuracy m T.
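A sketch of extracting the amplitudes A i and intervals t i from a sampled L(t), then taking their standard deviations as m d and m T. The peak detection is a simple local maximum/minimum scan, and the sample values are illustrative (the closed distance returns to 0 here, so each amplitude equals its peak value):

```python
# Sketch: m_d and m_T of the open/close finger tap task from L(t).
def local_extrema(samples):
    """Indices of local maxima and minima of the sampled L(t)."""
    maxima, minima = [], []
    for i in range(1, len(samples) - 1):
        if samples[i - 1] < samples[i] > samples[i + 1]:
            maxima.append(i)
        elif samples[i - 1] > samples[i] < samples[i + 1]:
            minima.append(i)
    return maxima, minima

def std(values):
    mean = sum(values) / len(values)
    return (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5

# L(t) sampled every 10 ms: three open/close cycles with varying peaks
L = [0, 30, 60, 30, 0, 35, 70, 35, 0, 25, 50, 25, 0]
maxima, minima = local_extrema(L)
amplitudes = [L[i] for i in maxima]   # peaks 60, 70, 50 (minima are 0 here)
intervals = [(maxima[k + 1] - maxima[k]) * 10 for k in range(len(maxima) - 1)]
m_d = std(amplitudes)   # spread of the opening amplitude
m_T = std(intervals)    # spread of the opening/closing interval (ms)
```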
- The cognitive impairment degree calculation unit 41 can then calculate the cognitive impairment degree S by the same method as that shown in the embodiment.
- Further, the cognitive impairment degree output unit 42 can output the calculated cognitive impairment degree S and a graph of its change over time (see FIG. 9) to the output device 5 such as a liquid crystal display device or a printer.
- the subject or his / her assistant can easily evaluate the degree of cognitive function deterioration (cognitive impairment degree S) including the brain function impairment of the subject by performing the open / close finger tap task.
- FIG. 21 is a diagram illustrating an example of a five-finger touch task instruction screen 270 presented on the body motion presentation device 2.
- The instruction data presentation unit 13 of the physical exercise instruction unit 10 (see FIG. 1) displays a five-finger touch task instruction screen 270 as shown in FIG. 21 on the physical exercise presentation device 2.
- On the five-finger touch task instruction screen 270, ten touch instruction areas 271 corresponding to the five fingers of the left and right hands are displayed by, for example, broken-line circles.
- the instruction data presentation unit 13 selects one of the ten touch instruction areas 271 and displays the touch instruction figure 272 at the position of the selected touch instruction area 271. Therefore, the subject touches the displayed touch instruction graphic 272 or the touch instruction area 271 at the position.
- When the touch instruction figure 272 is touched by the subject, or when a predetermined time has elapsed, the instruction data presentation unit 13 deletes the touch instruction figure 272 that has been displayed, selects another touch instruction area 271, and newly displays the touch instruction figure 272 in the selected touch instruction area 271.
- The instruction data presentation unit 13 repeats the process of selecting one of the touch instruction areas 271 as described above, displaying the touch instruction figure 272 at the position of that touch instruction area 271, and detecting the touch by the subject, a predetermined number of times.
- The touch instruction areas 271 may be selected in a predetermined order, for example, the order of finger arrangement (left little finger → ring finger → middle finger → index finger → thumb → right little finger → ring finger → middle finger → index finger → thumb, etc.), or may be selected in a random order.
- Note that the touch instruction areas 271 may be provided at positions substantially corresponding to the positions of the subject's five fingers, and the positions may be calibrated for each subject. In the case of calibration, the subject touches the display screen (touch panel) with the five fingers of both hands, and the touch instruction areas 271 for the fingers of both hands are set based on the positions of the five fingers touched at that time. Further, the shape of the touch instruction figure 272 is not limited to the circular shape illustrated in FIG. 21, and may be another shape as long as the touch of each finger can be detected separately.
- In the five-finger touch task, the position accuracy m d is obtained by calculating the touch failure rate, as in the case of the rhythm touch task described above. That is, each time the touch instruction figure 272 is displayed, the instruction/detection data comparison unit 31 (see FIG. 1) acquires the touch position coordinates (X, Y) touched by the subject in response to the display. If the touch position coordinates (X, Y) are included in the displayed touch instruction figure 272 or the touch instruction area 271 at that position, the touch is determined to be successful; otherwise, it is determined to have failed.
- The touch failure rate, not the touch success rate, is used because the position accuracy m d is defined so that a smaller value means higher accuracy.
- The cognitive impairment degree calculation unit 41 can then calculate the cognitive impairment degree S by the same method as that shown in the embodiment.
- Further, the cognitive impairment degree output unit 42 can output the calculated cognitive impairment degree S and a graph of its change over time (see FIG. 9) to the output device 5 such as a liquid crystal display device or a printer.
- the subject or his / her assistant can easily evaluate the degree of cognitive function deterioration (cognitive impairment degree S) including the subject's brain dysfunction by performing the five-finger touch task.
- FIG. 22 is a diagram illustrating an example of use of a five-finger touch task.
- In this example of use, the instruction data presentation unit 13 first displays the touch instruction figure 272 while selecting the touch instruction areas 271 in a specific order for the right hand, and then displays the touch instruction figure 272 while selecting the touch instruction areas 271 in the same order for the left hand. For example, if the touch instruction figure 272 is displayed in the order little finger → middle finger → thumb → index finger → ring finger → ... for the right hand, it is then displayed in the same order little finger → middle finger → thumb → index finger → ring finger → ... for the left hand.
- The subject first touches the displayed touch instruction figure 272, or the touch instruction area 271 at that position, with the fingers of the right hand while following the touch instruction figures 272 displayed in the specific order on the right-hand side. Subsequently, the subject does the same with the fingers of the left hand while following the touch instruction figures 272 displayed in the same order on the left-hand side.
- With this use of the five-finger touch task, it is possible to evaluate, from the cognitive impairment degree S obtained by performing it, whether or not the effect of a movement learned with one hand appears in the other hand. For example, it is generally believed that when cognitive function declines, the ability to transfer a series of motor commands between the left and right motor areas also declines. Used as shown in FIG. 22, the five-finger touch task can therefore also serve as a tool for verifying such common understanding and theories about brain function.
- FIG. 23 is a diagram showing an example of the tracking task instruction screen 280 presented on the body motion presentation device 2.
- The instruction data presentation unit 13 of the physical exercise instruction unit 10 displays a tracking task instruction screen 280 as shown in FIG. 23 on the physical exercise presentation device 2.
- On the tracking task instruction screen 280, two tracking target figures 281a and 281b for the left hand and the right hand are displayed.
- the instruction data presentation unit 13 moves the two tracking target figures 281a and 281b so as to draw different trajectories 282a and 282b in the tracking task instruction screen 280, respectively.
- The subject, while touching the tracking task instruction screen 280 with a finger of each hand, tracks the two tracking target figures 281a and 281b with the respective fingers according to their movements.
- the two tracking target figures 281a and 281b may have different display colors and shapes.
- Here, the distance between the left tracking target figure 281a and the subject's left-finger touch position is called the left-hand error L l (t), and the distance between the right tracking target figure 281b and the right-finger touch position is called the right-hand error L r (t).
- The position accuracy calculation unit 32 calculates the time average of the left-hand error L l (t) and the time average of the right-hand error L r (t), and defines the mean of these two time averages as the position accuracy m d.
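This averaging can be sketched as follows (each error sample is taken as the distance between the target and the touch position at that instant; all sample trajectories are illustrative):

```python
# Sketch: m_d of the tracking task as the mean of the time-averaged
# left-hand and right-hand errors.
import math

def tracking_error(target_xy, touch_xy):
    """Per-sample distances between target and touch positions."""
    return [math.hypot(tx - x, ty - y)
            for (tx, ty), (x, y) in zip(target_xy, touch_xy)]

def position_accuracy(left_err, right_err):
    """Mean of the time averages of the left- and right-hand errors."""
    mean_l = sum(left_err) / len(left_err)
    mean_r = sum(right_err) / len(right_err)
    return (mean_l + mean_r) / 2

target_l = [(0, 0), (10, 0), (20, 0)]
touch_l  = [(3, 4), (10, 5), (20, 0)]     # errors 5, 5, 0
target_r = [(0, 50), (10, 50), (20, 50)]
touch_r  = [(0, 50), (16, 58), (20, 50)]  # errors 0, 10, 0
L_l = tracking_error(target_l, touch_l)
L_r = tracking_error(target_r, touch_r)
m_d = position_accuracy(L_l, L_r)         # ((10/3) + (10/3)) / 2
```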
- FIG. 24 is a diagram showing the relationship between the position coordinates of the tracking target figure and the touch position coordinates of the subject as an example of a schematic time transition change graph.
- FIG. 24A is an example of the time transition of the X coordinate, and FIG. 24B is an example of the time transition of the Y coordinate.
- a broken line 351 in FIG. 24A represents the change over time of the X coordinate of the left (or right) tracking target graphic 281a (or 281b) shown in FIG. 23, that is, X0l(t) (or X0r(t)).
- a solid line 352 represents a change over time of the X coordinate X l (t) (or X r (t)) of the touch position coordinate by the left finger (or right finger) of the subject.
- the broken line 361 in FIG. 24B represents the change over time of the Y coordinate of the left (or right) tracking target graphic 281a (or 281b) shown in FIG. 23, that is, Y0l(t) (or Y0r(t)).
- a solid line 362 represents a change over time of the Y coordinate Y l (t) (or Y r (t)) of the touch position coordinate by the left finger (or right finger) of the subject.
- the time series accuracy calculation unit 33 calculates the cross-correlation function FXl(τ) (or FXr(τ)) between the function X0l(t) (or X0r(t)) given by the X coordinate of the tracking target graphic 281a (or 281b) and the function Xl(t) (or Xr(t)) given by the X coordinate of the touch position of the subject's left (or right) finger.
- similarly, the time series accuracy calculation unit 33 calculates the cross-correlation function FYl(τ) (or FYr(τ)) between the function Y0l(t) (or Y0r(t)) given by the Y coordinate of the tracking target graphic 281a (or 281b) and the function Yl(t) (or Yr(t)) given by the Y coordinate of the touch position of the subject's left (or right) finger.
- the cross-correlation function is often used to evaluate the correlation between two time-series signals when one of them is shifted by a time τ.
- the time series accuracy calculation unit 33 thus obtains four cross-correlation functions FXl(τ), FXr(τ), FYl(τ), and FYr(τ), and calculates the shift times τXl, τXr, τYl, and τYr at which the value of each cross-correlation function is maximized.
- the shift time τXl at which the value of the cross-correlation function FXl(τ) is maximized means that when the function X0l(t) represented by the broken line 351 in FIG. 24A and the function Xl(t) represented by the solid line 352 are shifted relative to each other by τXl, the degree of coincidence between the two graphs becomes greatest.
- the shift time τXl obtained from the cross-correlation function FXl(τ) is therefore suitable as an index of the time series accuracy mT, and so are the shift times τXr, τYl, and τYr. The time series accuracy calculation unit 33 therefore defines the time series accuracy mT in the tracking task as the average of the four shift times τXl, τXr, τYl, and τYr calculated as described above.
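As a concrete illustration of the shift-time computation, the following sketch (our own, with assumed signal names, not the patent's implementation) finds the lag τ that maximizes the cross-correlation of two equally sampled trajectories:

```python
import numpy as np

def best_lag(target, touch, dt=0.01):
    """Return the shift time (in seconds) at which the cross-correlation
    of two equally sampled 1-D signals is maximized.

    `dt` is the sampling interval; a positive result means `touch`
    lags behind `target`.
    """
    target = np.asarray(target, dtype=float) - np.mean(target)
    touch = np.asarray(touch, dtype=float) - np.mean(touch)
    corr = np.correlate(touch, target, mode="full")  # F(tau) over all shifts
    lags = np.arange(-(len(target) - 1), len(touch))  # lag in samples
    return lags[np.argmax(corr)] * dt
```

Applying this to X0l(t) vs. Xl(t), X0r(t) vs. Xr(t), Y0l(t) vs. Yl(t), and Y0r(t) vs. Yr(t) gives the four shift times whose average defines mT.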
- the cognitive impairment degree calculation unit 41 can calculate the cognitive impairment degree S by the same method as shown in the embodiment of the reaching task.
- the cognitive impairment degree output unit 42 can output the calculated cognitive impairment degree S and a graph of its temporal change (see FIG. 9) to the output device 5, such as a liquid crystal display device or a printer.
- by performing the tracking task, the subject or an assistant can thus easily evaluate the degree of decline in cognitive function (the cognitive impairment degree S), including brain dysfunction, of the subject.
- FIG. 25 is a diagram showing an example of the overall configuration of a brain dysfunction evaluation apparatus 100a according to a modification of the embodiment of the present invention.
- the brain dysfunction evaluation apparatus 100a shown in FIG. 25 realizes functions substantially equivalent to those of the brain dysfunction evaluation apparatus 100 shown in FIG. 1 by separating them into a terminal device 101 and a server device 102 connected to each other via the communication network 8.
- the terminal device 101 plays a role of presenting physical motion to the subject and acquiring data on the physical motion of the subject.
- the server device 102 receives, via the communication network 8, the physical movement data of the subject acquired by the terminal device 101, and evaluates the subject's degree of cognitive impairment based on that data. Otherwise, the configuration and functions of the brain dysfunction evaluation apparatus 100a are the same as those of the brain dysfunction evaluation apparatus 100 shown in FIG. 1, so only the differences are described below.
- the data processing device 1a of the terminal device 101 is configured by removing the body motion accuracy calculation unit 30 and the cognitive impairment level evaluation unit 40 from the data processing device 1 of the brain dysfunction evaluation apparatus 100 of FIG. 1 and adding a cognitive impairment level output unit 70.
- the terminal device 101 is newly provided with a communication device 7a that connects the data processing device 1a to the communication network 8.
- the data transmission/reception unit 60a transmits the detection data and calibration data of the subject's body movement acquired by the body movement data acquisition unit 20 to the server device 102 via the communication device 7a and the communication network 8.
- the data transmission/reception unit 60a also receives data such as the subject's degree of cognitive impairment evaluated by the server device 102.
- the cognitive impairment degree output unit 70 outputs the data, such as the subject's degree of cognitive impairment evaluated by the server device 102 and received via the data transmission/reception unit 60a, to the output device 5.
- the server device 102 includes a data processing device 1b, an operation input device 4b, an output device 5b, a storage device 6, a communication device 7b, and the like.
- the data processing device 1b includes the body motion accuracy calculation unit 30, the cognitive impairment degree evaluation unit 40, the data transmission/reception unit 60b, and the like.
- the data transmission/reception unit 60b receives the detection data and calibration data of the subject's body movement transmitted from the terminal device 101, and transmits data such as the subject's degree of cognitive impairment evaluated by the cognitive impairment degree evaluation unit 40 to the terminal device 101.
- the terminal device 101 having the above-described configuration can be realized by a personal computer, a tablet terminal, a smartphone, or the like possessed by a doctor, a subject, or an assistant thereof.
- the server apparatus 102 can be realized by a high-performance personal computer, workstation, general-purpose computer, or the like.
- a plurality of terminal devices 101 may be connected to one server device 102 via a communication network.
- the terminal device 101 only acquires the subject's physical movement data and displays the evaluation results; therefore, even if the terminal device 101 is lost, no data on the subject's degree of cognitive impairment leaks out. Moreover, evaluation results such as the subject's degree of cognitive impairment are stored in the storage device 6 of the server device 102, making them easy for related doctors, nurses, caregivers, and others to access. Providing the server device 102 also makes it easy to connect the brain dysfunction evaluation apparatus 100a to systems that manage other medical and health information, such as an electronic medical record system, a medication recording system, or a health management system.
- the brain dysfunction evaluation apparatuses 100 and 100a display the physical exercise task selection screen 120 (see FIG. 3), on which the names of a plurality of physical exercise tasks are shown, let the subject select one physical exercise task, and have the subject perform the selected task.
- the subject may be allowed to select a plurality of physical exercise tasks.
- in that case, the cognitive impairment degree evaluation unit 40 calculates a plurality of cognitive impairment degrees S, one for each physical exercise task, and each of these may be stored in the storage device 6 in association with its task. Alternatively, an overall cognitive impairment degree S, obtained by taking the average or a weighted average of the individual degrees, may be newly calculated and stored in the storage device 6.
- the physical exercise task selection unit 11 may be deleted from the brain dysfunction evaluation apparatus 100, 100a, and the subject may perform only one or more predetermined specific physical exercise tasks.
- the present invention is not limited to the embodiment described above, and includes various modifications.
- the above embodiment has been described in detail for easy understanding of the present invention, and is not necessarily limited to the one having all the configurations described.
- part of the configuration of one embodiment can be replaced with part of the configuration of another embodiment, and part or all of the configuration of another embodiment can also be added to the configuration of a given embodiment.
Abstract
Description
In the embodiments described below, brain dysfunction collectively refers to conditions that cause so-called cognitive decline (Alzheimer-type dementia, cerebrovascular dementia, dementia with Lewy bodies, Parkinson's disease, hydrocephalus, depression, schizophrenia, and the like), and also includes motor impairment caused by stroke and the like. In the description of the embodiments, brain dysfunction may simply be referred to as dementia.
FIG. 1 is a diagram showing an example of the overall configuration of the brain dysfunction evaluation apparatus 100 according to an embodiment of the present invention. As shown in FIG. 1, the brain dysfunction evaluation apparatus 100 is configured by coupling a physical exercise presentation device 2, a physical exercise detection sensor 3, an operation input device 4, an output device 5, a storage device 6, and the like to a data processing device 1 that includes a CPU (Central Processing Unit) and memory (not shown).
The physical exercise detection sensor 3 is constituted by a touch panel sensor (screen-contact sensor) attached to the above-mentioned liquid crystal display device.
The operation input device 4 is constituted by, for example, a keyboard and a mouse, but may also be constituted by a touch panel sensor; in that case, the operation input device 4 may double as the physical exercise detection sensor 3.
The output device 5 is constituted by, for example, a liquid crystal display device or a printer, and may double as the physical exercise presentation device 2.
The storage device 6 is constituted by a hard disk drive, an SSD (Solid State Disk), or the like, and stores data and programs that are determined in advance to be stored.
The physical exercise task selection unit 11 displays a list of physical exercise tasks prepared in advance on the output device 5 (a liquid crystal display device or the like; see FIG. 3 described later), and selects the physical exercise task to be performed based on an input operation on the operation input device 4 by the subject or an assistant.
The instruction data generation unit 12 generates, according to the selected physical exercise task, time-series physical exercise instruction data to be presented to the subject.
The instruction data presentation unit 13 presents the generated time-series physical exercise instruction data, that is, the content of the physical exercise to be performed by the subject, to the subject via the physical exercise presentation device 2 (a liquid crystal display device, an audio output device, or the like).
The detection data acquisition unit 21 acquires, via the physical exercise detection sensor 3 and the like, data on the physical exercise performed by the subject in response to the exercise content presented on the physical exercise presentation device 2 (for example, the position, movement speed, acceleration, and detection time of a specific body part) at predetermined time intervals (for example, 10 milliseconds). That is, the detection data acquisition unit 21 acquires time-series data of the subject's physical exercise.
The calibration unit 22 acquires data on each subject's own auditory, visual, and motor abilities, which do not depend on cognitive impairment, calculates calibration data for each subject from the acquired data, and stores the calibration data in the storage device 6.
The instruction/detection data comparison unit 31 compares the data of the physical exercise that the subject should perform, presented on the physical exercise presentation device 2 (a liquid crystal display device or the like), with the data of the physical exercise actually performed by the subject, acquired via the physical exercise detection sensor 3.
The position accuracy calculation unit 32 calculates the position accuracy of the subject's physical exercise based on the difference data between the position instruction data and the detection data of the subject's physical exercise obtained by the instruction/detection data comparison unit 31.
The time series accuracy calculation unit 33 calculates the time series accuracy of the subject's physical exercise based on the difference data between the instruction timing of the instruction data and the detection timing of the detection data obtained by the instruction/detection data comparison unit 31.
The cognitive impairment degree calculation unit 41 calculates the subject's cognitive impairment degree using the position accuracy and time series accuracy calculated by the body motion accuracy calculation unit 30 and, in addition, the calibration data acquired by the calibration unit 22 and the like.
The cognitive impairment degree output unit 42 displays the cognitive impairment degree data calculated by the cognitive impairment degree calculation unit 41, or data on its change over time, on the output device 5 (a liquid crystal display device or the like; see FIG. 9 described later).
Next, the functions of the functional blocks constituting the data processing device 1 will be described in detail. In this embodiment, the brain dysfunction evaluation apparatus 100 is assumed to be a tablet computer or tablet terminal with a touch panel sensor, and a brain dysfunction evaluation program is assumed to be registered in its data processing device 1 as an application program.
The brain dysfunction evaluation program includes programs that respectively implement the physical exercise instruction unit 10, the physical exercise data acquisition unit 20, the body motion accuracy calculation unit 30, and the cognitive impairment degree evaluation unit 40 of the data processing device 1.
FIG. 2 is a diagram showing an example of the subject registration information display screen 110 displayed when the brain dysfunction evaluation program is started. When the brain dysfunction evaluation program is started by a predetermined operation in the data processing device 1 of the brain dysfunction evaluation apparatus 100, the subject registration information display screen 110 shown in FIG. 2 (initially with the right-hand column blank) is displayed on the output device 5 (liquid crystal display device).
FIG. 4 is a diagram showing an example of the reaching task instruction screen 210 presented on the physical exercise presentation device 2 by the instruction data presentation unit 13. In this embodiment, the instruction data presentation unit 13 of the data processing device 1 presents (displays) figures and characters on the display screen of the liquid crystal display device (physical exercise presentation device 2) with a touch panel sensor (physical exercise detection sensor 3) according to each physical exercise task, thereby instructing the subject where and when to touch the display screen. The reaching task is a task in which a specific figure is presented at a random position on the display screen and the subject touches that figure as quickly as possible.
For healthy subjects, it is known as Fitts's law that there is a fixed relationship between the time required for reaching and the radius of the circular figure 213. Therefore, by using the reaching task instruction screen 210a and varying the radius of the circular figure 213, it is possible to examine from the results whether the relationship of Fitts's law is maintained even when brain functions such as cognitive function decline.
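Fitts's law is commonly written as MT = a + b·log2(D/W + 1), where MT is the movement time, D the distance to the target, and W its width (here related to the radius of the circular figure 213). A minimal sketch of checking whether the relationship holds for measured reaching data follows; the function and variable names are our own illustration, not part of the disclosure:

```python
import numpy as np

def fitts_fit(distances, widths, times):
    """Fit MT = a + b * log2(D/W + 1) by least squares.

    Returns (a, b, r): intercept, slope, and the correlation between
    the index of difficulty and the measured movement times. An r close
    to 1 indicates that Fitts's law is well maintained.
    """
    ids = np.log2(np.asarray(distances, dtype=float) /
                  np.asarray(widths, dtype=float) + 1.0)
    times = np.asarray(times, dtype=float)
    b, a = np.polyfit(ids, times, 1)       # slope first, then intercept
    r = np.corrcoef(ids, times)[0, 1]
    return a, b, r
```

Varying the radius of the circular figure 213 varies W, so repeated measurements give the (D, W, MT) triples the fit needs.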
When a reaching task as shown in FIG. 4 or FIG. 5 is performed, the detection data acquisition unit 21 acquires time-series data on the coordinates (X, Y) of the positions touched by the subject and the times t. Hereinafter, in this specification, this time-series data of coordinates (X, Y) and times t is written as (Xi(ti), Yi(ti)), or simply (X(t), Y(t)), where i = 1, 2, …, N and N is the number of reaching repetitions.
The position accuracy is the degree of coincidence between the position of the figure presented by the instruction data presentation unit 13 and the position touched by the subject. In the reaching task, the position accuracy md can therefore be defined, for example, as the average value (= ΣDi/N) of the touch position errors Di calculated by the instruction/detection data comparison unit 31, and is calculated by the position accuracy calculation unit 32.
The cognitive impairment degree S is calculated as a value integrating the position accuracy md calculated by the position accuracy calculation unit 32 and the time series accuracy mT calculated by the time series accuracy calculation unit 33. There are various calculation methods, as shown below, and any of them may be used.
The simplest way to calculate the cognitive impairment degree S is to normalize the position accuracy md and the time series accuracy mT and simply add them together.
Here, the normalization of the position accuracy md uses the mean MC(mdj) and standard deviation σC(mdj) of the position accuracies mdj (j = 1, 2, …, P, where P is the number of subjects in the healthy group) obtained by having a plurality of healthy subjects perform the same reaching task in advance. Similarly, the normalization of the time series accuracy mT uses the mean MC(mTj) and standard deviation σC(mTj) of the time series accuracies mTj (j = 1, 2, …, P, where P is the number of subjects in the healthy group) obtained from the reaching task performed in advance by the plurality of healthy subjects.
As a second method, the cognitive impairment degree S may be calculated by weighting the position accuracy md and the time series accuracy mT according to their respective importance and adding them. In that case, denoting the weights of the position accuracy md and the time series accuracy mT by Id and IT, the cognitive impairment degree S can be calculated by the following equation (4).
In performing this calculation, it is assumed that, for the PC subjects in the healthy group, the mean MC(mdj) and standard deviation σC(mdj) of the position accuracy mdj and the mean MC(mTj) and standard deviation σC(mTj) of the time series accuracy mTj have already been obtained, and that, for the PP subjects in the dementia group, the mean MP(mdk) and standard deviation σP(mdk) of the position accuracy mdk and the mean MP(mTk) and standard deviation σP(mTk) of the time series accuracy mTk have already been obtained and stored in the storage device 6, where j = 1, 2, …, PC and k = 1, 2, …, PP.
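Equation (4) itself is not reproduced in this excerpt. A plausible form consistent with the surrounding description, in which the weights Id and IT are applied to the accuracies normalized by the healthy-group statistics (our reconstruction, not the patent's verbatim equation), is:

```latex
S = I_d \cdot \frac{m_d - M_C(m_{dj})}{\sigma_C(m_{dj})}
  + I_T \cdot \frac{m_T - M_C(m_{Tj})}{\sigma_C(m_{Tj})}
\qquad (4)
```

Setting Id = IT = 1 recovers the simple normalized sum described as the first method.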
As another example of evaluating the cognitive impairment degree S, an example using multivariate analysis is shown. FIG. 6 schematically shows an example of evaluating the cognitive impairment degree S using multivariate analysis. The graph of FIG. 6 is a scatter plot of each subject's time series accuracy mT (mean touch delay time) on the horizontal axis and position accuracy md (mean touch position error) on the vertical axis.
The calibration unit 22 evaluates the subject's auditory, visual, motor, and other abilities in advance and subtracts their influence on the task used to evaluate the cognitive impairment degree S (the reaching task in this embodiment). In that case, all influences, such as hearing, vision, and motor ability, may be subtracted, or only one of them may be subtracted.
The cognitive impairment degree output unit 42 stores the cognitive impairment degree S calculated by the cognitive impairment degree calculation unit 41 in the storage device 6. At this time, it is advisable to store not only the cognitive impairment degree S but also the measurement date and time, the subject ID, and the subject's age and sex. Scores such as MMSE, FAB, and the Hasegawa scale obtained from a doctor's interview may also be stored together.
In the embodiments described so far, the cognitive impairment degree S is calculated based on the results of the subject performing the reaching task, but it may also be calculated based on the results of other physical exercise tasks. Examples of other physical exercise tasks for obtaining the cognitive impairment degree S are described below; where the method of calculating S differs from the reaching task embodiment described above, the differences are also described.
(a. Rhythm touch task with auditory stimulation)
FIG. 10 is a diagram showing an example of the one-handed rhythm touch task instruction screen 230 with auditory stimulation presented on the physical exercise presentation device 2.
When the one-handed rhythm touch task with auditory stimulation is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 of the physical exercise instruction unit 10 (see FIG. 1) displays the one-handed rhythm touch task instruction screen 230 shown in FIG. 10 on the physical exercise presentation device 2. At this time, a circular figure 231 to be touched by the subject is displayed in the one-handed rhythm touch task instruction screen 230.
Since the subject is often elderly, the touch instruction sound for auditory stimulation is preferably somewhat loud and avoids the high-frequency range (the same applies to the descriptions of FIGS. 11 and 12 below).
The effect of this one-handed rhythm touch task also applies to the other rhythm touch tasks described below.
When the two-handed rhythm touch task with auditory stimulation is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 displays the two-handed rhythm touch task instruction screen 230a shown in FIG. 11 on the physical exercise presentation device 2. At this time, two circular figures 231, to be touched with the subject's right and left hands respectively, are displayed in the screen 230a.
When the two-handed alternating rhythm touch task is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 displays the two-handed alternating rhythm touch task instruction screen 230b shown in FIG. 12 on the physical exercise presentation device 2. At this time, two circular figures 231, to be touched with the subject's right and left hands respectively, are displayed in the screen 230b.
FIG. 13 is a diagram showing an example of the one-handed rhythm touch task instruction screen 240 with visual stimulation presented on the physical exercise presentation device 2.
When the one-handed rhythm touch task with visual stimulation is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 displays the one-handed rhythm touch task instruction screen 240 shown in FIG. 13 on the physical exercise presentation device 2. At this time, a circular figure 241 to be touched by the subject is displayed in the screen 240.
The touch instruction figure 242 for visual stimulation is not limited to a black circle. Since the subject is often elderly, a conspicuous shape containing vivid primary colors is rather preferable (the same applies to the descriptions of FIGS. 14 and 15).
When the two-handed rhythm touch task with visual stimulation is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 displays the two-handed rhythm touch task instruction screen 240a shown in FIG. 14 on the physical exercise presentation device 2. At this time, two circular figures 241, to be touched with the subject's right and left hands respectively, are displayed in the screen 240a.
When the two-handed alternating rhythm touch task with visual stimulation is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 displays the two-handed alternating rhythm touch task instruction screen 240b shown in FIG. 15 on the physical exercise presentation device 2. At this time, two circular figures 241, to be touched with the subject's right and left hands respectively, are displayed in the screen 240b.
FIG. 16 is a diagram showing an example of the metronome-type rhythm touch task instruction screen 250 presented on the physical exercise presentation device 2.
When the metronome-type rhythm touch task is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 displays the metronome-type rhythm touch task instruction screen 250 shown in FIG. 16 on the physical exercise presentation device 2. At this time, two circular figures 251, to be touched with the subject's right and left hands, are displayed in the screen 250, together with a metronome-like pendulum 252 and a fan-shaped figure 253 representing its swing range.
In each of the rhythm touch tasks described above, the instruction/detection data comparison unit 31 (see FIG. 1) determines whether the subject's touch position coordinates (X, Y) acquired by the detection data acquisition unit 21 fall within the previously displayed circular figure 231, 241, or 251.
Using the touch failure rate rather than the touch success rate as the position accuracy md may feel slightly unnatural, but this is because, in this specification, the position accuracy md is defined so that a smaller value means a higher accuracy.
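The judgment above reduces to a point-in-circle test per touch, followed by a failure-rate computation; the following sketch (our own function names and an assumed circular target) illustrates both:

```python
def is_touch_success(touch_x, touch_y, circle_x, circle_y, radius):
    """Return True if the touch position (X, Y) falls inside the
    displayed circular figure (center and radius in screen units)."""
    dx = touch_x - circle_x
    dy = touch_y - circle_y
    return dx * dx + dy * dy <= radius * radius

def touch_failure_rate(results):
    """results: iterable of booleans (True = successful touch).
    Failure rate = failures / (successes + failures), used as m_d."""
    results = list(results)
    return results.count(False) / len(results)
```

A failure rate of 0 thus corresponds to perfect position accuracy, consistent with the smaller-is-better definition of md.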
FIG. 17 is a diagram showing an example of the one-handed finger opening/closing tap task instruction screen 260 presented on the physical exercise presentation device 2.
When the one-handed finger opening/closing tap task is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 of the physical exercise instruction unit 10 (see FIG. 1) displays the one-handed finger opening/closing tap task instruction screen 260 shown in FIG. 17 on the physical exercise presentation device 2. Here, the one-handed finger opening/closing tap task is a physical exercise task in which the subject repeatedly opens and closes two fingers of one hand (for example, the thumb and index finger).
FIG. 20 is a diagram showing an example of the time transition graph of the two-finger distance L(t) created when the finger opening/closing tap task is performed.
When the finger opening/closing tap task described with reference to FIGS. 17 to 19 is performed, the instruction/detection data comparison unit 31 (see FIG. 1) creates a time transition graph of the two-finger distance L(t), as shown in FIG. 20, using the two-finger distance L(t) acquired by the detection data acquisition unit 21 at, for example, 10-millisecond intervals. In the graph of FIG. 20, the horizontal axis represents the elapsed time from the start of the finger opening/closing tap task, and the vertical axis represents the two-finger distance L(t) at elapsed time t.
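From such an L(t) time series, simple features such as the opening amplitude and the tap rate can be extracted. The following sketch is our own illustration (not the patent's method): it estimates the tap rate from downward crossings of the mean distance level.

```python
def tap_metrics(distances, dt=0.01):
    """Given the two-finger distance L(t) sampled every dt seconds,
    return (amplitude, taps_per_second).

    Amplitude is the max minus min distance; the tap rate is estimated
    from downward crossings of the mean level of the signal.
    """
    mean = sum(distances) / len(distances)
    amplitude = max(distances) - min(distances)
    crossings = sum(
        1
        for a, b in zip(distances, distances[1:])
        if a >= mean > b  # downward crossing of the mean level
    )
    duration = dt * (len(distances) - 1)
    return amplitude, crossings / duration
```

With a 10-millisecond sampling interval, dt = 0.01 matches the acquisition rate mentioned above.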
FIG. 21 is a diagram showing an example of the five-finger touch task instruction screen 270 presented on the physical exercise presentation device 2.
When the five-finger touch task is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 of the physical exercise instruction unit 10 (see FIG. 1) displays the five-finger touch task instruction screen 270 shown in FIG. 21 on the physical exercise presentation device 2. At this time, ten touch instruction areas 271, corresponding to the five fingers of each hand, are displayed in the screen 270, for example, as broken-line circles.
The shape of the touch instruction figure 272 is not limited to the circle shown in FIG. 21; any other shape may be used as long as touches by the five fingers of both hands can be detected separately.
In the five-finger touch task described above, the position accuracy md is obtained by calculating the touch failure rate, as in the rhythm touch tasks described above.
That is, each time the touch instruction figure 272 is displayed, the instruction/detection data comparison unit 31 (see FIG. 1) acquires the touch position coordinates (X, Y) touched by the subject in response to the display. If the touch position coordinates (X, Y) fall within the previously displayed touch instruction figure 272 or the touch instruction area 271 at its position, the touch is judged successful; otherwise, it is judged a failure. The position accuracy calculation unit 32 then calculates the touch failure rate (= number of failed touches / (number of successful touches + number of failed touches)) and defines the calculated touch failure rate as the position accuracy md.
As noted above, the touch failure rate rather than the touch success rate is used as the position accuracy md because, in this specification, the position accuracy md is defined so that a smaller value means a higher accuracy.
FIG. 22 is a diagram explaining one usage example of the five-finger touch task. As shown in FIG. 22, the instruction data presentation unit 13 first displays the touch instruction figures 272 for the right hand while selecting the touch instruction areas 271 in a specific order, and then displays the touch instruction figures 272 for the left hand while selecting the touch instruction areas 271 in the same order as for the right hand. For example, if the touch instruction figures 272 are displayed for the left hand in the order little finger → middle finger → thumb → index finger → ring finger → …, they are likewise displayed for the right hand in the same order: little finger → middle finger → thumb → index finger → ring finger → ….
FIG. 23 is a diagram showing an example of the tracking task instruction screen 280 presented on the physical exercise presentation device 2.
When the tracking task is selected via the physical exercise task selection screen 120 (see FIG. 3), the instruction data presentation unit 13 of the physical exercise instruction unit 10 (see FIG. 1) displays the tracking task instruction screen 280 shown in FIG. 23 on the physical exercise presentation device 2. At this time, two tracking target figures 281a and 281b, for the left hand and the right hand, are displayed in the screen 280.
When the tracking task is performed, the instruction/detection data comparison unit 31 (see FIG. 1) obtains the distance between the position coordinates (X0j(t), Y0j(t)) (j = l, r) along which the tracking target figures 281a and 281b move and the touch position coordinates (Xj(t), Yj(t)) (j = l, r) of the subject's left and right fingers, and denotes this distance by Lj(t) (j = l, r), where j = l means left and j = r means right. Ll(t) is called the left-hand error and Lr(t) the right-hand error.
FIG. 24 is a diagram showing the relationship between the position coordinates of the tracking target figure and the subject's touch position coordinates as examples of schematic time transition graphs, in which (a) is an example of the time transition of the X coordinate and (b) is an example of the time transition of the Y coordinate.
Similarly, the broken line 361 in FIG. 24(b) represents the change over time of the Y coordinate of the left (or right) tracking target figure 281a (or 281b) shown in FIG. 23, that is, Y0l(t) (or Y0r(t)). The solid line 362 represents the change over time of the Y coordinate Yl(t) (or Yr(t)) of the touch position of the subject's left (or right) finger.
Similarly, the time series accuracy calculation unit 33 calculates the cross-correlation function FYl(τ) (or FYr(τ)) between the function Y0l(t) (or Y0r(t)) given by the Y coordinate of the tracking target figure 281a (or 281b) and the function Yl(t) (or Yr(t)) given by the Y coordinate of the touch position of the subject's left (or right) finger.
The cross-correlation function is a function often used to evaluate the correlation between two time-series signals when one of them is shifted by a time τ.
Here, the shift time τXl at which the value of the cross-correlation function FXl(τ) is maximized means that when one of the function X0l(t) represented by the broken line 351 in FIG. 24(a) and the function Xl(t) represented by the solid line 352 is shifted by τXl, the degree of coincidence between the two graphs becomes greatest.
The time series accuracy calculation unit 33 therefore takes the average of the four shift times τXl, τXr, τYl, and τYr calculated as described above and defines it as the time series accuracy mT in the tracking task.
<4.1 Modification of the configuration of the brain dysfunction evaluation apparatus>
FIG. 25 is a diagram showing an example of the overall configuration of a brain dysfunction evaluation apparatus 100a according to a modification of the embodiment of the present invention. The brain dysfunction evaluation apparatus 100a shown in FIG. 25 realizes functions substantially equivalent to those of the brain dysfunction evaluation apparatus 100 shown in FIG. 1 by separating them into a terminal device 101 and a server device 102 connected to each other via a communication network 8.
<4.2 Other modifications>
In the embodiment of the present invention and its modification described above, the brain dysfunction evaluation apparatuses 100 and 100a let the subject select one physical exercise task via the physical exercise task selection screen 120 (see FIG. 3), on which the names of a plurality of physical exercise tasks are displayed, and have the subject perform the selected task. The subject may instead be allowed to select a plurality of physical exercise tasks. In that case, the cognitive impairment degree evaluation unit 40 calculates a plurality of cognitive impairment degrees S, one for each physical exercise task; each of these may be stored in the storage device 6 in association with its task. Alternatively, an overall cognitive impairment degree S, obtained by taking the average or a weighted average of the plurality of cognitive impairment degrees S, may be newly calculated and the result stored in the storage device 6.
2 Physical exercise presentation device
3 Physical exercise detection sensor
4, 4b Operation input device
5, 5b Output device
6 Storage device
7a, 7b Communication device
8 Communication network
10 Physical exercise instruction unit
11 Physical exercise task selection unit
12 Instruction data generation unit
13 Instruction data presentation unit
20 Physical exercise data acquisition unit
21 Detection data acquisition unit
22 Calibration unit
30 Body motion accuracy calculation unit
31 Instruction/detection data comparison unit
32 Position accuracy calculation unit
33 Time series accuracy calculation unit
40 Cognitive impairment degree evaluation unit
41 Cognitive impairment degree calculation unit
42 Cognitive impairment degree output unit
60a, 60b Data transmission/reception unit
70 Cognitive impairment degree output unit
100, 100a Brain dysfunction evaluation apparatus
101 Terminal device
102 Server device
110 Subject registration information display screen
120 Physical exercise task selection screen
121 Check mark
210, 210a Reaching task instruction screen
211 Black circle figure
212, 222 Cross-shaped figure
213, 225, 228, 231, 241, 251 Circular figure
230 One-handed rhythm touch task instruction screen
230a Two-handed rhythm touch task instruction screen
230b Two-handed alternating rhythm touch task instruction screen
240 One-handed rhythm touch task instruction screen
240a Two-handed rhythm touch task instruction screen
240b Two-handed alternating rhythm touch task instruction screen
242 Touch instruction figure
250 Metronome-type rhythm touch task instruction screen
252 Pendulum
253 Fan-shaped figure
260 One-handed finger opening/closing tap task instruction screen
260a Two-handed finger opening/closing tap task instruction screen
260b Two-handed alternating finger opening/closing tap task instruction screen
261 Fan-shaped figure
261 Opening/closing movement designation area
262, 263 Double-headed arrow
270 Five-finger touch task instruction screen
271 Touch instruction area
272 Touch instruction figure
280 Tracking task instruction screen
281a Tracking target figure
282a Trajectory
301, 305 Axis
310 Straight line
311 Missing portion
351 Broken line
352 Solid line
Claims (15)
- A brain dysfunction evaluation method, wherein a data processing device, connected to a physical exercise presentation device that presents a physical exercise to be performed by a subject and to a physical exercise detection sensor that detects physical exercise data of the exercise performed by the subject in response to the presented exercise, executes:
a physical exercise instruction step of generating physical exercise instruction data to be performed by the subject, presenting the physical exercise based on the generated instruction data to the subject via the physical exercise presentation device, and instructing the subject to perform it;
a physical exercise data acquisition step of acquiring, in time series via the physical exercise detection sensor, the physical exercise data of the exercise performed by the subject;
a body motion accuracy calculation step of calculating a position accuracy and a time series accuracy of the subject's physical exercise based on the physical exercise instruction data and the physical exercise data; and
a cognitive impairment degree evaluation step of evaluating the degree of cognitive impairment of the subject by comparing a value representing the accuracy of the subject's physical exercise, obtained from the calculated position accuracy and time series accuracy, with previously acquired statistical data representing the physical exercise accuracy of healthy subjects. - The brain dysfunction evaluation method according to claim 1, wherein the data processing device further executes:
a first process of presenting, in the physical exercise instruction step, a physical exercise for acquiring at least one of the visual, auditory, and motor abilities individually possessed by the subject; and
a second process of acquiring, in the physical exercise data acquisition step, the physical exercise data of the exercise performed by the subject in accordance with the exercise presented by the first process, as calibration data from which the influence of at least one of the subject's visual, auditory, and motor abilities is excluded. - The brain dysfunction evaluation method according to claim 1, wherein the data processing device outputs, to an output device, at least one of the data on the subject's degree of cognitive impairment evaluated in the cognitive impairment degree evaluation step and the data on its change over time. - The brain dysfunction evaluation method according to claim 1, wherein, in the physical exercise instruction step, the data processing device presents, on the physical exercise presentation device, a physical exercise in which the subject opens and closes two fingers while keeping the two fingers touching the physical exercise detection sensor. - The brain dysfunction evaluation method according to claim 1, wherein, in the physical exercise instruction step, the data processing device executes at least one of a process of displaying, on the physical exercise presentation device, a figure that stimulates the subject's vision and a process of outputting, from the physical exercise presentation device, a sound that stimulates the subject's hearing, and also executes a process of presenting, on the physical exercise presentation device, a physical exercise in which the subject touches the physical exercise detection sensor in response to the stimulus. - The brain dysfunction evaluation method according to claim 1, wherein, in the physical exercise instruction step, the data processing device presents a moving figure on the physical exercise presentation device and presents, on the physical exercise presentation device, a physical exercise in which the subject tracks the moving figure while keeping a finger touching the physical exercise detection sensor. - A program for causing a data processing device, connected to a physical exercise presentation device that presents a physical exercise to be performed by a subject and to a physical exercise detection sensor that detects performance data of the exercise performed by the subject in response to the presented exercise, to execute:
a physical exercise instruction step of generating physical exercise instruction data to be performed by the subject, presenting the physical exercise based on the generated instruction data to the subject via the physical exercise presentation device, and instructing the subject to perform it;
a physical exercise data acquisition step of acquiring, in time series via the physical exercise detection sensor, the physical exercise data of the exercise performed by the subject;
a body motion accuracy calculation step of calculating a position accuracy and a time series accuracy of the subject's physical exercise based on the physical exercise instruction data and the physical exercise data; and
a cognitive impairment degree evaluation step of evaluating the degree of cognitive impairment of the subject by comparing a value representing the accuracy of the subject's physical exercise, obtained from the calculated position accuracy and time series accuracy, with previously acquired statistical data representing the physical exercise accuracy of healthy subjects. - The program according to claim 7, further causing the data processing device to execute: a first process of presenting, in the physical exercise instruction step, a physical exercise for acquiring at least one of the visual, auditory, and motor abilities individually possessed by the subject; and a second process of acquiring, in the physical exercise data acquisition step, the physical exercise data of the exercise performed by the subject in accordance with the exercise presented by the first process, as calibration data from which the influence of at least one of the subject's visual, auditory, and motor abilities is excluded. - The program according to claim 7, further causing the data processing device to execute a step of outputting, to an output device, at least one of the data on the subject's degree of cognitive impairment evaluated in the cognitive impairment degree evaluation step and the data on its change over time. - The program according to claim 7, causing the data processing device to execute, in the physical exercise instruction step, a process of presenting, on the physical exercise presentation device, a physical exercise in which the subject opens and closes two fingers while keeping the two fingers touching the physical exercise detection sensor. - The program according to claim 7, causing the data processing device to execute, in the physical exercise instruction step, at least one of a process of displaying, on the physical exercise presentation device, a figure that stimulates the subject's vision and a process of outputting, from the physical exercise presentation device, a sound that stimulates the subject's hearing, together with a process of presenting, on the physical exercise presentation device, a physical exercise in which the subject touches the physical exercise detection sensor in response to the stimulus. - The program according to claim 7, causing the data processing device to execute, in the physical exercise instruction step, a process of presenting a moving figure on the physical exercise presentation device and presenting, on the physical exercise presentation device, a physical exercise in which the subject tracks the moving figure while keeping a finger touching the physical exercise detection sensor. - A brain dysfunction evaluation apparatus comprising: a physical exercise presentation device that presents a physical exercise to be performed by a subject;
a physical exercise detection sensor that detects physical exercise data of the exercise performed by the subject in response to the presented exercise; and
a data processing device connected to the physical exercise presentation device and the physical exercise detection sensor,
wherein the data processing device includes:
a physical exercise instruction unit that generates physical exercise instruction data to be performed by the subject, presents the physical exercise based on the generated instruction data to the subject via the physical exercise presentation device, and instructs the subject to perform it;
a physical exercise data acquisition unit that acquires, in time series via the physical exercise detection sensor, the physical exercise data of the exercise performed by the subject;
a body motion accuracy calculation unit that calculates a position accuracy and a time series accuracy of the subject's physical exercise based on the physical exercise instruction data and the physical exercise data; and
a cognitive impairment degree evaluation unit that evaluates the degree of cognitive impairment of the subject by comparing a value representing the accuracy of the subject's physical exercise, obtained from the calculated position accuracy and time series accuracy, with previously acquired statistical data representing the physical exercise accuracy of healthy subjects. - The brain dysfunction evaluation apparatus according to claim 13, wherein the data processing device further executes, as processing of the physical exercise instruction unit, a first process of presenting a physical exercise for acquiring at least one of the visual, auditory, and motor abilities individually possessed by the subject, and executes, as processing of the physical exercise data acquisition unit, a second process of acquiring the physical exercise data of the exercise performed by the subject in accordance with the exercise presented in the first process, as calibration data from which the influence of at least one of the subject's visual, auditory, and motor abilities is excluded. - The brain dysfunction evaluation apparatus according to claim 13, comprising: a terminal device including the physical exercise presentation device, the physical exercise detection sensor, and a first data processing device having the physical exercise instruction unit and the physical exercise data acquisition unit; and a server device including a second data processing device having the body motion accuracy calculation unit and the cognitive impairment degree evaluation unit, the terminal device and the server device being connected so as to be able to communicate with each other.
Priority Applications (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US14/893,338 US10478114B2 (en) | 2013-09-11 | 2013-09-11 | Brain dysfunction assessment method, brain dysfunction assessment device, and program thereof |
| JP2015536360A JP6122130B2 (ja) | 2013-09-11 | 2013-09-11 | 脳機能障害評価方法、脳機能障害評価装置およびそのプログラム |
| PCT/JP2013/074582 WO2015037089A1 (ja) | 2013-09-11 | 2013-09-11 | 脳機能障害評価方法、脳機能障害評価装置およびそのプログラム |
| CN201380077561.5A CN105407800B (zh) | 2013-09-11 | 2013-09-11 | 脑功能障碍评价装置及存储介质 |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/JP2013/074582 WO2015037089A1 (ja) | 2013-09-11 | 2013-09-11 | 脳機能障害評価方法、脳機能障害評価装置およびそのプログラム |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2015037089A1 true WO2015037089A1 (ja) | 2015-03-19 |
Family
ID=52665231
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2013/074582 Ceased WO2015037089A1 (ja) | 2013-09-11 | 2013-09-11 | 脳機能障害評価方法、脳機能障害評価装置およびそのプログラム |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US10478114B2 (ja) |
| JP (1) | JP6122130B2 (ja) |
| CN (1) | CN105407800B (ja) |
| WO (1) | WO2015037089A1 (ja) |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2016049282A (ja) * | 2014-08-29 | 2016-04-11 | 日立マクセル株式会社 | 脳機能障害評価システム、脳機能障害評価方法およびプログラム |
| CN106256312A (zh) * | 2015-06-16 | 2016-12-28 | 日立数字映像(中国)有限公司 | 认知功能障碍评价装置 |
| JP2017012339A (ja) * | 2015-06-30 | 2017-01-19 | 芳彦 佐野 | 認知機能観察システム及び認知機能観察方法 |
| JP2017515543A (ja) * | 2014-04-21 | 2017-06-15 | 日立マクセル株式会社 | 脳機能障害評価装置 |
| JP2017217144A (ja) * | 2016-06-06 | 2017-12-14 | マクセルホールディングス株式会社 | 手指運動練習メニュー生成システム、方法、及びプログラム |
| WO2018062173A1 (ja) * | 2016-09-29 | 2018-04-05 | マクセル株式会社 | タスク実行順序決定システムおよびタスク実行方法 |
| WO2018143147A1 (ja) * | 2017-02-01 | 2018-08-09 | 充宏 池田 | 脳機能を測定する情報処理端末及びプログラム |
| WO2018168915A1 (ja) * | 2017-03-14 | 2018-09-20 | 学校法人日本大学 | 生体機能についての医学的検査の得点判定装置、及びプログラム |
| JP2019522514A (ja) * | 2016-06-07 | 2019-08-15 | セレブラル アセスメント システムズ、エルエルシー | 視覚運動応答の定量的評価のための方法およびシステム |
| JP2019531569A (ja) * | 2016-09-14 | 2019-10-31 | エフ ホフマン−ラ ロッシュ アクチェン ゲゼルシャフト | 認知および動作の疾患もしくは障害についてのデジタルバイオマーカー |
| JP2021500183A (ja) * | 2017-10-25 | 2021-01-07 | エフ ホフマン−ラ ロッシュ アクチェン ゲゼルシャフト | 認知および動作の疾患または障害についてのデジタル質測定的バイオマーカー |
| JP2022537197A (ja) * | 2019-06-19 | 2022-08-24 | エフ.ホフマン-ラ ロシュ アーゲー | デジタルバイオマーカー |
| JP2022537267A (ja) * | 2019-06-19 | 2022-08-25 | エフ.ホフマン-ラ ロシュ アーゲー | デジタルバイオマーカー |
| JP2022537742A (ja) * | 2019-06-19 | 2022-08-29 | エフ.ホフマン-ラ ロシュ アーゲー | デジタルバイオマーカー |
| JP2022181549A (ja) * | 2021-05-26 | 2022-12-08 | 国立大学法人 名古屋工業大学 | 認知機能評価プログラム、認知機能評価装置、認知機能評価システム、及び認知機能評価方法 |
| WO2024047738A1 (ja) * | 2022-08-30 | 2024-03-07 | 日本電気株式会社 | 検査装置、検査システム、検査方法、及び非一時的なコンピュータ可読媒体 |
| US12408862B2 (en) | 2019-06-19 | 2025-09-09 | Hoffmann-La Roche Inc. | Digital biomarker |
| JP7770609B1 (ja) * | 2025-08-19 | 2025-11-14 | 社会福祉法人兵庫県社会福祉事業団 | 認知機能評価システム、認知機能評価方法及び認知機能評価プログラム |
Families Citing this family (36)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2961318A4 (en) * | 2013-03-01 | 2016-11-30 | Brainfx Inc | NEUROLOGICAL EVALUATION SYSTEM AND METHOD |
| US9474481B2 (en) | 2013-10-22 | 2016-10-25 | Mindstrong, LLC | Method and system for assessment of cognitive function based on electronic device usage |
| US9420970B2 (en) | 2013-10-22 | 2016-08-23 | Mindstrong, LLC | Method and system for assessment of cognitive function based on mobile device usage |
| US20150196232A1 (en) * | 2014-01-15 | 2015-07-16 | Apptromics LLC | System and method for testing motor and cognitive performance of a human subject with a mobile device |
| US10748439B1 (en) * | 2014-10-13 | 2020-08-18 | The Cognitive Healthcare Company | Automated delivery of unique, equivalent task versions for computer delivered testing environments |
| US10250951B2 (en) | 2014-10-27 | 2019-04-02 | Adobe Inc. | Systems and methods for planning, executing, and reporting a strategic advertising campaign for television |
| US10185971B2 (en) * | 2014-10-27 | 2019-01-22 | Adobe Systems Incorporated | Systems and methods for planning and executing an advertising campaign targeting TV viewers and digital media viewers across formats and screen types |
| US10269454B2 (en) * | 2015-01-06 | 2019-04-23 | Stryker Corporation | Method of configuring devices in an operating theater |
| US11039780B2 (en) * | 2015-10-05 | 2021-06-22 | The Johns Hopkins University | System and method for seizure detection and responsivity testing |
| US20170263146A1 (en) * | 2016-03-11 | 2017-09-14 | Varun Aggarwal | Method and system for building and scoring motor skills tests |
| CN106214165A (zh) * | 2016-07-18 | 2016-12-14 | 北京航空航天大学 | 用于测试车辆驾驶舱内乘员综合反应时的测量仪 |
| WO2018053445A1 (en) * | 2016-09-19 | 2018-03-22 | Baylor College Of Medicine | Instrumented trail making task (itmt) |
| DE102016218356A1 (de) * | 2016-09-23 | 2018-03-29 | Siemens Healthcare Gmbh | Verfahren zum Betrieb einer Magnetresonanzanlage, Datenträger sowie Magnetresonanzanlage |
| WO2018086987A1 (en) * | 2016-11-10 | 2018-05-17 | Koninklijke Philips N.V. | Method and apparatus for determining an indication of cognitive impairment |
| US11291402B2 (en) | 2016-11-10 | 2022-04-05 | Koninklijke Philips N.V. | Method and apparatus for determining an indication of cognitive impairment |
| US10169631B2 (en) * | 2017-03-06 | 2019-01-01 | International Business Machines Corporation | Recognizing fingerprints and fingerprint combinations as inputs |
| CN108567412B (zh) * | 2017-03-08 | 2022-10-21 | 麦克赛尔数字映像(中国)有限公司 | 运动障碍评价装置及方法 |
| CN108962379B (zh) * | 2017-05-26 | 2022-04-22 | 中国科学院软件研究所 | 一种脑神经系统疾病的手机辅助检测系统 |
| WO2019087787A1 (ja) * | 2017-10-30 | 2019-05-09 | マクセル株式会社 | 異常データ処理システムおよび異常データ処理方法 |
| TW201927241A (zh) * | 2017-12-21 | 2019-07-16 | 瑞士商赫孚孟拉羅股份公司 | 用於肌肉失能之數位生物標記 |
| CN113317761B (zh) * | 2018-03-16 | 2024-10-08 | 北京安和福祉科技有限公司 | 一种认知功能障碍预防监测装置 |
| WO2019188405A1 (ja) * | 2018-03-29 | 2019-10-03 | パナソニックIpマネジメント株式会社 | 認知機能評価装置、認知機能評価システム、認知機能評価方法及びプログラム |
| WO2021020128A1 (ja) * | 2019-07-26 | 2021-02-04 | マクセル株式会社 | Tmt検査結果表示のためのシステム、コンピュータプログラムおよび方法 |
| WO2021053580A1 (en) * | 2019-09-17 | 2021-03-25 | Neuroinova, Lda | Method and device for tracking cognitive performance in mild cognitive impairment |
| WO2021086274A1 (en) * | 2019-10-30 | 2021-05-06 | Chulalongkorn University | A stimulating system for collaborative functions of brain and body |
| CN111012315A (zh) * | 2020-01-02 | 2020-04-17 | 辽宁中晨优智医疗技术有限公司 | 一种基于人认知功能的脑健康诊断设备 |
| WO2021253139A1 (en) * | 2020-06-19 | 2021-12-23 | Baycrest Centre For Geriatric Care | Methods for assessing brain health using behavioural and/or electrophysiological measures of visual processing |
| RU2741220C1 (ru) * | 2020-07-23 | 2021-01-22 | Федеральное государственное бюджетное образовательное учреждение высшего образования «Сибирский государственный медицинский университет» Министерства здравоохранения Российской Федерации | Способ диагностики когнитивной дисфункции у пациентов с сахарным диабетом 1-го и 2-го типа с оценкой церебрального кровотока |
| EP4355200A1 (en) * | 2021-06-17 | 2024-04-24 | Altoida Inc. | Method and system for obtaining measurement of cognitive performance |
| CN114052736B (zh) * | 2021-08-31 | 2024-04-05 | Beijing Weiming Naonao Technology Co., Ltd. | Cognitive function evaluation system and method |
| US20230233138A1 (en) * | 2022-01-21 | 2023-07-27 | Haii Corp. | Technique for identifying dementia based on plurality of result data |
| CN114628009B (zh) * | 2022-05-16 | 2022-08-23 | Chengdu University of Traditional Chinese Medicine | Rehabilitation evaluation computer system and operation method |
| US11751799B1 (en) * | 2023-02-08 | 2023-09-12 | Lanny Leo Johnson | Methods and systems for diagnosing cognitive conditions |
| CN116602680B (zh) * | 2023-05-23 | 2025-08-19 | South China University of Technology | Objective quantitative measurement method of cognitive ability combining finger dexterity and span |
| CN119626513A (zh) * | 2024-08-21 | 2025-03-14 | First Affiliated Hospital of Army Medical University of the Chinese People's Liberation Army | Neurology cognitive function evaluation system and method based on object recognition |
| CN120090951A (zh) * | 2025-04-30 | 2025-06-03 | Information and Communication Branch of State Grid Jiangxi Electric Power Co., Ltd. | Method for constructing an intelligent network security terminal protection capability evaluation model |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06327659A (ja) * | 1993-05-25 | 1994-11-29 | Kyoto Densoku Kk | Intelligence testing and intelligence recovery training device |
| JPH11188020A (ja) * | 1997-12-26 | 1999-07-13 | Otsuka Pharmaceutical Co., Ltd. | Diagnostic device for neurological function |
| JP2002369818A (ja) * | 2001-06-14 | 2002-12-24 | Yunimekku:Kk | Support system for diagnosis of brain and nervous system diseases, medication prescription, and function recovery training |
| JP2005508211A (ja) * | 2001-08-10 | 2005-03-31 | CogState Ltd. | Cognitive function testing system and method |
| JP2006320424A (ja) * | 2005-05-17 | 2006-11-30 | Tama-TLO Co., Ltd. | Motion teaching device and method |
| JP2011045520A (ja) * | 2009-08-27 | 2011-03-10 | Hitachi Computer Peripherals Co., Ltd. | Motor function evaluation system, motor function evaluation method, and program |
Family Cites Families (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6413190B1 (en) * | 1999-07-27 | 2002-07-02 | Enhanced Mobility Technologies | Rehabilitation apparatus and method |
| JP3784630B2 (ja) * | 2000-10-06 | 2006-06-14 | 株式会社総合医科学研究所 | Mental examination method and mental function testing device |
| US20020192624A1 (en) * | 2001-05-11 | 2002-12-19 | Darby David G. | System and method of testing cognitive function |
| CN100360076C (zh) * | 2005-04-01 | 2008-01-09 | Zhejiang University of Technology | Computer-aided device for early identification and prediction of senile dementia |
| CN100571622C (zh) * | 2005-09-07 | 2009-12-23 | Xuanwu Hospital, Capital Medical University | Device for detecting hand motor function and method of use |
| CN101657145B (zh) * | 2007-04-13 | 2012-01-25 | Nike International Ltd. | Unitary vision testing center |
| US8979754B2 (en) * | 2007-08-09 | 2015-03-17 | FITS—Functional Interactive Timing System Ltd. | Interactive system and method for neuromotor functioning assessment and training |
| WO2009056650A1 (en) * | 2007-11-02 | 2009-05-07 | Siegbert Warkentin | System and methods for assessment of the aging brain and its brain disease induced brain dysfunctions by speech analysis |
| JP5229083B2 (ja) * | 2009-04-14 | 2013-07-03 | Sony Corporation | Information processing device, information processing method, and program |
| US20100292545A1 (en) * | 2009-05-14 | 2010-11-18 | Advanced Brain Monitoring, Inc. | Interactive psychophysiological profiler method and system |
| US20110066068A1 (en) * | 2009-09-16 | 2011-03-17 | Duffy Charles J | Method and system for quantitative assessment of functional impairment |
| JP2011083403A (ja) | 2009-10-15 | 2011-04-28 | Hokkaido Univ | Cognitive function evaluation system |
| US20130120282A1 (en) * | 2010-05-28 | 2013-05-16 | Tim Kukulski | System and Method for Evaluating Gesture Usability |
| CN103270740B (zh) * | 2010-12-27 | 2016-09-14 | Fujitsu Limited | Voice control device, voice control method, and mobile terminal device |
| JP2012217797A (ja) | 2011-04-14 | 2012-11-12 | Kumamoto Univ | Memory retention evaluation method and memory retention evaluation system |
| WO2012165602A1 (ja) * | 2011-05-31 | 2012-12-06 | Nagoya Institute of Technology | Cognitive dysfunction determination device, cognitive dysfunction determination system, and program |
| IN2015DN02508A (ja) * | 2012-08-31 | 2015-09-11 | Nat Univ Corp Tokyo Med & Dent | |
| US10448868B2 (en) * | 2013-03-08 | 2019-10-22 | The Regents Of The University Of California | Systems and methods for monitoring hand and wrist movement |
- 2013-09-11 WO PCT/JP2013/074582 patent/WO2015037089A1/ja not_active Ceased
- 2013-09-11 US US14/893,338 patent/US10478114B2/en active Active
- 2013-09-11 JP JP2015536360A patent/JP6122130B2/ja active Active
- 2013-09-11 CN CN201380077561.5A patent/CN105407800B/zh active Active
Cited By (29)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2017515543A (ja) * | 2014-04-21 | 2017-06-15 | Hitachi Maxell, Ltd. | Brain dysfunction evaluation device |
| JP2016049282A (ja) * | 2014-08-29 | 2016-04-11 | Hitachi Maxell, Ltd. | Brain dysfunction evaluation system, brain dysfunction evaluation method, and program |
| CN106256312A (zh) * | 2015-06-16 | 2016-12-28 | Hitachi Digital Imaging (China) Co., Ltd. | Cognitive dysfunction evaluation device |
| CN106256312B (zh) * | 2015-06-16 | 2021-07-23 | Hitachi Digital Imaging (China) Co., Ltd. | Cognitive dysfunction evaluation device |
| JP2017012339A (ja) * | 2015-06-30 | 2017-01-19 | Yoshihiko Sano | Cognitive function observation system and cognitive function observation method |
| JP2017217144A (ja) * | 2016-06-06 | 2017-12-14 | Maxell Holdings, Ltd. | Finger exercise training menu generating system, method, and program |
| WO2017212719A1 (ja) * | 2016-06-06 | 2017-12-14 | Maxell, Ltd. | Finger exercise training menu generating system, method, and program |
| US11517788B2 (en) | 2016-06-06 | 2022-12-06 | Maxell, Ltd. | Finger exercise training menu generating system, method thereof, and program thereof |
| JP2019522514A (ja) * | 2016-06-07 | 2019-08-15 | Cerebral Assessment Systems, LLC | Method and system for quantitative assessment of visual motor response |
| JP2021077412A (ja) * | 2016-09-14 | 2021-05-20 | F. Hoffmann-La Roche AG | Digital biomarkers for cognitive and movement diseases or disorders |
| JP2019531569A (ja) * | 2016-09-14 | 2019-10-31 | F. Hoffmann-La Roche AG | Digital biomarkers for cognitive and movement diseases or disorders |
| JP2018051003A (ja) * | 2016-09-29 | 2018-04-05 | Maxell, Ltd. | Task execution order determination system and task execution method |
| WO2018062173A1 (ja) * | 2016-09-29 | 2018-04-05 | Maxell, Ltd. | Task execution order determination system and task execution method |
| US12079633B2 (en) | 2016-09-29 | 2024-09-03 | Maxell, Ltd. | Task execution order determination system and task execution method |
| JP2018121893A (ja) * | 2017-02-01 | 2018-08-09 | Mitsuhiro Ikeda | Information processing terminal and program for measuring brain function |
| WO2018143147A1 (ja) * | 2017-02-01 | 2018-08-09 | Mitsuhiro Ikeda | Information processing terminal and program for measuring brain function |
| WO2018168915A1 (ja) * | 2017-03-14 | 2018-09-20 | Nihon University | Score determination device and program for medical examination of biological functions |
| JP2021500183A (ja) * | 2017-10-25 | 2021-01-07 | F. Hoffmann-La Roche AG | Digital qualimetric biomarkers for cognitive and movement diseases or disorders |
| JP7280876B2 (ja) | 2017-10-25 | 2023-05-24 | F. Hoffmann-La Roche AG | Digital qualimetric biomarkers for cognitive and movement diseases or disorders |
| JP2022537197A (ja) * | 2019-06-19 | 2022-08-24 | F. Hoffmann-La Roche AG | Digital biomarker |
| JP2022537742A (ja) * | 2019-06-19 | 2022-08-29 | F. Hoffmann-La Roche AG | Digital biomarker |
| JP2022537267A (ja) * | 2019-06-19 | 2022-08-25 | F. Hoffmann-La Roche AG | Digital biomarker |
| JP7728184B2 (ja) | 2019-06-19 | 2025-08-22 | F. Hoffmann-La Roche AG | Digital biomarker |
| US12408862B2 (en) | 2019-06-19 | 2025-09-09 | Hoffmann-La Roche Inc. | Digital biomarker |
| US12507947B2 (en) | 2019-06-19 | 2025-12-30 | Hoffmann-La Roche Inc. | Digital biomarker |
| JP2022181549A (ja) * | 2021-05-26 | 2022-12-08 | Nagoya Institute of Technology | Cognitive function evaluation program, cognitive function evaluation device, cognitive function evaluation system, and cognitive function evaluation method |
| JP7493723B2 (ja) | 2021-05-26 | 2024-06-03 | Nagoya Institute of Technology | Cognitive function evaluation program, cognitive function evaluation device, cognitive function evaluation system, and cognitive function evaluation method |
| WO2024047738A1 (ja) * | 2022-08-30 | 2024-03-07 | NEC Corporation | Inspection device, inspection system, inspection method, and non-transitory computer-readable medium |
| JP7770609B1 (ja) * | 2025-08-19 | 2025-11-14 | Hyogo Prefecture Social Welfare Corporation | Cognitive function evaluation system, cognitive function evaluation method, and cognitive function evaluation program |
Also Published As
| Publication number | Publication date |
|---|---|
| CN105407800B (zh) | 2019-04-26 |
| CN105407800A (zh) | 2016-03-16 |
| US10478114B2 (en) | 2019-11-19 |
| US20160100788A1 (en) | 2016-04-14 |
| JP6122130B2 (ja) | 2017-04-26 |
| JPWO2015037089A1 (ja) | 2017-03-02 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| JP6122130B2 (ja) | Brain dysfunction evaluation method, brain dysfunction evaluation device, and program therefor | |
| JP6291107B2 (ja) | Brain dysfunction evaluation method, brain dysfunction evaluation device, and program therefor | |
| Schwenk et al. | Sensor-derived physical activity parameters can predict future falls in people with dementia | |
| KR102477327B1 (ko) | Processor-implemented system and method for measuring cognitive ability | |
| JP6289313B2 (ja) | Brain dysfunction evaluation system, brain dysfunction evaluation method, and program | |
| Dasenbrock et al. | Technology-based measurements for screening, monitoring and preventing frailty | |
| CN106256312B (zh) | Cognitive dysfunction evaluation device | |
| JPWO2018131542A1 (ja) | Cognitive function evaluation system | |
| KR20190041081A (ko) | VR-based cognitive ability evaluation system for diagnosing cognitive impairment | |
| JP6899111B2 (ja) | Server system, and method and program executed by the server system | |
| CN114727761A (zh) | Improvement of personalized healthcare for patients with movement disorders | |
| Agurto et al. | Parkinson’s disease medication state and severity assessment based on coordination during walking | |
| JP7563984B2 (ja) | Digital qualimetric biomarkers for determining information processing speed | |
| Li et al. | The role of wrist-worn technology in the management of Parkinson’s disease in daily life: A narrative review | |
| Ngo et al. | Technological evolution in the instrumentation of ataxia severity measurement | |
| CN108697389B (zh) | System and method for supporting neurological state assessment and neurological rehabilitation, in particular of cognitive and/or speech dysfunction | |
| WO2015161763A1 (zh) | Brain dysfunction evaluation device | |
| CN114423336A (zh) | Device and method for assessing Huntington's disease (HD) | |
| Camerlingo et al. | Measuring gait parameters from a single chest-worn accelerometer in healthy individuals: a validation study | |
| Bonzano et al. | An engineered glove for investigating the neural correlates of finger movements using functional magnetic resonance imaging | |
| JP2023500648A (ja) | Means and methods for assessing Huntington's disease in the pre-manifest phase | |
| Shin et al. | An objective pronator drift test application (iPronator) using handheld device | |
| Cinaz et al. | Implementation and evaluation of wearable reaction time tests | |
| CN115315217A (zh) | Cognitive dysfunction diagnosis device and cognitive dysfunction diagnosis program | |
| JP7770609B1 (ja) | Cognitive function evaluation system, cognitive function evaluation method, and cognitive function evaluation program | |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | WWE | Wipo information: entry into national phase | Ref document number: 201380077561.5; Country of ref document: CN |
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13893577; Country of ref document: EP; Kind code of ref document: A1 |
| | WWE | Wipo information: entry into national phase | Ref document number: 14893338; Country of ref document: US |
| | ENP | Entry into the national phase | Ref document number: 2015536360; Country of ref document: JP; Kind code of ref document: A |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 13893577; Country of ref document: EP; Kind code of ref document: A1 |