WO2019022102A1 - 活動支援方法、プログラム、及び活動支援システム - Google Patents
- Publication number
- WO2019022102A1 (application PCT/JP2018/027797)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- activity
- menu
- target person
- information
- subject
- Legal status
- Ceased (assumed status; Google has not performed a legal analysis and makes no representation as to its accuracy)
Classifications
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H20/00—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
- G16H20/30—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61H—PHYSICAL THERAPY APPARATUS, e.g. DEVICES FOR LOCATING OR STIMULATING REFLEX POINTS IN THE BODY; ARTIFICIAL RESPIRATION; MASSAGE; BATHING DEVICES FOR SPECIAL THERAPEUTIC OR HYGIENIC PURPOSES OR SPECIFIC PARTS OF THE BODY
- A61H1/00—Apparatus for passive exercising; Vibrating apparatus; Chiropractic devices, e.g. body impacting devices, external devices for briefly extending or aligning unbroken bones
- A61H1/02—Stretching or bending or torsioning apparatus for exercising
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/22—Social work or social welfare, e.g. community support activities or counselling services
Definitions
- the present disclosure generally relates to an activity support method, a program, and an activity support system, and more particularly, to an activity support method, a program, and an activity support system for supporting a target person's activity.
- an exercise support system that supports the exercise of a user is known, and is disclosed, for example, in Patent Document 1.
- the exercise support system described in Patent Document 1 includes a wrist device and a chest device, an imaging device, a network server, and a user terminal.
- the wrist device and the chest device acquire sensor data during the running operation of the user.
- the imaging device acquires a running image in synchronization with sensor data and the like.
- the network server processes and analyzes the sensor data and the running video, compares the user's video with that of an elite runner, and creates advice data including, for each teaching item, an index superimposed on the comparative video and advice text corresponding to the teaching item.
- the user terminal displays the advice data in a predetermined display form via the network.
- in this exercise support system, the target person is provided with advice on a running motion (activity) that the user (target person) performs spontaneously, but it is unknown whether that activity is appropriate for the target person. Therefore, this exercise support system has a problem in that it is difficult to present appropriate activities for the subject to perform.
- the present disclosure has been made in view of the above, and it is an object of the present disclosure to provide an activity support method, program, and activity support system that can easily present appropriate activities to be performed to a target person.
- the activity support method includes a generation step, an acquisition step, and a presentation step.
- the generation step is a step of generating an activity menu of the subject by one or more processors based on the input physical information.
- the acquisition step is a step of acquiring the activity menu generated in the generation step via a network.
- the presenting step is a step of presenting the activity menu acquired in the acquiring step.
- a program according to an aspect of the present disclosure is a program for causing one or more processors to execute the above-described activity support method.
- An activity support system includes a generation unit, an acquisition unit, and a presentation unit.
- the generation unit generates an activity menu of the subject by one or more processors based on the input physical information.
- the acquisition unit acquires the activity menu generated by the generation unit via a network.
- the presentation unit presents the activity menu acquired by the acquisition unit.
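The generation unit, acquisition unit, and presentation unit described above can be sketched end to end as follows. All class and function names, the menu-selection rule, and the in-memory dictionary standing in for the networked server are illustrative assumptions, not details taken from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class ActivityMenu:
    """Activity menu M1: an instruction presented to the subject."""
    name: str
    instructions: str

def generate_menu(physical_info: dict) -> ActivityMenu:
    """Generation unit: build a menu from the input physical information.
    The 10-second threshold is an assumed example, not from the disclosure."""
    if physical_info.get("single_leg_stand_s", 0) < 10:
        return ActivityMenu("balance training", "practice one-leg standing near a support")
    return ActivityMenu("squats", "3 sets of 10 squats")

def acquire_menu(server_store: dict, user_id: str) -> ActivityMenu:
    """Acquisition unit: fetch the stored menu (network modeled as a dict lookup)."""
    return server_store[user_id]

def present_menu(menu: ActivityMenu) -> str:
    """Presentation unit: render the menu as text for the display unit."""
    return f"{menu.name}: {menu.instructions}"

# End-to-end: generation at the facility, storage on the server,
# acquisition and presentation at home.
server_store = {}
server_store["user-001"] = generate_menu({"single_leg_stand_s": 6})
print(present_menu(acquire_menu(server_store, "user-001")))
```

The point of the split is that generation and presentation share no state other than what passes through the store, mirroring the facility/server/home separation in the disclosure.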
- FIG. 1 is a flowchart illustrating an example of an operation (activity support method) of an activity support system according to an embodiment of the present disclosure.
- FIG. 2 is a conceptual diagram showing an example of the operation at the facility in the above-described activity support system.
- FIG. 3 is a conceptual diagram showing an example of the operation at the user's home in the above-described activity support system.
- FIG. 4 is a block diagram showing the configuration of the above-mentioned activity support system.
- FIGS. 5A to 5C are conceptual diagrams showing an example of an activity menu presented by the presentation unit in the above-described activity support system.
- FIG. 6 is a flowchart showing another example of the operation (activity support method) of the above-described activity support system.
- FIG. 7A to FIG. 7C are conceptual diagrams showing that the target person performs rehabilitation using the facility system in the above-described activity support system.
- FIGS. 8A to 8C are conceptual diagrams showing how a plurality of subjects perform rehabilitation using the facility system in the above-mentioned activity support system.
- the activity support method is a method for supporting the activity of the target person 200.
- the “activity” in the present disclosure means the behavior of the target person 200 in daily life in general. That is, “activity” includes not only exercise of the target person 200 such as rehabilitation and training, but also actions in which the target person 200 takes in nutrition such as food, and mental activities such as club activities of the target person 200.
- the “rehabilitation” referred to in the present disclosure means physical or psychological training performed, for example, to enable a subject's independent daily life, targeting a person whose physical ability, cognitive function, or the like has declined due to aging, illness, or injury.
- the activity support method described below is a method for supporting the rehabilitation of the subject person 200.
- the activity support method includes a generation step S2, an acquisition step S4, and a presentation step S5.
- the generation step S2 is a step of generating the activity menu M1 (see FIGS. 5A to 5C) of the subject 200 by one or more processors based on the input physical information.
- the “physical information” in the present disclosure includes, for example, the age, sex, height, weight, BMI (Body Mass Index), exercise capacity, and presence or absence of a physical or mental disease of the subject 200.
- the “exercise ability” is the ability of the subject 200 to move the body (hands, feet, neck, waist, etc.), and includes, for example, grip strength and the ability to maintain a posture (e.g., the time for which a one-leg standing posture can be maintained).
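The items of physical information listed above can be captured in a small record type. This is a minimal sketch; the field names, units, and the subset of items modeled are assumptions for illustration:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PhysicalInfo:
    """Physical information of the subject, per the items listed in the disclosure."""
    age: int
    sex: str
    height_cm: float
    weight_kg: float
    grip_strength_kg: Optional[float] = None    # exercise ability: grip strength
    single_leg_stand_s: Optional[float] = None  # posture-maintenance time
    has_disease: bool = False                   # physical or mental disease

    @property
    def bmi(self) -> float:
        """BMI = weight [kg] / (height [m])^2."""
        h = self.height_cm / 100.0
        return self.weight_kg / (h * h)

info = PhysicalInfo(age=72, sex="F", height_cm=155, weight_kg=52,
                    single_leg_stand_s=8.5)
print(round(info.bmi, 1))  # → 21.6
```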
- the “activity menu” in the present disclosure is a menu presented to the target person 200 in order to instruct the target person 200 on a specific activity or the like.
- the input of the physical information is performed at a facility 1 such as a rehabilitation center (hereinafter, also referred to as a “first place”).
- the input of the physical information is performed by measuring the motion of the object person 200 as shown in FIG. 2 as an example.
- the input of physical information may be performed in the form of a question using an input device such as a keyboard or a microphone, for example.
- the input of the physical information is performed by the target person 200 without the assistance of a therapist such as a physical therapist, an occupational therapist, or a speech therapist, but may also be performed by the target person 200 with the assistance of a therapist.
- the therapist, an agent of the target person 200 such as the family of the target person 200, or the like may perform the input instead of the target person 200.
- in the generation step S2, one or more processors generate an activity menu for rehabilitation that the target person 200 can carry out at the home 4 (hereinafter also referred to as the "second place"), based on the physical information input at the facility 1.
- the activity menu M1 includes, for example, a menu for training various operations of the object person 200 required in daily life, such as a walking operation, a single-foot standing operation, an uprising operation, a standing and sitting operation, and a platform lifting operation.
- the “rising movement” in the present disclosure means the movement of the target person 200 rising from the lying state.
- the “stand-up movement” in the present disclosure means the movement of the target person 200 rising from the chair and / or the movement of sitting on the chair.
- the activity menu M1 includes, for example, a recipe of cooking for obtaining nutrients necessary to restore or maintain the health of the subject 200.
- “generation” in the present disclosure includes generating a new activity menu M1 based on physical information, and changing a part of the existing activity menu M1 based on physical information.
- “generation” includes selecting an activity menu M1 suitable for the target person 200 from a plurality of existing activity menus M1 based on physical information.
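Generation by selection, as described above, can be sketched as picking from existing menus the one best matched to the subject's evaluated ability. The ability score, the `required_ability` field, and the matching rule are all assumed examples, not specified in the disclosure:

```python
def select_menu(menus: list, physical_info: dict) -> dict:
    """'Generation' by selection: pick, from existing menus, the hardest one
    whose required ability does not exceed the subject's evaluated ability."""
    ability = physical_info["ability_score"]  # e.g. 0-100, from measurement
    feasible = [m for m in menus if m["required_ability"] <= ability]
    if not feasible:  # nothing is within reach: fall back to the easiest menu
        return min(menus, key=lambda m: m["required_ability"])
    return max(feasible, key=lambda m: m["required_ability"])

menus = [
    {"name": "seated stretching", "required_ability": 10},
    {"name": "walking practice", "required_ability": 40},
    {"name": "squats", "required_ability": 70},
]
print(select_menu(menus, {"ability_score": 55})["name"])  # → walking practice
```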
- the activity menu M1 generated in the generation step S2 is uploaded to the server 2 via the network N1 and stored in the server 2 (see FIG. 4).
- the acquisition step S4 is a step of acquiring the activity menu M1 generated in the generation step S2 via the network N1.
- the obtaining step S4 is performed at the home 4 of the target person 200, as shown in FIG.
- the acquisition step S4 is executed by the target person 200 operating the operation terminal 3 owned by the target person 200 and downloading the activity menu M1 stored in the server 2 via the network N1.
- the operation terminal 3 is, for example, a portable information terminal such as a tablet terminal or a smartphone, a personal computer (including a laptop type), a television receiver, various wearable terminals such as a watch type, or a dedicated device.
- the presenting step S5 is a step of presenting the activity menu M1 acquired in the acquiring step S4.
- the presentation step S5 is performed at the home 4 of the target person 200, as in the acquisition step S4.
- the presentation step S5 is performed by outputting the activity menu M1 downloaded to the operation terminal 3 from the operation terminal 3 by voice, for example, or by displaying it as an image (still image or moving image) on the display unit 34 of the operation terminal 3 (see FIGS. 5A to 5C).
- the generation step S2 is performed at the first location, and the acquisition step S4 and the presentation step S5 are performed at the second location remote from the first location.
- in the present embodiment, it is possible to present, at the home 4 of the target person 200, the activity menu M1 generated from the physical information that the target person 200 input at the facility 1. Therefore, the present embodiment has an advantage that it is easy to present to the target person 200 an appropriate activity menu M1, that is, an activity to be performed.
- the activity support system 100 which is a system for implementing the activity support method according to the present embodiment, will be described below with reference to FIGS.
- the activity support system 100 includes a facility system 10 provided in the facility 1, a server 2, and an operation terminal 3.
- the facility system 10, the server 2 and the operation terminal 3 are connected to each other via the network N1.
- the server 2 is described as not being a component of the activity support system 100 in the present embodiment, the server 2 may be included in the component of the activity support system 100.
- the server 2 is not an essential component, and may be omitted as appropriate.
- the facility system 10 includes a first input unit 11, a first processing unit 12, a first communication unit 13, and a first storage unit 14.
- the facility system 10 is implemented mainly with a computer system having one or more processors and a memory.
- the functions of the first input unit 11, the first processing unit 12, and the first communication unit 13 are realized by one or more processors executing appropriate programs.
- the program may be pre-recorded in a memory, or may be provided through a telecommunication line such as the Internet or in a non-transitory recording medium such as a memory card.
- the first input unit 11 is an input device for the subject person 200 to input physical information.
- the physical information input to the first input unit 11 is provided to the first processing unit 12.
- a sensor device 111 and a display device 112 are connected to the facility system 10.
- a communication method between the facility system 10 and each of the sensor device 111 and the display device 112 is, for example, bidirectional wire communication via a network such as a LAN (Local Area Network).
- the communication system between the facility system 10 and the sensor device 111 or the display device 112 is not limited to wired communication, and may be wireless communication.
- the sensor device 111 is a device that detects the position of the object person 200 in the detection space and the posture of the object person 200.
- the “detection space” referred to in the present disclosure is a space of an appropriate size defined by the sensor device 111, and the target person 200 is in the detection space when inputting the physical information to the first input unit 11.
- the sensor device 111 is installed, for example, on a wall surface 300 in a room. Since an image is projected on the wall surface 300 by the display device 112 as described later, basically, the object person 200 faces the wall surface 300 side (the sensor device 111 side).
- the sensor device 111 includes a plurality of sensors such as a camera (image sensor) and a depth sensor.
- the sensor device 111 further includes a processor or the like that performs appropriate signal processing on the outputs of the plurality of sensors.
- the sensor device 111 detects a captured image obtained by imaging the target person 200, the position of the target person 200 (including the lateral direction and the depth direction, that is, the front-rear direction of the target person 200), and the posture of the target person 200. That is, the sensor device 111 detects the position (including the center-of-gravity position) of the target person 200 in the horizontal plane. Furthermore, the sensor device 111 detects, for example, whether the target person 200 is bent forward or backward, and in what direction and by how much joints such as the back, the waist, and the knees are bent.
- the information on the position of the target person 200 in the detection space and the information on the posture of the target person 200 detected by the sensor device 111 are provided to the first input unit 11 of the facility system 10 as physical information of the target person 200. In this manner, by having the target person 200 assume a specific posture (here, one-leg standing) and checking whether the target person 200 can maintain that posture for a predetermined time, it is possible to measure whether the target person 200 has sufficient muscle strength and sufficient joint flexibility.
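The one-leg-standing measurement can be sketched as timing how long the sensor-reported center of gravity stays within a balance threshold. The sampling rate, the threshold value, and the offset representation are assumptions for illustration, not sensor specifications from the disclosure:

```python
def evaluate_one_leg_stand(samples, max_deviation=0.05, dt=0.1):
    """Return the hold time [s] of the one-leg-standing posture.

    `samples` is a time series of center-of-gravity horizontal offsets
    [m] reported by the sensor device at intervals of `dt` seconds; the
    hold ends when the offset first exceeds `max_deviation` (an assumed
    balance-loss threshold).
    """
    hold = 0.0
    for offset in samples:
        if abs(offset) > max_deviation:
            break  # posture lost: stop accumulating hold time
        hold += dt
    return hold

# 12 in-balance samples at 10 Hz, then a loss of balance.
samples = [0.01] * 12 + [0.08, 0.12]
print(round(evaluate_one_leg_stand(samples), 1))  # → 1.2
```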
- the display device 112 is, for example, a projector device that projects an image on a part (screen area 301) of the indoor wall surface 300.
- the display device 112 is attached to, for example, a ceiling surface in a room.
- the display device 112 projects an arbitrary full-color image on a screen area 301 set below the sensor device 111 on the wall surface 300.
- the display device 112 can project not only the wall surface 300 but also an image on a floor surface, a ceiling surface, a dedicated screen, or the like.
- the display device 112 is not limited to the configuration for displaying a two-dimensional video, and may display a three-dimensional video using a technique such as 3D (three dimensions) projection mapping, for example.
- the reverse video 302 and the sample video 303 of the target person 200 are displayed in the screen area 301.
- the reverse video 302 is a video obtained by horizontally reversing a video of the entire body of the subject 200 captured from the front of the subject 200 with the camera of the sensor device 111.
- the size and the display position of the reverse video 302 in the screen area 301 are adjusted so that the reverse video 302 is displayed in substantially real time and appears the same as the image (mirror image) of the target person 200 in a mirror.
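The mirror-image effect amounts to horizontally reversing each frame before projection. A minimal sketch, with a frame modeled as a list of pixel rows rather than a real video buffer:

```python
def mirror_frame(frame):
    """Horizontally reverse one video frame (a row-major list of pixel
    rows) so the projected image behaves like the subject's reflection
    in a mirror."""
    return [list(reversed(row)) for row in frame]

frame = [["L", "C", "R"],
         ["l", "c", "r"]]
print(mirror_frame(frame))  # → [['R', 'C', 'L'], ['r', 'c', 'l']]
```

With an actual image array, the same flip is a single reversed slice over the column axis; the list version above keeps the sketch dependency-free.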
- the sample image 303 is an image defining a movement (posture and the like) which becomes an example when inputting physical information.
- a stick figure indicating the correct posture in “one-leg standing” is displayed as the sample video 303 so as to be superimposed on the reverse video 302.
- the first processing unit 12 (hereinafter also referred to as the “generation unit 12”) has a function of executing the above-described generation step S2. That is, based on the physical information of the target person 200 input to the first input unit 11, the generation unit 12 generates the activity menu M1 of the target person 200 by one or more processors. In the present embodiment, the first processing unit 12 evaluates the exercise ability of the target person 200 from the difference (the size of the deviation) between the posture of the target person 200 input as physical information and the reference posture specified by reference data, and from the length of time for which the target person 200 can maintain the one-leg standing posture.
- the first processing unit 12 generates the activity menu M1 for the target person 200 by selecting the activity menu M1 from the plurality of activity menus M1 stored in the first storage unit 14 in accordance with the evaluation of the exercise ability of the target person 200.
- the first processing unit 12 has a function of executing an updating step S9 (see FIG. 6) of updating the activity menu M1 to be generated.
- the updating step S9 is a step of updating the activity menu M1 by one or more processors based on the activity result (described later) of the subject 200 acquired in the result acquiring step S6 (described later).
- the “activity result” in the present disclosure is the result of the subject person 200 executing the presented activity menu M1.
- the update step S9 will be described in detail in "(3.2) Evaluation of activity result of target person and update of activity menu" described later.
- the first communication unit 13 is a communication interface for communicating with the server 2 or the operation terminal 3 via the network N1.
- the communication method between the first communication unit 13 and the server 2 or the operation terminal 3 is bidirectional wireless communication.
- the communication method between the first communication unit 13 and the server 2 or the operation terminal 3 is not limited to wireless communication, and may be wired communication.
- the first communication unit 13 transmits the activity menu M1 generated by the first processing unit 12 to the server 2 via the network N1.
- the first storage unit 14 includes, for example, a rewritable nonvolatile memory such as an EEPROM (Electrically Erasable Programmable Read-Only Memory).
- the first storage unit 14 stores a plurality of activity menus M1 that can be selected when the first processing unit 12 generates the activity menu M1.
- the server 2 includes a second communication unit 21, a second processing unit 22, and a second storage unit 23.
- the server 2 is implemented mainly with a computer system having one or more processors and a memory.
- the functions of the second communication unit 21 and the second processing unit 22 are realized by one or more processors executing appropriate programs.
- the program may be pre-recorded in a memory, or may be provided through a telecommunication line such as the Internet or in a non-transitory recording medium such as a memory card.
- the second communication unit 21 is a communication interface for communicating with the facility system 10 or the operation terminal 3 via the network N1.
- the communication scheme between the second communication unit 21 and the facility system 10 or the operation terminal 3 is bidirectional wireless communication.
- the communication method between the second communication unit 21 and the facility system 10 or the operation terminal 3 is not limited to wireless communication, and may be wired communication.
- the second communication unit 21 communicates with the first communication unit 13 of the facility system 10 via the network N1 to receive the activity menu M1 generated by the first processing unit 12.
- the second communication unit 21 is controlled by the second processing unit 22 to transmit the activity result of the target person 200 described later and the evaluation on the activity result to the facility system 10 via the network N1.
- the second communication unit 21 is controlled by the second processing unit 22 to transmit the activity menu M1 for the target person 200 stored in the second storage unit 23 (that is, the activity menu M1 generated by the first processing unit 12) to the operation terminal 3 via the network N1.
- the second communication unit 21 communicates with the third communication unit 31 via the network N1 to receive the activity result input to the operation terminal 3.
- the second processing unit 22 has a function of executing an evaluation step S8 (see FIG. 6) for evaluating the activity result of the object person 200.
- the evaluation step S8 is a step in which one or more processors evaluate the activity of the target person 200 based on the activity menu M1 generated in the generation step S2 and the activity result of the target person 200 acquired in the result acquisition step S6 (described later).
- the evaluation step S8 will be described in detail later in “(3.2) Evaluation of activity result of target person and update of activity menu”.
- the second storage unit 23 includes, for example, an auxiliary storage device such as a hard disk drive (HDD) or a solid state drive (SSD).
- the second storage unit 23 stores the activity menu M1 received by the second communication unit 21 in association with identification information (user ID) for identifying the target person 200.
- the second storage unit 23 stores the activity result received by the second communication unit 21 in association with the identification information of the target person 200.
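The second storage unit's keying of menus and results by identification information can be sketched as follows. The class and method names are assumptions; a real server would back this with a database on the HDD or SSD rather than in-memory dictionaries:

```python
class MenuStore:
    """Sketch of the second storage unit: activity menus and activity
    results, each keyed by the subject's identification information
    (user ID)."""

    def __init__(self):
        self._menus = {}    # user ID -> latest activity menu M1
        self._results = {}  # user ID -> list of activity results

    def put_menu(self, user_id: str, menu: dict) -> None:
        self._menus[user_id] = menu

    def get_menu(self, user_id: str) -> dict:
        return self._menus[user_id]

    def add_result(self, user_id: str, result: dict) -> None:
        self._results.setdefault(user_id, []).append(result)

    def results(self, user_id: str) -> list:
        return self._results.get(user_id, [])

store = MenuStore()
store.put_menu("user-001", {"name": "squats"})
store.add_result("user-001", {"performed": True, "time_slot": "morning"})
print(store.get_menu("user-001")["name"], len(store.results("user-001")))
```

Keeping results as an append-only list per user ID is what later allows the history of presented menus and results to be referred back to.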
- the operation terminal 3 includes a third communication unit 31, a third processing unit 32, a second input unit 33, and a display unit 34.
- the operation terminal 3 is implemented mainly with a computer system having one or more processors and a memory, and is a general-purpose tablet terminal as an example. Dedicated application software is installed in the operation terminal 3, and when this application software is activated, the functions of the third communication unit 31, the third processing unit 32, the second input unit 33, and the display unit 34 are realized.
- the operation terminal 3 has a touch panel display, and the touch panel display realizes a function of receiving an operation of the target person 200 and a function of displaying information on the target person 200.
- the touch panel display is configured of, for example, a liquid crystal display or an organic EL (Electro Luminescence) display.
- the operation terminal 3 determines that an object such as a button has been operated by detecting an operation (tap, swipe, drag, etc.) on the object in the screen displayed on the touch panel display.
- the touch panel display functions as a user interface that receives an operation input from the target person 200. That is, in the present embodiment, the touch panel display of the operation terminal 3 implements the functions of the second input unit 33 and the display unit 34.
- the third communication unit 31 is a communication interface for communicating with the facility system 10 or the server 2 via the network N1.
- the communication method between the third communication unit 31 and the facility system 10 or the server 2 is bidirectional wireless communication.
- the third communication unit 31 communicates with the second communication unit 21 of the server 2 via the network N1 to receive the activity menu M1 for the target person 200 stored in the server 2.
- the third communication unit 31 (hereinafter, also referred to as “acquisition unit 31”) has a function of executing the above-described acquisition step S4. That is, the acquisition unit 31 acquires the activity menu M1 generated by the first processing unit (generation unit) 12 via the network N1.
- the third processing unit 32 requests the server 2 to transmit the activity menu M1 linked to the target person 200 by receiving the request operation by the target person 200 in the second input unit 33.
- the third communication unit 31 receives the activity menu M1 from the server 2 via the network N1.
- the target person 200 reads the QR code (registered trademark) distributed at the facility 1 using the built-in camera of the operation terminal 3, and accesses a URL (Uniform Resource Locator) included in the code.
- the third communication unit 31 receives the activity menu M1 from the server 2 via the network N1.
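The QR-code path above can be sketched as turning the URL embedded in the code into a download request for the user's menu. The URL layout (host, path, `uid` query parameter) is an assumed example; the disclosure only says the code contains a URL to access:

```python
from urllib.parse import urlparse, parse_qs

def menu_request_from_qr(url: str) -> dict:
    """Parse the URL read from the facility's QR code into a request for
    the subject's activity menu (endpoint + user ID)."""
    parts = urlparse(url)
    params = parse_qs(parts.query)
    return {
        "endpoint": f"{parts.scheme}://{parts.netloc}{parts.path}",
        "user_id": params["uid"][0],
    }

req = menu_request_from_qr("https://server.example/menu?uid=user-001")
print(req["user_id"])  # → user-001
```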
- the activity menu M1 transmitted from the server 2 to the operation terminal 3 is the one stored in the second storage unit 23 of the server 2. That is, in the present embodiment, in the acquisition step S4, the activity menu M1 is acquired from the second storage unit 23 (storage unit) that stores the activity menu M1 generated in the generation step S2. In other words, in this embodiment, the activity menu M1 generated in the generation step S2 is not acquired directly from the facility system 10, but is acquired from the server 2 in which it is temporarily stored.
- thus, the activity menu M1 is not presented to the target person 200 at the time when it is generated at the facility 1; rather, it can be presented to the target person 200 after a predetermined time has elapsed from that point. In addition, if the history of the activity menus M1 presented to the target person 200 in the past is stored in the second storage unit 23, the target person 200 can refer to the history of the activity menus M1 by requesting it from the server 2.
- the third processing unit 32 (hereinafter, also referred to as “presentation unit 32”) has a function of executing the above-described presentation step S5. That is, the presentation unit 32 presents the activity menu M1 acquired by the third communication unit (acquisition unit) 31.
- the activity menu M1 is presented to the target person 200 by being displayed on the display unit 34 of the operation terminal 3 as shown in, for example, FIGS. 5A to 5C.
- an exercise menu M11 and a cooking menu M12 are displayed on the display unit 34 as the activity menu M1.
- the exercise menu M11 is, for example, a menu of exercises, such as push-ups, squats, and one-leg standing, for training one or more specific portions of the body of the subject 200 that are to be strengthened.
- the display unit 34 displays, as an exercise menu M11, text and a figure (including a picture) for explaining the exercise method to be executed by the subject 200.
- the cooking menu M12 is, for example, a recipe for a dish, such as a salad, suited to supplementing nutrition that the subject 200 is lacking.
- the display unit 34 displays, as the cooking menu M12, text and a diagram (including a photo) for explaining the food to be prepared by the subject.
- in the example shown in FIG. 5B, the display unit 34 displays, as the exercise menu M11, a moving image explaining the exercise to be performed by the subject 200. This moving image is displayed, for example, when the target person 200 performs a specific operation on the operation terminal 3, such as touching the exercise menu M11 while the screen shown in FIG. 5A is displayed on the display unit 34. Further, in the example shown in FIG. 5C, the display unit 34 displays, as the cooking menu M12, a moving image explaining the dish to be prepared by the target person 200. This moving image is displayed, for example, when the target person 200 performs a specific operation on the operation terminal 3, such as touching the cooking menu M12 while the screen shown in FIG. 5A is displayed on the display unit 34.
- (3.1) Presentation of Activity Menu to Target Person
- the target person 200 or an agent of the target person 200 inputs physical information of the target person 200 (step S1).
- the physical information of the target person 200 is input to the first input unit 11 of the facility system 10.
- one or more processors of the first processing unit 12 generate an activity menu M1 based on the physical information of the object person 200 input to the first input unit 11 (generation step S2).
- the generated activity menu M1 is transmitted to the server 2 via the first communication unit 13 and the network N1.
- the second processing unit 22 of the server 2 receives the activity menu M1 at the second communication unit 21, the second processing unit 22 associates the received activity menu M1 with the target person 200 and stores the result in the second storage unit 23 (step S3).
- the target person 200 operates the operation terminal 3 at the home 4 and downloads the activity menu M1 from the server 2, thereby acquiring the activity menu M1 (acquisition step S4).
- the activity menu M1 acquired from the server 2 is stored in the memory of the operation terminal 3.
- the target person 200 operates the operation terminal 3 at home 4 and causes the display unit 34 to display the activity menu M1, whereby the activity menu M1 is presented to the target person 200 (presentation step S5). Therefore, the target person 200 can carry out the activity menu M1 at home 4 while looking at the activity menu M1 displayed on the display unit 34.
- in the present embodiment, it is possible to present, at the home 4 of the target person 200, the activity menu M1 generated from the physical information that the target person 200 input at the facility 1. Therefore, the present embodiment has an advantage that it is easy to present to the target person 200 an appropriate activity menu M1, that is, an activity to be performed. Moreover, in the present embodiment, the target person 200 only needs to input physical information at the facility 1, and there is an advantage that the target person 200 does not need to draw up the activity menu M1 by himself or herself.
- the present embodiment has an advantage that the target person 200 can easily carry out the activity menu M1 at a place different from the facility 1 (here, the home 4 of the target person 200). For example, even if the target person 200 carries out the activity menu M1 with the assistance of a therapist at the facility 1, the target person 200 may fail to acquire the activity menu M1 or may forget the acquired activity menu M1. In such a case, it is difficult for the target person 200 to carry out the same activity menu M1 at home 4. In addition, for example, when the facility 1 has no space in which the activity menu M1 can be implemented, or when the facility 1 is easily accessible to the public, the target person 200 cannot implement the activity menu M1 at the facility 1. In such a case, the target person 200 has no opportunity to acquire the activity menu M1 at the facility 1 in the first place.
- the target person 200 can carry out the activity menu M1 at home 4. Further, in the present embodiment, the therapist does not have to instruct the target person 200 on the activity menu M1, and there is also an advantage that the burden on the therapist can be reduced.
- the target person 200 implements the activity menu M1 while looking at the activity menu M1 displayed on the display unit 34. Then, the target person 200 operates the operation terminal 3 to input the result of performing the activity menu M1 (that is, the activity result). For example, if the activity menu M1 is the exercise menu M11, the target person 200 inputs the activity result by inputting to the operation terminal 3 a moving image capturing himself or herself performing the exercise menu M11. Furthermore, for example, if the activity menu M1 is the cooking menu M12, the target person 200 inputs the activity result by inputting to the operation terminal 3 an image of the food prepared based on the cooking menu M12. In addition, the target person 200 may input the activity result by inputting to the operation terminal 3 the fact that the activity menu M1 has been performed, the time zone in which it was performed, or the like.
- step S6 (hereinafter, also referred to as "result acquisition step S6") is a step of acquiring the activity result of the target person 200 based on the activity menu M1 generated in the generation step S2.
- step S7 (hereinafter, also referred to as "storage step S7") is a step of storing the activity result acquired in the result acquisition step S6 in the second storage unit (storage unit) 23.
- the second processing unit 22 of the server 2 evaluates the activity of the subject 200 by one or more processors based on the acquired activity result of the subject 200 (evaluation step S8).
- the second processing unit 22 compares, using, for example, an image analysis technique, the moving image in which the target person 200 performs the exercise menu M11 with a reference moving image in which a trainer performs the exercise menu M11, and thereby evaluates the implementation accuracy of the exercise menu M11 by the target person 200.
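As a hedged sketch of such an evaluation, the comparison might reduce each frame of both videos to joint angles (via a pose-estimation step, omitted here) and score the fraction of frames that stay within a tolerance of the reference; the function name, tolerance, and angle keys below are illustrative assumptions, not the patent's actual method.

```python
def implementation_accuracy(subject_frames, reference_frames, tol_deg=15.0):
    """Fraction of frames in which every joint angle of the target person
    stays within tol_deg of the trainer's reference motion."""
    n = min(len(subject_frames), len(reference_frames))
    if n == 0:
        return 0.0
    ok = sum(
        1 for s, r in zip(subject_frames, reference_frames)
        if all(abs(s[joint] - r[joint]) <= tol_deg for joint in r)
    )
    return ok / n
```

A real system would extract the joint angles from the two moving images with an image-analysis pipeline; here they are assumed to be precomputed per frame.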
- the second processing unit 22 links the evaluation result to the target person 200 and stores the result in the second storage unit 23. Further, the second processing unit 22 transmits the evaluation result to the facility system 10 via the network N1 by controlling the second communication unit 21.
- when the first processing unit 12 of the facility system 10 receives the evaluation result at the first communication unit 13, the first processing unit 12 updates the activity menu M1 by one or more processors based on the received evaluation result (in other words, the activity result of the target person 200) (update step S9). That is, even when the same physical information is input at the facility 1, the generated activity menu M1 differs before and after the update. Of course, the update may cause no change in the activity menu M1.
- the first processing unit 12 updates the activity menu M1 by performing machine learning using, for example, the evaluation results (activity results of the target persons 200). For example, it is assumed that a plurality of target persons 200 use the facility 1. In this case, the first processing unit 12 may refer to the evaluation results of the plurality of target persons 200 and preferentially present the activity menu M1 that was presented to the majority of the target persons 200. In addition, for example, when the first processing unit 12 obtains an evaluation result indicating that many target persons 200 have difficulty in performing an exercise included in the activity menu M1, the first processing unit 12 may update the activity menu M1 by removing the exercise, or by adding a lower-difficulty exercise in place of the exercise.
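One simple way to realize such an update — sketched here without any actual machine-learning library; the rule, names, and threshold are illustrative assumptions — is to replace any exercise that a majority of target persons failed to perform with a lower-difficulty alternative:

```python
def update_menu(menu, evaluations, easier_alternative, fail_threshold=0.5):
    """Update step S9 sketch: swap out exercises that most subjects failed.

    evaluations: one dict per target person, mapping exercise -> pass/fail.
    easier_alternative: exercise -> lower-difficulty replacement exercise.
    """
    updated = []
    for exercise in menu:
        results = [e[exercise] for e in evaluations if exercise in e]
        fail_rate = results.count(False) / len(results) if results else 0.0
        if fail_rate > fail_threshold and exercise in easier_alternative:
            updated.append(easier_alternative[exercise])  # lower-difficulty swap
        else:
            updated.append(exercise)
    return updated
```

After such an update, the same physical information input at the facility 1 can yield a different generated menu, as described above.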
- in the present embodiment, the degree of implementation of the activity menu M1 by the target person 200 is evaluated. Therefore, in the present embodiment, for example, the evaluation result is displayed on the display unit 34 of the operation terminal 3 in response to a request of the target person 200, and there is an advantage that presenting the evaluation result to the target person 200 can improve the motivation of the target person 200. Further, in the present embodiment, for example, the evaluation result (the activity result of the target person 200) is fed back to the facility system 10 to update the activity menu M1, so that there is an advantage that an appropriate activity menu M1 can be easily presented to the target person 200.
- the physical information is detected by the target person 200 executing the instruction menu that the facility system 10 presents to the target person 200 at the facility 1.
- that is, when the target person 200 executes the instruction menu, information on the position of the target person 200 in the detection space and information on the posture of the target person 200, which are detected by the sensor device 111, are given to the first input unit 11 as physical information.
- the “instruction menu” in the present disclosure is any menu selected from a plurality of rehabilitation menus, and is presented to the target person 200 to instruct the target person 200 on a specific training or the like.
- a plurality of rehabilitation menus are classified into a plurality of first stage menus and a plurality of second stage menus.
- Each of the plurality of first stage menus is, for example, a training menu of the motion itself that the subject person 200 performs in daily life such as a walking motion.
- Each of the plurality of second stage menus is, for example, a training menu of element operations necessary for operations performed by the subject 200 in daily life.
- the “element operation” in the present disclosure is an individual operation after decomposition when the operation performed by the target person 200 in daily life is decomposed into a plurality of elements.
- for example, the walking motion is decomposed into a plurality of element operations including a motion to bend the hip joint, a motion to stretch the hip joint, a motion to bend the knee joint, a motion to stretch the knee joint, a motion to bend the ankle, a motion to stretch the ankle, a motion to swing the arm forward, a motion to swing the arm backward, and the like.
- a plurality of element movements are defined for each part of the body and for each movement of each part.
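The decomposition described above lends itself to a simple lookup table keyed by daily-life motion and body part. The structure below merely restates the walking example as data; the dictionary layout and function name are illustrative assumptions, not an implementation from the source.

```python
# Element operations for the walking motion, keyed by body part.
ELEMENT_OPERATIONS = {
    "walking": {
        "hip joint":  ["bend hip joint", "stretch hip joint"],
        "knee joint": ["bend knee joint", "stretch knee joint"],
        "ankle":      ["bend ankle", "stretch ankle"],
        "arm":        ["swing arm forward", "swing arm backward"],
    },
}

def elements_for(daily_motion):
    """Flatten the per-part element operations for one daily-life motion."""
    parts = ELEMENT_OPERATIONS.get(daily_motion, {})
    return [op for ops in parts.values() for op in ops]
```

A second-stage menu could then train any subset of these element operations individually.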
- the facility system 10 first performs screening for asking the target person 200 a question, as shown in FIG. 7A. At this time, the facility system 10 causes the screen area 301 to display questions about the target person 200, and acquires an answer to the question that the target person 200 has input to the operation terminal 5.
- the operation terminal 5 is, for example, a tablet terminal or a smartphone, and has a function to present information to the target person 200 (display and/or voice output), a function to communicate with the facility system 10, a function to receive the operation of the target person 200, and the like.
- the facility system 10 acquires data for determining the physical ability of the subject 200 in addition to the information on the attributes of the subject 200 (hereinafter, referred to as "third information").
- the facility system 10 determines the necessity of confirmation of physical ability based on the result of the screening. For example, when it is judged as a result of the screening that there is no problem in physical ability in daily life, confirmation of physical ability is judged to be unnecessary.
- the facility system 10 executes a pre-menu selection process.
- the “pre-menu” in the present disclosure is a menu to be executed by the target person 200 prior to the instruction menu in order to select (determine) the instruction menu.
- the facility system 10 selects one of the first stage menus suitable for the subject 200 from among the plurality of first stage menus stored in the first storage unit 14 based on the result of the screening.
- the first storage unit 14 stores, in addition to the plurality of first-stage menus, a conditional expression for selecting a pre-menu from the result of the screening, and the facility system 10 selects the pre-menu according to the conditional expression.
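Such a conditional expression could be as simple as an ordered rule list evaluated against the screening answers. The rules, answer keys, and menu names below are hypothetical illustrations, not the stored conditional expression of the source.

```python
# Ordered (condition, pre-menu) rules; the first matching rule wins.
PRE_MENU_RULES = [
    (lambda s: s.get("self_reported_knee_pain", False), "seated knee extension"),
    (lambda s: s.get("falls_last_year", 0) >= 2, "supported standing"),
]

def select_pre_menu(screening, default="one foot standing"):
    """Select a first-stage menu as the pre-menu from the screening result."""
    for condition, menu in PRE_MENU_RULES:
        if condition(screening):
            return menu
    return default
```

With no matching rule, the default pre-menu ("one foot standing" in the figures) is selected.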
- the facility system 10 controls the display device 112 based on pre-information representing the selected pre-menu.
- the display device 112 displays, on the screen area 301, the pre-menu to be performed by the subject 200 and the support information to support the execution of the pre-menu by the subject 200.
- the menu "one foot standing" is selected as the pre-menu, and the reverse video 302 and the sample video 303 of the target person 200 are displayed in the screen area 301 as support information. There is.
- the sample video 303 is generated from the reference data stored in the first storage unit 14 in association with the rehabilitation menu (first stage menu) selected as the pre-menu, and is a video that defines an exemplary movement (posture etc.) in the pre-menu.
- a stick picture indicating the correct posture in the “one-foot standing” is displayed as the sample video 303 so as to be superimposed on the reverse video 302.
- the sensor device 111 detects the movement of the target person 200, and the facility system 10 executes an instruction menu selection process.
- the facility system 10 first acquires, from the sensor device 111, information (hereinafter referred to as "first information") regarding the operation of the target person 200 operating according to the pre-menu. Furthermore, the facility system 10 acquires information (hereinafter referred to as "second information") for proposing any menu selected from the plurality of rehabilitation menus as an instruction menu. In the present input example, the facility system 10 acquires, from the first storage unit 14, second information for selecting any one of the plurality of second stage menus as an instruction menu. At this time, the facility system 10 acquires, as the second information, reference data defining at least an exemplary movement in the pre-menu.
- the facility system 10 evaluates the operation of the target person 200 based on the first information and the second information, and further, the third information acquired in the screening.
- the facility system 10 digitizes the difference (magnitude of deviation) between the posture of the subject 200 and the reference posture specified by the reference data, and evaluates the operation of the subject 200 by this difference.
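The digitized difference might, for instance, be the root-mean-square distance between corresponding joint positions of the target person's pose and the reference pose specified by the reference data. This particular metric is an assumption for illustration; the source only states that the difference is digitized.

```python
import math

def posture_deviation(subject_pose, reference_pose):
    """RMS distance between corresponding 2-D joint positions.

    Each pose maps joint name -> (x, y) coordinates detected by the sensor.
    """
    squared = [
        (subject_pose[j][0] - reference_pose[j][0]) ** 2
        + (subject_pose[j][1] - reference_pose[j][1]) ** 2
        for j in reference_pose
    ]
    return math.sqrt(sum(squared) / len(squared))
```

The evaluation can then compare this value against an allowable range, with a larger value meaning a larger deviation from the exemplary posture.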
- when the menu "one foot standing" is selected as the pre-menu as shown in FIG. 7B, for example, the length of time for which the target person 200 can maintain the one-leg standing posture is also added to the evaluation of the motion of the target person 200.
- the facility system 10 determines (selects) an instruction menu based on the evaluation result of the target person 200 operating according to the pre-menu.
- the facility system 10 controls the display device 112 based on the instruction information representing the selected instruction menu.
- the display device 112 displays, on the screen area 301, an instruction menu to be executed by the subject 200 and support information for supporting the execution of the instruction menu by the subject 200.
- here, the menu "foot raising in the lateral direction" is selected as the instruction menu, and the reverse video 302 and the marker 304 are displayed in the screen area 301 as support information.
- the marker 304 is, for example, an image imitating a ball, and is displayed around the foot of the object person 200 in the reverse video 302.
- the facility system 10 can, for example, instruct the target person 200 to kick the ball represented by the marker 304, thereby prompting the target person 200 to move the foot in the lateral direction.
- in addition to the instruction information, the facility system 10 controls the display device 112 also based on effect information indicating an effect expected to be obtained by the target person 200 executing the instruction menu represented by the instruction information. Therefore, in addition to the instruction menu and the support information, the display device 112 displays, on the screen area 301, the effect information indicating the effect expected by the execution of the instruction menu.
- in the example of FIG. 7C, as an effect expected to be obtained by the target person 200 executing the instruction menu of "foot raising in the lateral direction", for example, effect information of "being able to keep the body straight" is displayed.
- the sensor device 111 detects the movement of the object person 200, and the facility system 10 controls the display device 112 based on the result information indicating the evaluation result of the object person 200.
- the display device 112 presents the evaluation result to the target person 200 in real time (immediately) by displaying the evaluation result in the screen area 301.
- for example, the facility system 10 presents the evaluation result by changing the color of the marker 304 when the difference (magnitude of deviation) between the posture of the target person 200 and the reference posture specified by the reference data exceeds an allowable range.
- the mode of presentation by the facility system 10 is not limited to this mode, and may be, for example, display of a message for the target person 200, voice output (including a warning sound), printout (printing), data transmission to a terminal such as a smartphone, or the like.
- the facility system 10 does not have to perform the above-described processes of selecting the pre-menu and the instruction menu each time, and may execute only the process of presenting the instruction menu. For example, if it is found at the time of the screening, from information such as the name of the target person 200, that the instruction menu for the target person 200 has already been selected, the facility system 10 determines that confirmation of physical ability is unnecessary and skips the processes of selecting the pre-menu and the instruction menu. As a result, the target person 200 can skip the implementation of the pre-menu as shown in FIG. 7B and directly implement the instruction menu as shown in FIG. 7C.
- when the target person 200 executes the instruction menu as described above, information regarding the position of the target person 200 in the detection space and information regarding the posture of the target person 200, which are detected by the sensor device 111, are given to the first input unit 11 as physical information.
- it is preferable that the facility system 10 performs the evaluation while excluding a specific evaluation item among the plurality of evaluation items from the evaluation targets. That is, for example, for the target person 200 having a disorder in the left knee joint, it is preferable that the facility system 10 excludes an evaluation item related to the motion of the left knee joint from the evaluation targets.
- in this case, the facility system 10 may not perform the evaluation on the evaluation item related to the motion of the left knee joint at all, or may, for example, lower the threshold on the evaluation item related to the motion of the left knee joint and then perform the evaluation.
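A sketch of this per-person exclusion or relaxation of evaluation items follows; the function name, score layout, and the multiplicative threshold relaxation are assumptions for illustration.

```python
def evaluate_items(scores, thresholds, impaired_items=(), relax_factor=None):
    """Pass/fail judgment per evaluation item.

    Items listed in impaired_items are excluded entirely when relax_factor
    is None, or judged against a lowered threshold (threshold * relax_factor)
    otherwise.
    """
    results = {}
    for item, score in scores.items():
        threshold = thresholds[item]
        if item in impaired_items:
            if relax_factor is None:
                continue               # exclude from the evaluation targets
            threshold *= relax_factor  # lowered threshold for this item
        results[item] = score >= threshold
    return results
```

For a target person with a left knee disorder, `impaired_items=("left_knee_flexion",)` would skip or relax only that item while the remaining items are evaluated normally.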
- this input example differs from the first input example in that a plurality of (here, two) target persons 200 are assumed to simultaneously perform the screening, the execution of the pre-menu, and the execution of the instruction menu. Therefore, in the following description, descriptions of points common to the first input example will be omitted. However, the facility system 10 may have the plurality of target persons 200 simultaneously perform at least the execution of the instruction menu, and at least one of the screening and the pre-menu may be performed individually for each target person 200. In this input example, in the case where the plurality of (here, two) target persons 200 are distinguished, each target person is referred to as "target person 200A" or "target person 200B".
- the facility system 10 determines the necessity of confirmation of physical ability based on the result of the screening. For example, when it is determined as a result of the screening that there is no problem in physical ability in daily life for both the target persons 200A and 200B, confirmation of the physical ability is determined to be unnecessary. On the other hand, if it is determined as a result of the screening that at least one of the target persons 200A and 200B needs confirmation of physical ability, the facility system 10 executes the pre-menu selection process.
- the facility system 10 controls the display device 112 based on pre-information representing the selected pre-menu.
- the display device 112 displays, on the screen area 301, the pre-menu to be performed by the subject 200 and the support information to support the execution of the pre-menu by the subject 200.
- here, the menu "one foot standing" is selected as the pre-menu, and the reverse videos 302A and 302B and the sample videos 303A and 303B of the target persons 200 are displayed in the screen area 301 as support information.
- the sensor device 111 detects the movement of the target person 200, and the facility system 10 executes an instruction menu selection process. Then, the facility system 10 evaluates the operation of the target person 200 based on the first information and the second information, and further, the third information acquired in the screening.
- although the evaluation in the facility system 10 is performed individually for each target person 200, the same instruction menu may be selected for the plurality of target persons 200A and 200B by comprehensively considering the evaluation results of the target persons 200A and 200B. For example, the instruction menu is selected based on the average value of the evaluation results of the plurality of target persons 200A and 200B. That is, a common instruction menu is selected for the plurality of target persons 200A and 200B.
- the evaluation result for each target person 200 may be used to adjust the size of the load applied to the target person 200 when the instruction menu is performed in the selection process of the instruction menu. In this case, the size of the load can be adjusted for each target person 200 at the start of the instruction menu.
- the facility system 10 controls the display device 112 based on the instruction information representing the selected instruction menu.
- the display device 112 displays, on the screen area 301, an instruction menu to be executed by the subject 200 and support information for supporting the execution of the instruction menu by the subject 200.
- the menu "Large foot in the horizontal direction" is selected as the instruction menu, and the reverse images 302A and 302B and the markers 304A and 304B are displayed on the screen area 301 as support information. It is displayed.
- when the markers 304A and 304B for the target person 200A and the target person 200B are not particularly distinguished, they are simply referred to as "markers 304".
- the display device 112 arranges a plurality of similar instruction menus and pieces of support information side by side in the horizontal direction in the screen area 301 so that the plurality of target persons 200A and 200B can simultaneously execute the instruction menu. That is, the screen area 301 is divided into the first area 311 and the second area 312 in the left-right direction. The display device 112 displays the instruction menu and the support information (reverse video 302A and marker 304A) for the target person 200A in the first area 311 on the left side when viewing the screen area 301 from the front, and displays the instruction menu and the support information (reverse video 302B and marker 304B) for the target person 200B in the second area 312 on the right side when viewing the screen area 301 from the front.
- the plurality of target persons 200A and 200B can simultaneously execute the instruction menu in a state where they are aligned in the left and right direction in front of the screen area 301.
- as in the first input example, in addition to the instruction information, the facility system 10 controls the display device 112 also based on the effect information indicating the effect expected to be obtained by the target person 200 executing the instruction menu represented by the instruction information. Therefore, in addition to the instruction menu and the support information, the display device 112 displays, on the screen area 301, the effect information indicating the effect expected by the execution of the instruction menu.
- the sensor device 111 detects the movement of the target person 200, and the facility system 10 evaluates the degree of achievement of the instruction menu for each target person 200.
- the facility system 10 acquires, from the sensor device 111, first information (operation information) on the operation of the target person 200 in operation according to the instruction menu.
- the first information includes information such as a heartbeat measured by a wearable sensor terminal worn by each of the target persons 200A and 200B.
- the “sensor terminal” mentioned here includes sensors such as a gyro sensor, an acceleration sensor, an activity meter, and a heart rate meter, and can measure, for example, the heartbeat of the subject 200.
- the facility system 10 acquires, from the first storage unit 14 as second information, reference data defining at least an exemplary movement in the instruction menu.
- the facility system 10 evaluates the degree of achievement of the instruction menu for each target person 200 based on the first information and the second information, and further, the third information acquired in the screening.
- the facility system 10 digitizes the difference (magnitude of the deviation) between the posture of the target person 200 and the reference posture specified in the reference data, and evaluates the achievement degree of the instruction menu by this difference.
- the menu "foot raising in the lateral direction" is selected as the instruction menu as shown in FIG. 8C, for example, the time taken for the target person 200 to raise the foot to the height of the marker 304 (reaction time)
- the length of the subject is also added to the evaluation of the motion of the subject 200.
- the facility system 10 adjusts, for each target person 200, the size of the load applied when performing the instruction menu according to the evaluation result of the achievement degree. For example, when the target person 200 exercises according to the instruction menu, the facility system 10 increases the load by increasing the amount of movement of a specific part of the body or by speeding up the movement so that the physical load on the target person 200 becomes larger.
- in this case, the facility system 10 adjusts the size of the load by adjusting, for example, the height (position) of the marker 304 displayed in the screen area 301 for each target person 200. In the example of FIG. 8C, the marker 304A for the target person 200A is displayed at a higher position than the marker 304B for the target person 200B so that the load on the target person 200A is larger than the load on the target person 200B.
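The marker-height adjustment can be sketched as a simple monotone function of the achievement evaluation; the linear form and the gain parameter are illustrative assumptions, as the source specifies only that a higher-achieving target person gets a higher marker and thus a larger load.

```python
def marker_height(base_height, achievement, gain=0.2):
    """Marker display height for one target person.

    achievement is a normalized score in [0, 1]; a higher score raises the
    marker and therefore the load, as with markers 304A and 304B.
    """
    return base_height * (1.0 + gain * achievement)
```

For two target persons with achievements 0.8 and 0.3, the first person's marker is displayed higher, so that person's physical load is larger.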
- the facility system 10 presents an evaluation result for each target person 200.
- the facility system 10 controls the display device 112 based on the result information indicating the evaluation result of the target person 200.
- the display device 112 presents the evaluation result to the target person 200 in real time (immediately) by displaying the evaluation result in the screen area 301.
- the facility system 10 determines, for example, whether the instruction menu has ended. If the instruction menu has not ended, the facility system 10 repeats the processing after presenting the instruction menu. If the instruction menu has ended, the facility system 10 ends a series of processing.
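The end-of-menu control flow just described can be expressed as a simple loop. `evaluate_round` and the round cap are placeholders standing in for the presentation, sensing, and evaluation steps above; they are not names from the source.

```python
def run_instruction_menu(evaluate_round, max_rounds=100):
    """Repeat present -> detect -> evaluate until the instruction menu ends.

    evaluate_round(i) performs one round and returns True when the
    instruction menu has ended; max_rounds is a safety cap for this sketch.
    """
    rounds = 0
    for i in range(max_rounds):
        rounds += 1
        if evaluate_round(i):  # the instruction menu has ended
            break
    return rounds
```

If the menu has not ended, processing repeats from the presentation of the instruction menu; once it has ended, the series of processing terminates.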
- as the plurality of target persons 200 execute the instruction menu, information on the position of each target person 200 in the detection space and information on the posture of each target person 200, which are detected by the sensor device 111, are given to the first input unit 11 as physical information. At this time, the physical information is given to the first input unit 11 for each target person 200.
- each target person 200 can work on the instruction menu together with another target person 200 instead of alone.
- the individual subjects 200 can easily communicate with other subjects 200 and can keep their motivation higher than working alone.
- the embodiment described above is only one of the various embodiments of the present disclosure.
- the above-mentioned embodiment can be variously changed according to design etc. as long as the object of the present disclosure can be achieved.
- the same function as the activity support method may be embodied by a (computer) program, or by a non-transitory recording medium or the like on which the program is recorded.
- a (computer) program according to one aspect is a program for causing one or more processors to execute the above-described activity support method.
- the activity support system 100 in the present disclosure includes a computer system.
- a computer system mainly includes one or more processors and memory as hardware.
- the function as the activity support system 100 in the present disclosure is realized by one or more processors executing a program recorded in the memory of the computer system.
- the program may be pre-recorded in the memory of the computer system, may be provided through a telecommunication line, or may be provided recorded in a non-transitory recording medium readable by the computer system, such as a memory card, an optical disc, or a hard disk drive.
- Each of the one or more processors of the computer system is configured with one or more electronic circuits including a semiconductor integrated circuit (IC) or a large scale integrated circuit (LSI).
- the integrated circuit such as an IC or an LSI mentioned here is called differently depending on the degree of integration, and includes an integrated circuit called a system LSI, a VLSI (very large scale integration), or a ULSI (ultra large scale integration).
- a field-programmable gate array (FPGA) that is programmed after the LSI is manufactured, or a logic device capable of reconfiguring junction relations inside the LSI or reconfiguring circuit sections inside the LSI, can also be used as a processor.
- the plurality of electronic circuits may be integrated into one chip or may be distributed to a plurality of chips.
- the plurality of chips may be integrated into one device or may be distributed to a plurality of devices.
- a computer system as referred to herein includes a microcontroller having one or more processors and one or more memories. Therefore, the microcontroller is also configured with one or more electronic circuits including a semiconductor integrated circuit or a large scale integrated circuit.
- it is not an essential configuration for the activity support system 100 that the third communication unit (acquisition unit) 31 and the third processing unit (presentation unit) 32 are integrated in one housing; these may be provided separately in a plurality of housings. Also, at least part of the functions of the activity support system 100 may be realized by, for example, a server or a cloud (cloud computing).
- although the exercise menu M11 and the cooking menu M12 are mentioned as examples of the activity menu M1, the activity menu M1 is not limited thereto. For example, the activity menu M1 may be a menu proposing participation in a circle activity for the purpose of recovery from a psychiatric disorder.
- the one or more processors generate the activity menu M1 based on the physical information input at the facility 1, but the present invention is not limited thereto.
- the one or more processors may generate the activity menu M1 based on the physical information and the supplementary information obtained by the therapist examining the subject 200.
- the supplementary information is preferably input by the therapist at the same time as the physical information is input.
- a plurality of activity menus M1 may be generated by the first processing unit (generation unit) 12 in the generation step S2.
- in this case, it is preferable that the facility system 10 includes, for example, a device such as a touch panel display that displays the plurality of activity menus M1, and a device that receives an operation to select one of the plurality of activity menus M1. The selected activity menu M1 is then uploaded to the server 2.
- the request of the target person 200 may be further input.
- in this case, the one or more processors can generate the activity menu M1 after excluding in advance any activity menu M1 that is not suitable for the request of the target person 200.
- the server 2 may acquire not only the activity menu M1 but also the physical information of the target person 200 from the facility system 10, and store the acquired activity menu M1 and physical information in the second storage unit 23.
- an evaluation of the physical information of the subject 200 may be presented in the presentation step S5. This evaluation may be performed by the facility system 10 or may be performed by the server 2.
- effect information representing an effect expected to be obtained by the target person 200 executing the activity menu M1 may also be presented.
- the second place is not limited to the home 4 of the target person 200.
- the second place may be an office of a company where the target person 200 works, a public facility such as a public hall, or a park. That is, the second place may be a place different from the facility 1 that generates the activity menu M1, in particular, a place where the target person 200 visits in daily life.
- the first place and the second place may be different areas in the same facility.
- for example, when the facility including the first place and the second place is a welfare facility for the elderly, the first place may be a common area on the first floor of the facility, and the second place may be a living area on the second floor of the facility.
- although the activity menu M1 is updated based on the activity result of the target person 200 in the update step S9, the update is not limited thereto. For example, if the target person 200 measures physical information again at the facility 1 before the update step S9 is executed, the activity menu M1 may be updated based on the activity result of the target person 200 and the latest physical information of the target person 200.
- the physical information is that of the subject 200 to whom the activity menu M1 is presented, but this is not limiting. For example, the physical information may be that of a person whom the subject 200 aims to emulate, or of a patient having the same disease as the subject 200. That is, as long as an activity menu M1 capable of supporting the activity of the target person 200 can be presented to the target person 200, the activity menu M1 may be generated based on the physical information of a person other than the target person 200.
- the facility 1 is a medical institution that performs rehabilitation, such as a rehabilitation center, but this is not limiting.
- the facility 1 may be another medical delivery facility such as a pharmacy.
- the facility 1 may be a fitness facility or a commercial facility such as a shopping mall.
- the facility system 10 may be provided in any facility.
- the sensor device 111 is not limited to a configuration having a camera and a depth sensor; instead of or in addition to these sensors, it may have, for example, a load sensor, an infrared sensor, a thermographic camera, or a radio wave (microwave) sensor.
- the sensor device 111 may include a sensor worn by the target person 200, such as a gyro sensor, an acceleration sensor, an activity meter, or a heart rate sensor. In this case, the sensor device 111 can measure exercise abilities other than the ability of the subject 200 to maintain posture, and the measured abilities can be input as physical information.
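As one hypothetical example of turning readings from a worn acceleration sensor into physical information, a simple posture-stability score could be computed from lateral acceleration samples. The metric, field name, and sample values below are invented for illustration and are not taken from the disclosure.

```python
# Illustrative only: derive a posture "sway" score from worn-sensor samples.
# Lower dispersion of lateral acceleration = steadier posture (assumed metric).
import statistics

def sway_score(accel_samples: list[float]) -> float:
    # Population standard deviation of the lateral acceleration trace.
    return statistics.pstdev(accel_samples)

samples = [0.01, -0.02, 0.03, 0.00, -0.01]   # hypothetical readings (g)
physical_info = {"sway": round(sway_score(samples), 4)}
print(physical_info)
```

Such a derived value could then be input as one element of the physical information used to generate the activity menu M1.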
- the first communication unit 13 may be configured to communicate with the server 2 or the operation terminal 3 via, for example, a relay such as a router and the network N1.
- the second communication unit 21 and the third communication unit 31 may be configured to communicate via the relay device and the network N1.
- all of the first communication unit 13, the second communication unit 21, and the third communication unit 31 may be connected to the network N1 via a mobile phone network (carrier network) provided by a communication carrier.
- the mobile telephone network includes, for example, a 3G (third generation) line, an LTE (Long Term Evolution) line, and the like.
- the third processing unit (presentation unit) 32 may display on the screen not only the activity menu M1 but also specific instructions for causing the target person 200 to execute the activity menu M1.
- for example, the third processing unit 32 may display on the display unit 34, as support information, specific instructions such as how to move the body, the posture, and the walking rhythm required for a correct walking motion.
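As a hedged illustration of such support information, measured gait parameters could be mapped to short instructions for display. The thresholds, parameter names, and message wording below are invented for this example, not specified in the disclosure.

```python
# Illustrative only: map hypothetical gait parameters to support messages
# that a presentation unit could display alongside the activity menu.
def walking_support_info(cadence_spm: float, trunk_lean_deg: float) -> list[str]:
    tips = []
    if cadence_spm < 90:            # assumed target cadence (steps/min)
        tips.append("Walk with a slightly quicker rhythm.")
    if trunk_lean_deg > 10:         # assumed upright-posture tolerance
        tips.append("Keep your upper body upright.")
    return tips or ["Good form. Keep going."]

print(walking_support_info(cadence_spm=85, trunk_lean_deg=12))
```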
- the operation terminal 3 may cooperate with an exercise device owned by the subject person 200.
- the “exercise device” in the present disclosure is, for example, a device that applies a force to at least a part of the body of the subject 200 so as to passively exercise that part of the body.
- for example, the activity menu M1 can be downloaded from the operation terminal 3 to the exercise device, displayed on a display device of the exercise device, or announced by voice guidance.
- if the exercise device has a function of measuring the exercise of the subject 200, the measurement result can be uploaded from the exercise device to the server 2 through the operation terminal 3 as the activity result of the subject 200.
- the activity menu M1 is temporarily stored in the server 2 and then transmitted to the operation terminal 3 in response to a request from the target person 200, but the present invention is not limited to this.
- the activity menu M1 may be transmitted from the facility system 10 to the operation terminal 3 via the network N1, without passing through the server 2, in response to a request from the target person 200.
- in this case, the activity menu M1 generated by the first processing unit 12 is stored in the first storage unit 14 of the facility system 10; the first storage unit 14 thus corresponds to the second storage unit 23 of the server 2.
- the activity support method includes the generation step (S2), the acquisition step (S4), and the presentation step (S5).
- the generation step (S2) is a step of generating the activity menu (M1) of the subject (200) by one or more processors based on the input physical information.
- the acquisition step (S4) is a step of acquiring the activity menu (M1) generated in the generation step (S2) via the network.
- the presenting step (S5) is a step of presenting the activity menu (M1) acquired in the acquiring step (S4).
- in the acquisition step (S4), the activity menu (M1) is acquired from a storage unit (second storage unit) (23) that stores the activity menu (M1) generated in the generation step (S2).
- the activity menu (M1) can be easily presented to the subject (200) at a timing desired by the subject (200), such as when the subject (200) is at home.
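The generate/store/acquire/present flow (steps S2, S4, S5) can be sketched in code. This is a minimal illustrative sketch only: the names (`generate_menu`, `MenuStore`), thresholds, and exercise strings are assumptions for the example and do not appear in the disclosure.

```python
# Hypothetical sketch of generation (S2), storage, acquisition (S4),
# and presentation (S5) of an activity menu.
from dataclasses import dataclass, field

@dataclass
class ActivityMenu:
    exercises: list  # e.g. ["walking practice 10 min"]

@dataclass
class MenuStore:
    """Stands in for the second storage unit (23) on the server (2)."""
    menus: dict = field(default_factory=dict)

    def save(self, subject_id: str, menu: ActivityMenu) -> None:
        self.menus[subject_id] = menu

    def fetch(self, subject_id: str) -> ActivityMenu:
        # Acquisition step (S4): the subject's terminal pulls the stored
        # menu at any desired timing, e.g. while at home.
        return self.menus[subject_id]

def generate_menu(physical_info: dict) -> ActivityMenu:
    # Generation step (S2): derive a menu from measured physical information.
    exercises = []
    if physical_info.get("balance_score", 100) < 60:
        exercises.append("standing balance drill x5")
    if physical_info.get("walk_speed_mps", 2.0) < 1.0:
        exercises.append("walking practice 10 min")
    return ActivityMenu(exercises or ["light stretching 5 min"])

store = MenuStore()
store.save("subject-200", generate_menu({"balance_score": 55, "walk_speed_mps": 0.8}))
menu = store.fetch("subject-200")   # presentation step (S5) would display this
print(menu.exercises)
```

Decoupling generation from presentation through the store is what lets the menu be presented at a place and time of the subject's choosing.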
- the activity support method further includes a result acquisition step (S6) in any of the first to third aspects.
- the result acquisition step (S6) is a step of acquiring the activity result of the object person (200) based on the activity menu (M1) generated in the generation step (S2).
- the activity support method further includes an evaluation step (S8) in the fourth aspect.
- the evaluation step (S8) is a step of evaluating the activity of the target person (200) by one or more processors based on the activity menu (M1) generated in the generation step (S2) and the activity result acquired in the result acquisition step (S6).
- the activity support method further includes a storing step (S7) in the fourth or fifth aspect.
- the storing step (S7) is a step of storing the activity result acquired in the result acquiring step (S6) in the storage unit (23) storing the activity menu (M1) generated in the generating step (S2).
- the activity support method further includes an updating step (S9) in any of the fourth to sixth aspects.
- the updating step (S9) is a step of updating the activity menu (M1) by one or more processors based on the activity result acquired in the result acquiring step (S6).
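The result-acquisition, evaluation, and update steps (S6, S8, S9) form a feedback loop, which might be sketched as follows. The achievement thresholds, rep counts, and function names are invented for illustration and are not part of the disclosure.

```python
# Illustrative feedback loop: evaluate the activity result against the
# generated menu (S8), then update the menu's load accordingly (S9).
def evaluate(planned_reps: int, completed_reps: int) -> float:
    # Evaluation step (S8): achievement ratio of result vs. plan.
    return completed_reps / planned_reps if planned_reps else 0.0

def update_menu(planned_reps: int, achievement: float) -> int:
    # Update step (S9): raise the load when the menu is consistently met,
    # lower it when it is not (thresholds are assumptions).
    if achievement >= 0.9:
        return planned_reps + 2
    if achievement < 0.5:
        return max(1, planned_reps - 2)
    return planned_reps

achievement = evaluate(10, 10)       # subject completed everything
print(update_menu(10, achievement))
```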
- the physical information input in the generation step (S2) includes the following information. That is, the physical information includes at least one of information on the position of the subject (200), detected while the subject (200) executes the instruction menu in the detection space, and information on the posture of the subject (200).
- the instruction menu is presented to the subject (200) by the following method. That is, this method acquires the first information, acquires the second information, and outputs the instruction information.
- the first information is information on the operation of the target person (200).
- the second information is information for proposing, as an instruction menu, any menu selected from among a plurality of rehabilitation menus.
- the instruction information is information representing an instruction menu selected based on at least the first information and the second information.
- the instruction menu is selected, and instruction information representing the selected instruction menu is output, so the instruction menu can be presented to the target person (200). In this manner, in this aspect, it is possible to automatically determine, based on the action of the subject (200), which rehabilitation menu the subject (200) should perform. That is, a rehabilitation menu suitable for the subject (200) can be proposed automatically, without intervention by a therapist or other person who assists the rehabilitation of the subject (200). Therefore, this rehabilitation support method has the advantage of reducing the burden on such a therapist.
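Selecting an instruction menu from the first information (the subject's observed operation) and the second information (rules proposing a menu from a set of rehabilitation menus) could be sketched as below. The rule table, field names, and menu names are hypothetical.

```python
# Illustrative only: pick an instruction menu by matching the subject's
# operation data (first information) against proposal rules (second
# information); the first matching rule wins.
def select_menu(first_info: dict, second_info: list) -> str:
    for predicate, menu in second_info:
        if predicate(first_info):
            return menu
    return "rest"   # assumed fallback when no rule matches

rules = [
    (lambda op: op["stand_up_time_s"] > 3.0, "sit-to-stand training"),
    (lambda op: op["step_count"] < 2000, "walking training"),
]
print(select_menu({"stand_up_time_s": 2.1, "step_count": 1500}, rules))
```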
- in this aspect, there are a plurality of subjects (200), and the following method is included. That is, this method outputs the instruction information so that the instruction menu is implemented for the plurality of subjects (200) simultaneously.
- in this method, the degree of achievement of the instruction menu is evaluated for each target person (200) based on operation information on the operation of each of the plurality of target persons (200) acting according to the instruction menu. Furthermore, this method adjusts, for each target person (200), the magnitude of the load applied when the instruction menu is implemented, according to the evaluated degree of achievement.
- the instruction menu can be implemented for a plurality of subjects (200) simultaneously.
- since the degree of achievement of the instruction menu is evaluated for each target person (200) based on the operation information of each of the plurality of target persons (200), one therapist or the like is not needed per target person (200). Therefore, even when the number of subjects (200) increases, the increase in the burden on the therapist or the like is suppressed.
- furthermore, the magnitude of the load applied when the instruction menu is performed is adjusted for each target person (200). Therefore, even without one therapist or the like per subject (200), effective rehabilitation can be practiced for each subject (200). Accordingly, this aspect has the advantage of reducing the burden on the therapist or the like when a plurality of subjects (200) perform rehabilitation simultaneously.
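Per-subject load adjustment in such a group session might look like the following sketch: one instruction menu is issued to all subjects, achievement is evaluated per subject, and each subject's load is adjusted individually. The thresholds and load steps are illustrative assumptions.

```python
# Illustrative only: adjust each subject's load from their per-subject
# achievement in a shared session (thresholds are assumptions).
def adjust_loads(results: dict[str, float], base_load: int) -> dict[str, int]:
    loads = {}
    for subject, achievement in results.items():
        if achievement >= 0.9:
            loads[subject] = base_load + 1   # met the menu: raise the load
        elif achievement < 0.5:
            loads[subject] = base_load - 1   # struggled: lower the load
        else:
            loads[subject] = base_load       # keep the load unchanged
    return loads

group_results = {"subject-A": 0.95, "subject-B": 0.4, "subject-C": 0.7}
print(adjust_loads(group_results, base_load=5))
```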
- a program according to an eleventh aspect is a program for causing one or more processors to execute the activity support method according to any one of the first to tenth aspects.
- the activity support system (100) includes a generation unit (first processing unit) (12), an acquisition unit (third communication unit) (31), and a presentation unit (32).
- the generation unit (12) generates an activity menu (M1) of the subject (200) by the one or more processors based on the input physical information.
- the acquisition unit (31) acquires the activity menu (M1) generated by the generation unit (12) via the network (N1).
- the presentation unit (32) presents the activity menu (M1) acquired by the acquisition unit (31).
- the methods according to the second to tenth aspects are not essential to the activity support method, and can be omitted as appropriate.
Landscapes
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Business, Economics & Management (AREA)
- Life Sciences & Earth Sciences (AREA)
- Tourism & Hospitality (AREA)
- Physical Education & Sports Medicine (AREA)
- Engineering & Computer Science (AREA)
- Epidemiology (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Marketing (AREA)
- Pain & Pain Management (AREA)
- Veterinary Medicine (AREA)
- Child & Adolescent Psychology (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Economics (AREA)
- Human Resources & Organizations (AREA)
- Rehabilitation Therapy (AREA)
- Strategic Management (AREA)
- Physics & Mathematics (AREA)
- General Business, Economics & Management (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Medical Treatment And Welfare Office Work (AREA)
- Rehabilitation Tools (AREA)
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2017144008A (JP2019024580A) | 2017-07-25 | 2017-07-25 | Rehabilitation support system, rehabilitation support method, and program |
| JP2017-144007 | 2017-07-25 | | |
| JP2017144007A (JP2019024579A) | 2017-07-25 | 2017-07-25 | Rehabilitation support system, rehabilitation support method, and program |
| JP2017-144008 | 2017-07-25 | | |
| JP2017184111A (JP2019058285A) | 2017-09-25 | 2017-09-25 | Activity support method, program, and activity support system |
| JP2017-184111 | 2017-09-25 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2019022102A1 (ja) | 2019-01-31 |
Family
ID=65040712
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2018/027797 (WO2019022102A1, Ceased) | Activity support method, program, and activity support system | 2017-07-25 | 2018-07-25 |
Country Status (2)
| Country | Link |
|---|---|
| TW (1) | TW201909058A (zh) |
| WO (1) | WO2019022102A1 (zh) |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2005018653A (ja) * | 2003-06-27 | 2005-01-20 | Nissan Motor Co Ltd | Rehabilitation menu presentation device and care service support system using the same |
| JP2006302122A (ja) * | 2005-04-22 | 2006-11-02 | Nippon Telegr & Teleph Corp <Ntt> | Exercise support system, user terminal device thereof, and exercise support program |
| WO2012168999A1 (ja) * | 2011-06-06 | 2012-12-13 | System Instruments Co., Ltd. | Training device |
| JP2014104139A (ja) * | 2012-11-27 | 2014-06-09 | Toshiba Corp | Rehabilitation information processing system, information processing apparatus, and information management apparatus |
| WO2015019477A1 (ja) * | 2013-08-08 | 2015-02-12 | Hitachi, Ltd. | Rehabilitation system and control method thereof |
| JP2017060572A (ja) * | 2015-09-24 | 2017-03-30 | Panasonic IP Management Co., Ltd. | Functional training device |
- 2018-07-25: WO application PCT/JP2018/027797 filed (published as WO2019022102A1; not active, Ceased)
- 2018-07-25: TW application TW107125750A filed (published as TW201909058A; status unknown)
Cited By (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2021246423A1 (ja) | 2020-06-01 | 2021-12-09 | 株式会社Arblet | Information processing system, server, information processing method, and program |
| JP2021190129A (ja) | 2020-06-01 | 2021-12-13 | 株式会社Arblet | Information processing system, server, information processing method, and program |
| WO2022269930A1 (ja) | 2021-06-25 | 2022-12-29 | 株式会社Cureapp | Information processing apparatus, information processing method, and information processing program |
| WO2023127870A1 (ja) | 2021-12-28 | 2023-07-06 | 株式会社Sportip | Care support device, care support program, and care support method |
| JP2023097545A (ja) | 2021-12-28 | 2023-07-10 | 株式会社Sportip | Care support device, care support program, and care support method |
| JP7344622B1 (ja) | 2022-05-09 | 2023-09-14 | 株式会社Utヘルステック | Telemedicine support system for supporting selection and execution of a rehabilitation program for orthopedic patients |
| WO2023219056A1 (ja) | 2022-05-09 | 2023-11-16 | 株式会社Utヘルステック | Telemedicine support system for supporting selection and execution of a rehabilitation program for orthopedic patients |
Also Published As
| Publication number | Publication date |
|---|---|
| TW201909058A (zh) | 2019-03-01 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019022102A1 (ja) | | Activity support method, program, and activity support system |
| AU2022201300B2 (en) | | Augmented reality therapeutic movement display and gesture analyzer |
| JP2019058285A (ja) | | Activity support method, program, and activity support system |
| JP6871379B2 (ja) | | Treatment and/or exercise guidance process management system, program for treatment and/or exercise guidance process management, computer device, and method |
| KR102099316B1 (ko) | | Augmented reality display device for healthcare and healthcare system using the same |
| JP4594157B2 (ja) | | Exercise support system, user terminal device thereof, and exercise support program |
| CN105493146A (zh) | | Apparatus, framework, and method for enabling a user to independently measure body size and figure information and to use such information in a network environment |
| JP2021051400A (ja) | | Rehabilitation support device, rehabilitation support system, and rehabilitation support method |
| US20240225558A1 (en) | | Information processing apparatus and information processing method |
| JP2021049319A (ja) | | Rehabilitation motion evaluation method and rehabilitation motion evaluation device |
| JP2021049208A (ja) | | Exercise evaluation system |
| WO2022034771A1 (ja) | | Program, method, and information processing apparatus |
| JP2019024579A (ja) | | Rehabilitation support system, rehabilitation support method, and program |
| JP7675387B2 (ja) | | Determination method, determination device, and determination system |
| JP2020081413A (ja) | | Motion detection device, motion detection system, motion detection method, and program |
| JPWO2009040947A1 (ja) | | Exercise prescription proposal device |
| JP2015035171A (ja) | | Medical information processing apparatus, program, and system |
| JPWO2019003429A1 (ja) | | Human body model display system, human body model display method, communication terminal device, and computer program |
| JP6995737B2 (ja) | | Support device |
| US20250041668A1 (en) | | Exercise menu management device, exercise management method, and computer program |
| JP2019024580A (ja) | | Rehabilitation support system, rehabilitation support method, and program |
| JP2022080824A (ja) | | Program, method, information processing apparatus, and system |
| JP7810597B2 (ja) | | Exercise system, exercise menu management method, and computer program |
| WO2024159402A1 (en) | | An activity tracking apparatus and system |
| Jiménez et al. | | Monitoring of motor function in the rehabilitation room |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18838986; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 18838986; Country of ref document: EP; Kind code of ref document: A1 |