
WO2022190187A1 - Vibration information-providing system, vibration information-providing server device, vibration information acquisition device, vibration information-providing method, program for vibration information provision and program for vibration information acquisition - Google Patents


Info

Publication number
WO2022190187A1
Authority
WO
WIPO (PCT)
Prior art keywords
information, vibration, unit, vibration information, diagnostic
Prior art date
Application number
PCT/JP2021/009158
Other languages
French (fr)
Japanese (ja)
Inventor
理絵子 鈴木 (Rieko Suzuki)
Original Assignee
株式会社ファセテラピー
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社ファセテラピー
Priority to PCT/JP2021/009158
Priority to JP2023504902A (JP7554515B2)
Publication of WO2022190187A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state

Definitions

  • The present invention relates to a vibration information providing system, a vibration information providing server device, a vibration information acquiring device, a vibration information providing method, a vibration information providing program, and a vibration information acquiring program, and particularly to technology for providing vibration information having a predetermined tactile quality.
  • A system is known in which target information such as voice information, image information, text information, heartbeat, body movement, and blood flow rate is analyzed to identify a tactile mode, and tactile content generated so as to produce a predetermined tactile sensation according to the tactile mode is supplied to a content utilization device (see, for example, Patent Document 1).
  • This makes it possible to appropriately select and provide tactile content that causes a specific tactile sensation through vibration, according to the tactile quality of the target information used by the user or of the target information related to the user's body.
  • Patent Document 1 supplies tactile content that matches the tactile quality analyzed from the target information to a content-using device, such as a beauty instrument, in-vehicle equipment, home appliance, audio equipment, virtual experience equipment, medical equipment, welfare equipment, toy, game machine or its controller, switch or button, vibration alarm, or touch panel with vibration feedback. However, it does not specifically describe how the haptic content is used in the medical field.
  • A device is also known in which a sensor detects information about the body tissue of a subject, a stimulation signal is generated based on the detection result, and vibration stimulation is applied to the tendons of the muscles of a plurality of fingers of a worker holding a tool (see, for example, Patent Document 2).
  • The tactile force sense information presentation device described in Patent Document 2 can present information tactilely, without reducing workability or operability, to a worker who holds a special tool necessary for the work and concentrates his or her vision on the work.
  • Specifically, a contact state between the head portion of the endoscope and the subject is detected by a sensor, a stimulus signal having a magnitude corresponding to the degree of contact is generated, and vibration stimulation is applied to the tendons of the muscles of a plurality of fingers of the operator holding the endoscope.
  • There are also known therapeutic devices that generate vibrations according to symptoms (see, for example, Patent Documents 3 to 5).
  • In one such device, a bone conduction speaker is applied as the vibrator of a massager, and a signal with a frequency and output matched to the symptom is sent to the bone conduction speaker to vibrate it.
  • In another, a plurality of massage modes are displayed on a massage selection screen, and a vibrator is operated based on the massage mode selected and input by the user, so as to provide a massage effect according to the symptom of stiffness.
  • The massage device described in Patent Document 5 uses an artificial intelligence unit to make a diagnosis for the user, based on data on a doctor's knowledge and know-how and on the user's voice information input through dialogue with the user, and is configured to recommend a suitable massage course based on the diagnosis result.
  • Patent Document 5 also discloses that a diagnosis is made in consideration of physical conditions such as the user's pulse rate, body weight, body fat percentage, and stress index.
  • However, the tactile force sense information presentation device described in Patent Document 2 conveys information useful for the work, in the form of vibration stimulation, to the worker performing medical treatment; it does not apply vibration stimulation to the subject undergoing treatment.
  • In the therapy devices described in Patent Documents 3 to 5, vibrations corresponding to symptoms are given to the user, but there is no mention of selecting vibrations corresponding to symptoms from the viewpoint of tactile sensation.
  • An object of the present invention is to provide a user with vibration information related to vibration that produces an appropriate tactile sensation according to the user's physical or mental state.
  • To achieve this, the present invention acquires diagnostic information about a user's physical condition or mental condition, selects, from among a plurality of pieces of vibration information about vibrations with different tactile qualities (characteristics of vibration that cause specific tactile sensations), the vibration information about vibration having the tactile quality corresponding to the diagnostic information, and provides it to the user.
  • According to the present invention, it is possible to provide the user with vibration information related to vibration that produces an appropriate tactile sensation according to the physical or mental state of the user.
  • For example, based on diagnostic information about a physical or mental state caused by a symptom, it is possible to provide vibration information about vibration that produces a specific tactile sensation chosen from the viewpoint of alleviating or improving that symptom.
  • FIG. 2 is a block diagram showing an example functional configuration of a user terminal according to the first embodiment
  • FIG. 3 is a block diagram showing a functional configuration example of a server device according to the first embodiment
  • FIG. 4 is a diagram showing an example of the screen displayed on the display by the face image display unit.
  • FIG. 4 is a diagram showing an example of classification of diagnostic information and vibration information.
  • It is a diagram for explaining two tactile parameters.
  • FIG. 11 is a diagram for explaining another example of section division;
  • FIG. 4 is a diagram showing an example of classification of diagnostic information and vibration information;
  • FIG. 4 is a diagram showing an example of classification of diagnostic information and vibration information
  • FIG. 11 is a block diagram showing an example functional configuration of a user terminal according to the second embodiment
  • FIG. 11 is a block diagram showing an example of the functional configuration of a server device according to the second embodiment
  • FIG. 12 is a block diagram showing an example of functional configuration of a user terminal according to the third embodiment
  • FIG. 11 is a block diagram showing an example of the functional configuration of a medical institution terminal according to the third embodiment
  • FIG. 11 is a block diagram showing an example of the functional configuration of a server device according to a third embodiment
  • FIG. 1 is a diagram showing an overall configuration example of a vibration information providing system according to a first embodiment.
  • The vibration information providing system according to the first embodiment includes a user terminal 100 used by a user, a medical institution terminal 200 used in a medical institution, and a server device 300 that receives access from the user terminal 100 and the medical institution terminal 200 and provides information.
  • the user terminal 100 and the server device 300 and the medical institution terminal 200 and the server device 300 are connected via a communication network 500 such as the Internet or a mobile phone network.
  • the user terminal 100 is configured by, for example, a personal computer, tablet, smartphone, or the like.
  • the medical institution terminal 200 is configured by, for example, a personal computer, a tablet, or a smart phone.
  • the user terminal 100 has a camera for capturing images (either the user terminal 100 itself is equipped with a camera or the user terminal 100 is configured to be connectable with a camera).
  • the user terminal 100 and the medical institution terminal 200 have displays for displaying images and the like.
  • In the first embodiment, the user terminal 100 functions as a vibration information acquisition device, and the server device 300 functions as a vibration information providing server device. That is, the camera of the user terminal 100 captures an image of the user's face and transmits the captured image to the server device 300.
  • the server device 300 analyzes the photographed image of the face transmitted from the user terminal 100 , selects vibration information according to the diagnosis result based on the face image, and provides the user terminal 100 with the vibration information.
  • the vibration information provided here is selected according to the diagnosis result regarding the user's physical condition or mental condition analyzed from the captured image of the face, and is vibration related to vibration having a tactile quality that causes a specific tactile sensation.
  • the vibration information is waveform information (either analog information or digital information) that serves as a vibration source for generating vibration by a vibrator or the like. Using this vibration information, the user can apply vibrations generated by a vibrator to his or her face, thereby alleviating or improving physical or mental disorders and symptoms.
  • the server device 300 also provides the user terminal 100 with diagnostic information indicating the results of diagnosis performed by the server device 300 using the captured image of the face in association with the captured image of the face transmitted from the user terminal 100 .
  • the server device 300 provides the medical institution terminal 200 with the photographed image of the face transmitted from the user terminal 100 and diagnostic information generated by the server device 300 using the photographed image.
  • the photographed image of the face and the diagnostic information are shared between the user and the medical institution.
  • the user can use the shared information to have a dialogue with the medical institution when using the vibration information to improve the disorder or symptoms, and it is also possible to receive appropriate advice from the medical institution.
  • In the user terminal 100, an application having a function of transmitting a photographed face image to the server device 300, a function of receiving vibration information and diagnostic information from the server device 300, and a function of displaying the photographed face image and diagnostic information on the display (hereinafter referred to as the vibration-related application) is installed.
  • FIG. 2 is a block diagram showing a functional configuration example of the user terminal 100 according to the first embodiment.
  • the user terminal 100 includes, as a functional configuration, a facial image capturing unit 11, a facial image recording unit 12, a facial image transmitting unit 13, a vibration information receiving unit 14, a vibration information recording unit 15, and a diagnostic information receiving unit. 16 , a diagnostic information recording unit 17 , a vibration applying unit 18 and a face image display unit 19 .
  • the user terminal 100 also includes a face image storage unit 10A and a vibration information storage unit 10B as storage media.
  • Each of the above functional blocks 11 to 19 can be configured by hardware, DSP (Digital Signal Processor), or software.
  • When configured by software, each of the functional blocks 11 to 19 is actually realized by the CPU of a computer equipped with RAM, ROM, and the like operating according to a vibration-related application program stored in a storage medium such as the RAM, ROM, a hard disk, or semiconductor memory.
  • the functions of the facial image photographing unit 11, the facial image transmitting unit 13, the vibration information receiving unit 14, and the diagnostic information receiving unit 16 are realized by the vibration information acquisition program.
  • the face image capturing unit 11 uses the camera provided in the user terminal 100 to capture the user's face. Actually, the face image capturing unit 11 captures an image including the user's face and its surrounding background.
  • the face image recording unit 12 stores the photographed image of the face obtained by the photographing by the face image photographing unit 11 in the face image storage unit 10A together with information indicating the photographing date and time. Storing the photographed image together with the photographing date/time information includes storing the photographed image in a manner in which the photographing date/time information is included as one of the attribute information attached to the image data (image file).
  • the facial image transmission unit 13 transmits to the server device 300 the photographed image of the face obtained by the photographing by the facial image photographing unit 11 .
  • the face image transmission unit 13 extracts only the user's face from the captured image including the background, and transmits the image of the extracted face to the server device 300 .
  • the photographed image of the face including the background may be transmitted to the server device 300 as it is, and the server device 300 may extract only the face of the user from the photographed image including the background.
  • the server device 300 may recognize the user's facial portion in the photographed image including the background, and perform analysis for diagnosis on the recognized facial portion.
  • a photographed image of a face with or without a background may simply be referred to as a face image.
  • the face image transmitting unit 13 transmits the captured image to the server device 300.
  • Alternatively, after the face image captured by the face image capturing unit 11 has been stored in the face image storage unit 10A by the face image recording unit 12, the face image transmission unit 13 may read the photographed image from the face image storage unit 10A and transmit it to the server device 300 at an arbitrary timing designated by the user.
  • the facial image transmission unit 13 transmits the photographed image of the user's face to the server device 300 together with the photographing date/time information and the user ID.
  • the user ID is identification information set for the vibration-related application when installing the vibration-related application in the user terminal 100, for example.
  • the shooting date/time information is used as information for specifying the user's face image.
  • a user ID is used as information for identifying a user.
  • the vibration information receiving unit 14 receives vibration information provided from the server device 300 .
  • the details of the function of server device 300 to provide vibration information will be described later.
  • the vibration information recording unit 15 stores the vibration information received by the vibration information receiving unit 14 in the vibration information storage unit 10B.
  • the diagnostic information receiving unit 16 receives diagnostic information provided from the server device 300 in association with the photographed image of the face. As will be described later, the association between the diagnostic information and the photographed image of the face is performed based on the user ID and photographing date and time information. The details of the function of the server device 300 to provide diagnostic information will be described later.
  • the diagnostic information recording unit 17 stores the diagnostic information received by the diagnostic information receiving unit 16 in the face image storage unit 10A in association with the photographed image of the face to be diagnosed.
  • the photographed image of the face to be diagnosed is specified by the photographing date and time information.
  • the vibration applying unit 18 performs processing for applying vibration to the user's face by supplying the vibration information stored in the vibration information storage unit 10B to a vibrator for generating vibration.
  • The vibrator may be provided in the user terminal 100 itself, or may be provided in a vibration supplier connected to the user terminal 100 by wire or wirelessly.
  • the vibration supplier may be a sheet or mask that is put on the user's face, or a pad that is put on the user's face.
  • The vibration supplier may also be a speaker unit that reproduces low frequencies, such as a woofer; instead of applying vibrations directly to the user, low-frequency vibrations may be emitted into the space where the user is present.
  • When the vibrator is provided in the user terminal 100, the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator by operating a user interface for instructing vibration generation provided by the vibration-related application, and the user terminal 100 is brought into contact with the user's face while generating vibration.
  • When the vibrator is provided in a vibration supplier, the user operates the user interface for instructing vibration generation while the vibration supplier is in contact with the user's face, so that the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator and the vibration supplier vibrates.
  • When the vibration supplier is a speaker unit, the speaker unit is installed at an arbitrary place in the space, and the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator, thereby emitting vibration into the space.
  • the facial image display unit 19 causes the display to display the user's facial image stored in the facial image storage unit 10A and diagnostic information associated with the facial image.
  • FIG. 4 is a diagram showing an example of a screen displayed on the display by the face image display unit 19. As shown in FIG. 4, the face image display unit 19 treats a photographed image of the face, information indicating the photographing date and time, and the diagnostic information acquired from the server device 300 as one piece of set information, and lists one or more pieces of set information on the screen.
  • the user captures an image of the face on a certain date and time and transmits the captured image to the server device 300 .
  • the user terminal 100 acquires from the server device 300 vibration information and diagnosis information obtained by analyzing the captured image.
  • the photographed image of the face obtained at this time, photographing date and time information, and diagnostic information constitute one set of information, which are stored in the face image storage unit 10A in association with each other.
  • Using the vibration information acquired at this time, the user causes the vibrator to generate vibration and applies it to the face.
  • the user takes an image of the face on another date and time and transmits the taken image to the server device 300 .
  • the user terminal 100 acquires from the server device 300 vibration information and diagnosis information obtained by analyzing the captured image.
  • the photographed image of the face obtained at this time, photographing date and time information, and diagnostic information are separate set information, and are stored in the face image storage unit 10A in association with each other.
  • the user applies vibration to the face using the vibration information acquired at this time.
  • One or more pieces of set information sequentially accumulated in the face image storage unit 10A in this manner are displayed by the face image display unit 19 as a list as shown in FIG.
  • This list display allows the user to confirm changes in the facial image and diagnostic information resulting from the application of vibration.
  • FIG. 3 is a block diagram showing a functional configuration example of the server device 300 according to the first embodiment.
  • The server device 300 of the present embodiment includes, as functional configurations, a diagnostic information acquisition unit 31, a vibration information acquisition unit 32, a vibration information providing unit 33, a diagnostic information providing unit 34, and a medical examination information providing unit 35. The diagnostic information acquisition unit 31 has a face image acquisition unit 31a and an image analysis unit 31b as more specific functional configurations.
  • the server device 300 also includes a face image storage section 30A and a vibration information storage section 30B as storage media.
  • Each of the functional blocks 31 to 35 can be configured by hardware, DSP, or software.
  • When configured by software, each of the functional blocks 31 to 35 is actually realized by the CPU of a computer equipped with RAM, ROM, and the like operating according to a vibration information providing program stored in a storage medium such as the RAM, ROM, a hard disk, or semiconductor memory.
  • the diagnostic information acquisition unit 31 acquires diagnostic information regarding the user's physical condition or mental condition. Specifically, the diagnostic information acquisition unit 31 acquires diagnostic information by the face image acquisition unit 31a and the image analysis unit 31b.
  • the facial image acquiring unit 31a acquires the photographed image of the user's face transmitted by the facial image transmitting unit 13 of the user terminal 100, and stores it in the facial image storage unit 30A together with the photographing date/time information and the user ID.
  • The image analysis unit 31b acquires diagnostic information by analyzing the physical state or mental state of the user based on the photographed image of the face acquired by the face image acquisition unit 31a, and stores it in the face image storage unit 30A in association with the photographed image of the face (its photographing date and time information) and the user ID.
  • The image analysis unit 31b analyzes the photographed image of the face acquired by the face image acquisition unit 31a, and acquires, as diagnostic information, hardness information indicating the degree of hardness of the facial expression and distortion information indicating the degree of distortion of the face.
  • the hardness information indicating the degree of hardness of facial expressions is information that classifies the degree of hardness of facial expressions into a plurality of levels.
  • the degrees of facial expression hardness are classified into two levels.
  • the facial expression hardness information is information indicating whether the facial expression is "hard” or "soft.” Note that the number of classification levels is not limited to two.
  • the distortion information indicating the degree of facial distortion is information that classifies the degree of facial distortion into a plurality of levels.
  • the degree of facial distortion is classified into two levels.
  • the facial distortion information is information indicating whether the distortion is "large” or "small.” Note that the number of classification levels is not limited to two.
  • It is known that when a person is tense, tired, uncomfortable, worried, or stressed, the facial muscles become stiff and the facial expression becomes hard. Therefore, the hardness of the facial expression can be used as information suggesting the degree of a person's tension, fatigue, discomfort, worry, and stress.
  • For example, it is possible to analyze the hardness of facial expressions from photographed face images using a classification model generated by machine learning. A classification model with adjusted neural network parameters can be generated by performing machine learning using labeled face images as learning data.
  • The learning data used here are photographed images of a person's face when the person feels tense, tired, uncomfortable, worried, or stressed, and photographed images of the same person's face when the person is free of tension, fatigue, discomfort, worry, and stress, each provided with label information indicating whether the facial expression is "hard" or "soft".
  • Note that the form of the generated classification model is not limited to a neural network model; for example, a regression model, a tree model, a Bayesian model, a clustering model, or the like may also be used.
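The classification step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: feature extraction from face photographs (landmarks, embeddings, and so on) is assumed and replaced by synthetic data, and a simple nearest-centroid model stands in for the machine-learned classification model.

```python
# Sketch of a binary "hard"/"soft" expression classifier. Synthetic
# feature vectors replace features extracted from labeled face photos.
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for features of the two labeled groups of training images.
X_hard = rng.normal(loc=1.0, scale=0.5, size=(50, 8))   # "hard" faces
X_soft = rng.normal(loc=-1.0, scale=0.5, size=(50, 8))  # "soft" faces

# "Training": one centroid per class (any model form would do here).
centroids = {"hard": X_hard.mean(axis=0), "soft": X_soft.mean(axis=0)}

def classify_expression(features):
    """Return the label of the nearest class centroid."""
    return min(centroids, key=lambda k: np.linalg.norm(features - centroids[k]))

print(classify_expression(np.ones(8)))  # features on the "hard" side
```

A real system would replace the centroid rule with any of the model forms the text lists (neural network, regression, tree, Bayesian, or clustering model).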
  • the image analysis unit 31b acquires distortion information indicating the degree of distortion of the face by analyzing the left-right symmetry of the face.
  • For example, the image analysis unit 31b sets a vertical virtual line passing through the center point of the nose, analyzes the amount of deviation between each part of the face on the left side of the virtual line and the corresponding part on the right side, and determines that the distortion is "large" if the magnitude of the deviation is greater than or equal to a threshold, and "small" if it is less than the threshold.
  • The magnitude of the deviation of each part can be detected by generating a line-symmetrical mirror image of the facial image on the left side of the virtual line, superimposing this mirror image on the facial image on the right side of the virtual line, and measuring the difference between corresponding pixel positions.
  • For example, the amount of misalignment is detected and scored for each part such as the eyebrows, eyes, lips, and cheeks, and a total score is calculated by weighting and adding the scores obtained for the respective parts.
  • The distortion of the face may then be classified as either "large" or "small" depending on whether the total score is equal to or greater than the threshold.
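The mirror-image comparison described above can be sketched as follows. It is a simplified illustration: a small grayscale array stands in for a face image centered on the virtual line, the part-wise weighting (eyebrows, eyes, lips, cheeks) is reduced to a uniform weight map, and the threshold value is an arbitrary assumption.

```python
# Sketch of the mirror-symmetry distortion score: mirror the left half
# of the image about the vertical center line and compare it pixel-wise
# with the right half.
import numpy as np

def distortion_score(face, weights=None):
    """Weighted mean absolute difference between the mirrored left half
    and the right half of a 2-D grayscale image."""
    h, w = face.shape
    half = w // 2
    left = face[:, :half]
    right = face[:, w - half:]
    mirrored = left[:, ::-1]  # line-symmetrical mirror image of the left half
    diff = np.abs(mirrored.astype(float) - right.astype(float))
    if weights is None:
        weights = np.ones_like(diff)  # uniform stand-in for part weights
    return float((diff * weights).sum() / weights.sum())

def classify_distortion(face, threshold=10.0):  # threshold is illustrative
    return "large" if distortion_score(face) >= threshold else "small"

# A perfectly left-right symmetric test image scores zero deviation.
symmetric = np.tile(np.array([1, 2, 3, 3, 2, 1]), (4, 1)) * 10
print(classify_distortion(symmetric))  # prints "small"
```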
  • By combining the expression hardness information and the distortion information, it is possible to classify a face image into one of four types: a first type F1 with a "hard" facial expression and "large" distortion, a second type F2 with a "soft" facial expression and "large" distortion, a third type F3 with a "soft" facial expression and "small" distortion, and a fourth type F4 with a "hard" facial expression and "small" distortion.
  • The vibration information acquisition unit 32 selects and acquires, from among a plurality of pieces of vibration information related to vibrations with different tactile qualities (characteristics of vibration that cause a specific tactile sensation), the vibration information related to vibration having the tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit 31 and stored in the face image storage unit 30A. That is, the vibration information acquisition unit 32 acquires vibration information about vibration having a tactile quality corresponding to the hardness information and distortion information of the user's face analyzed by the image analysis unit 31b.
  • The vibration information is classified into a first type V1 with a "hard" and "coarse" tactile sensation, a second type V2 with a "soft" and "coarse" tactile sensation, a third type V3 with a "soft" and "smooth" tactile sensation, and a fourth type V4 with a "hard" and "smooth" tactile sensation. These types V1 to V4 are hereinafter referred to as vibration tactile sensation types.
  • For example, one piece of vibration information is stored in advance in the vibration information storage unit 30B for each of the four vibration tactile sensation types V1 to V4, in association with the hardness and roughness of the tactile sensation.
  • The vibration information is preferably low-frequency vibration information of 40 Hz or less (preferably 20 Hz or less, more preferably 10 Hz or less, still more preferably 5 Hz or less). Note that audible sound such as music may be mixed with the low-frequency vibration information.
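A low-frequency vibration waveform of the kind described (for example 5 Hz, optionally mixed with an audible component) could be synthesized as in this sketch; the sample rate, duration, and mixing gain are illustrative choices, not values from the patent.

```python
# Sketch: synthesize a low-frequency sine waveform as a vibration source,
# optionally mixing in an audible tone, then normalize to [-1, 1].
import numpy as np

def make_vibration(freq_hz=5.0, duration_s=2.0, rate=8000,
                   audible_hz=None, audible_gain=0.2):
    t = np.arange(int(duration_s * rate)) / rate
    wave = np.sin(2 * np.pi * freq_hz * t)           # low-frequency source
    if audible_hz is not None:                        # optional audible mix
        wave = wave + audible_gain * np.sin(2 * np.pi * audible_hz * t)
    return wave / np.max(np.abs(wave))                # normalize amplitude

w = make_vibration(freq_hz=5.0, audible_hz=440.0)
print(len(w))  # prints 16000 (2 s at 8000 samples/s)
```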
  • The vibration information acquisition unit 32 acquires, from the vibration information storage unit 30B, vibration information of the tactile quality corresponding to the diagnostic information, based on a correspondence relationship between the diagnosis result types F1 to F4, specified by the combination of the user's facial expression hardness information and facial distortion information, and the vibration tactile sensation types V1 to V4, specified by the combination of the hardness (softness) and roughness (smoothness) of the tactile sensation of the vibration. Details of the vibration information stored in the vibration information storage unit 30B and the operation of the vibration information acquisition unit 32 will be described later.
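The two-step selection described above can be modeled as table lookups. The mapping from hardness and distortion to diagnosis types F1 to F4 follows the classification given earlier; the correspondence from F types to vibration tactile sensation types V1 to V4 is only described later in the document, so the mapping below is purely a placeholder assumption, as are the waveform labels.

```python
# Sketch of the lookup: (hardness, distortion) -> diagnosis type F1..F4
# -> vibration tactile sensation type V1..V4 -> stored vibration info.
DIAGNOSIS_TYPES = {  # mapping given in the text
    ("hard", "large"): "F1",
    ("soft", "large"): "F2",
    ("soft", "small"): "F3",
    ("hard", "small"): "F4",
}

# Placeholder assumption, NOT the document's actual F-to-V correspondence:
DIAGNOSIS_TO_VIBRATION = {"F1": "V1", "F2": "V2", "F3": "V3", "F4": "V4"}

VIBRATION_STORE = {  # stands in for the vibration information storage unit 30B
    "V1": "waveform: hard/coarse tactile quality",
    "V2": "waveform: soft/coarse tactile quality",
    "V3": "waveform: soft/smooth tactile quality",
    "V4": "waveform: hard/smooth tactile quality",
}

def acquire_vibration_info(hardness, distortion):
    """Select vibration information matching the diagnostic information."""
    f_type = DIAGNOSIS_TYPES[(hardness, distortion)]
    return VIBRATION_STORE[DIAGNOSIS_TO_VIBRATION[f_type]]

print(acquire_vibration_info("hard", "large"))
```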
  • the vibration information providing unit 33 provides the user terminal 100 with the vibration information acquired by the vibration information acquiring unit 32 .
  • the vibration information provided by the vibration information providing unit 33 is received by the vibration information receiving unit 14 of the user terminal 100, and is stored by the vibration information recording unit 15 in the vibration information storage unit 10B.
  • The diagnostic information providing unit 34 provides the user terminal 100 with the diagnostic information acquired by the image analysis unit 31b in association with the photographed image of the face acquired by the face image acquisition unit 31a. That is, the diagnostic information providing unit 34 provides the user terminal 100 with the diagnostic information stored in the face image storage unit 30A in association with the photographed image of the face, together with the photographing date and time information capable of specifying that photographed image.
  • the diagnostic information provided by the diagnostic information providing unit 34 together with the photographing date/time information is received by the diagnostic information receiving unit 16 of the user terminal 100, and the diagnostic information recording unit 17 stores it in the face image storage unit 10A in association with the photographed face image corresponding to the photographing date/time information.
  • the medical examination information providing unit 35 provides the medical institution terminal 200, as medical examination information, with the photographed image of the user's face obtained by the face image acquisition unit 31a and the diagnostic information obtained by the image analysis unit 31b (that is, the photographed face image stored in the face image storage unit 30A and the diagnostic information associated therewith).
  • the medical institution terminal 200 accesses the server device 300, designates the user ID provided by the user, and requests acquisition of diagnostic information.
  • the medical examination information providing unit 35 generates a screen containing a list of the photographed images of the user's face and the diagnostic information stored in the face image storage unit 30A in association with the user ID, and provides it to the medical institution terminal 200.
  • a screen including information similar to the screen example shown in FIG. 4 is displayed on the display of the medical institution terminal 200 as well.
  • as described above, the plurality of pieces of vibration information with different tactile qualities stored in the vibration information storage unit 30B are classified in association with the hardness (softness) and roughness (smoothness) of the tactile sensation of the vibration. That is, each piece of vibration information stored in the vibration information storage unit 30B has its own tactile effect with respect to the hardness and roughness of the tactile sensation.
  • the tactile effect of vibration information is specified by tactile parameters inherent in the vibration information.
  • a tactile parameter is a parameter representing the degree of opposing tactile qualities (hereinafter referred to as a tactile pair) such as <hard-soft> and <rough-smooth>, and constitutes one element of tactile sensation.
  • as a tactile parameter related to the <hard-soft> tactile pair, it is possible to use the intensity of the vibration waveform (first tactile parameter h). That is, the larger the intensity, the harder the tactile sensation, and the smaller the intensity, the softer the tactile sensation.
  • as a tactile parameter related to the <rough-smooth> tactile pair, it is possible to use the length of the divided sections of the vibration waveform (second tactile parameter t). That is, the larger the value representing the length of the divided section, the smoother the tactile sensation, and the smaller the value, the rougher the tactile sensation.
  • FIG. 6 is a diagram for explaining two tactile parameters h and t, and schematically shows an envelope waveform (hereinafter referred to as a vibration waveform) of vibration information.
  • the vibration waveform is divided into a plurality of parts along the time axis.
  • in the example of FIG. 6, the waveform is divided at each time when the amplitude of the vibration waveform becomes a local minimum. That is, the first divided section T1 extends from the start point of the vibration waveform to the first local minimum, the second divided section T2 from the first local minimum to the second local minimum, the third divided section T3 from the second local minimum to the third local minimum, and so on, so that the vibration waveform is divided into a plurality of sections along the time axis.
  • the method of dividing the vibration waveform is not limited to the example shown in FIG.
  • for example, the vibration waveform may be divided into a plurality of sections at each time when the amplitude reaches a local maximum.
  • the vibration waveform may be divided into a plurality of sections each time the amplitude value becomes zero.
  • a plurality of characteristic points that can be distinguished from other points in the vibration waveform may be extracted, and the vibration waveform may be divided into a plurality of sections for each characteristic point.
  • for example, characteristic points F1, F2, F3, . . . may be extracted, and a plurality of divided sections T1, T2, T3, . . . may be set with each characteristic point as a boundary.
  • alternatively, when the vibration information is configured by a pattern in which the same vibration waveform is repeated multiple times, a plurality of divided sections T1, T2, T3, . . . may be set with each repetition of the waveform as one section.
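The division of the vibration waveform at local minima described above can be sketched as follows. This is an illustrative sketch only; the function name, the sampled envelope values, and the boundary handling are assumptions, not part of the specification.

```python
def split_at_local_minima(envelope):
    """Return (start, end) index pairs for divided sections T1, T2, ...,
    where each interior boundary is a local minimum of the sampled
    envelope waveform."""
    boundaries = [0]
    for i in range(1, len(envelope) - 1):
        # A local minimum: strictly below the previous sample and
        # not above the next one.
        if envelope[i - 1] > envelope[i] <= envelope[i + 1]:
            boundaries.append(i)
    boundaries.append(len(envelope) - 1)
    # Each consecutive pair of boundaries delimits one divided section.
    return list(zip(boundaries[:-1], boundaries[1:]))

# Example: a simple two-peak envelope with one local minimum at index 3
env = [0.0, 0.8, 0.3, 0.1, 0.9, 0.4]
sections = split_at_local_minima(env)  # [(0, 3), (3, 5)]
```

Dividing at local maxima or at zero crossings, also mentioned above, would only change the comparison used to detect a boundary.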
  • for each of the divided sections T1, T2, T3, . . . , representative amplitudes h1, h2, h3, . . . and time lengths t1, t2, t3, . . . are specified.
  • in the example of FIG. 6, the representative amplitudes h1, h2, h3, . . . are the differences between the larger of the local minimum values at the start and end points of the respective divided sections T1, T2, T3, . . . and the local maximum values within those sections.
  • in the first divided section T1, the difference between this larger local minimum value and the local maximum value within the section is the representative amplitude h1.
  • in the second divided section T2, the local minimum value at the start point of the section is larger than the local minimum value at the end point, so the difference between the local minimum value at the start point and the maximum value within the section becomes the representative amplitude h2.
  • in the third divided section T3, the local minimum value at the end point is larger than the local minimum value at the start point of the section, so the difference between the local minimum value at the end point and the maximum value within the section becomes the representative amplitude h3.
  • the method of specifying the representative amplitude shown here is an example, and is not limited to this.
  • for example, the difference between the smaller of the local minimum values at the start and end points of each divided section T1, T2, T3, . . . and the maximum value within that section may be specified as the representative amplitude.
  • alternatively, the positive maximum value or the negative minimum value in each divided section may be specified as the representative amplitude of the first tactile parameter h; in the latter case, the absolute value of the negative minimum value may be used as the representative amplitude.
  • the first tactile parameter h is, for example, the average value, maximum value, minimum value, or median value of the representative amplitudes h1, h2, h3, . . . specified for the respective divided sections.
  • the second tactile parameter t is, for example, the average value, maximum value, minimum value, or median value of the time lengths t1, t2, t3, . . . specified for the respective divided sections.
  • alternatively, the number of representative amplitudes h1, h2, h3, . . . exceeding a threshold value, or the ratio of that number to the total number, may be used as the first tactile parameter h, and the number of time lengths t1, t2, t3, . . . exceeding a threshold value, or the ratio of that number to the total number, may be used as the second tactile parameter t.
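Aggregating the per-section values into the two tactile parameters might look like the following sketch, which uses the average value, one of the options named above; the function and variable names, and the sample values, are assumptions for illustration.

```python
def tactile_parameters(rep_amplitudes, section_lengths):
    """Compute the first tactile parameter h (hardness) and the second
    tactile parameter t (roughness) as averages of the per-section
    representative amplitudes h1, h2, ... and time lengths t1, t2, ..."""
    h = sum(rep_amplitudes) / len(rep_amplitudes)
    t = sum(section_lengths) / len(section_lengths)
    return h, t

# Example with three divided sections
h, t = tactile_parameters([0.7, 0.5, 0.6], [0.10, 0.20, 0.30])
# h = 0.6 (average amplitude), t = 0.2 (average section length, seconds)
```

Substituting `max`, `min`, or a median for the average gives the other aggregation options mentioned above.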
  • as described above, the vibration information has the first tactile parameter h (intensity of the vibration waveform) representing the hardness of the tactile sensation of the vibration and the second tactile parameter t (length of the divided sections) representing the roughness of the tactile sensation, and can be classified into any of four vibration tactile sensation types V1 to V4 by the combination of the first tactile parameter h and the second tactile parameter t.
  • that is, the vibration information is classified into four types: a first type V1 in which the first tactile parameter h is equal to or greater than a first threshold Th1 and the reciprocal (1/t) of the second tactile parameter t is equal to or greater than a second threshold Th2; a second type V2 in which the first tactile parameter h is less than the first threshold Th1 and the reciprocal of the second tactile parameter t is equal to or greater than the second threshold Th2; a third type V3 in which the first tactile parameter h is less than the first threshold Th1 and the reciprocal of the second tactile parameter t is less than the second threshold Th2; and a fourth type V4 in which the first tactile parameter h is equal to or greater than the first threshold Th1 and the reciprocal of the second tactile parameter t is less than the second threshold Th2.
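The four-way classification by the two thresholds Th1 and Th2 can be sketched as follows; the specification does not fix concrete threshold values, so the defaults here are placeholders only.

```python
def classify_vibration(h, t, th1=0.5, th2=5.0):
    """Classify vibration information into one of the four vibration
    tactile sensation types V1..V4 from the first tactile parameter h
    and the reciprocal 1/t of the second tactile parameter t.
    th1 and th2 are assumed placeholder thresholds."""
    recip = 1.0 / t
    if h >= th1 and recip >= th2:
        return "V1"  # hard and rough (strong waveform, short sections)
    if h < th1 and recip >= th2:
        return "V2"  # soft and rough
    if h < th1 and recip < th2:
        return "V3"  # soft and smooth (weak waveform, long sections)
    return "V4"      # hard and smooth
```

For instance, a strong waveform with short divided sections (`h=0.8, t=0.1`) falls into V1, while a weak waveform with long sections (`h=0.2, t=1.0`) falls into V3.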
  • the vibration information acquisition unit 32 acquires, from the vibration information storage unit 30B, vibration information of the tactile quality corresponding to the diagnostic information of the user's face, based on the correspondence between the diagnosis result types F1 to F4 specified by the combination of the facial expression hardness information and the facial distortion information of the user and the vibration tactile sensation types V1 to V4 specified by the combination of the hardness and roughness of the tactile sensation of the vibration information.
  • specifically, the vibration information acquisition unit 32 acquires from the vibration information storage unit 30B vibration information belonging to the vibration tactile sensation types V1 to V4 associated with the diagnosis result types F1 to F4 so that both the hardness of the facial expression and the facial distortion of the user approach a moderate state (the state of the position indicated by Fc in FIG. 5A).
  • the diagnosis result types F1 to F4 and the vibration tactile sensation types V1 to V4 are associated as <F1-V3>, <F2-V4>, <F3-V1> and <F4-V2>.
  • this means that vibration information of the third type V3 on the opposite pole side is provided when the diagnostic information is of the first type F1, vibration information of the fourth type V4 on the opposite pole side is provided when the diagnostic information is of the second type F2, vibration information of the first type V1 on the opposite pole side is provided when the diagnostic information is of the third type F3, and vibration information of the second type V2 on the opposite pole side is provided when the diagnostic information is of the fourth type F4.
  • for example, when the diagnostic information is of the first type F1, vibration information of the third type V3, that is, vibration information having a "soft" and "smooth" tactile sensation, is provided to the user. This vibration information acts to soften the hard expression of the user's face and to smooth and reduce the distortion of the user's face. When the user actually applies this vibration to the face, the state of the face belonging to the first type F1 is expected to approach the moderate state Fc.
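The fixed correspondence <F1-V3>, <F2-V4>, <F3-V1>, <F4-V2> described above amounts to a simple lookup, sketched below; the dictionary and function names are assumptions, not identifiers from the specification.

```python
# Opposite-pole correspondence between diagnosis result types and
# vibration tactile sensation types, as described above.
DIAGNOSIS_TO_VIBRATION = {
    "F1": "V3",  # e.g. hard/distorted face -> "soft" and "smooth" vibration
    "F2": "V4",
    "F3": "V1",
    "F4": "V2",
}

def select_vibration_type(diagnosis_type):
    """Return the vibration tactile sensation type associated with the
    given diagnosis result type."""
    return DIAGNOSIS_TO_VIBRATION[diagnosis_type]
```

The vibration information acquisition unit 32 would then fetch, from the vibration information storage unit 30B, a piece of vibration information belonging to the returned type.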
  • as described above, in the first embodiment, hardness information indicating the degree of hardness of the user's facial expression and distortion information indicating the degree of facial distortion are acquired as diagnostic information regarding the user's physical or mental state, and vibration information having a tactile quality corresponding to the diagnostic information is acquired from among a plurality of pieces of vibration information having different tactile qualities and provided to the user.
  • this makes it possible to provide the user with vibration information regarding vibrations that produce an appropriate tactile sensation according to the user's physical or mental state. For example, for a user with a certain symptom, vibration information that produces an effective tactile sensation from the viewpoint of alleviating or improving the symptom can be provided according to diagnostic information on the physical or mental condition that appears on the face due to the symptom.
  • the first embodiment provides a mechanism for sharing the photographed image of the user's face and diagnostic information between the user and the medical institution.
  • the user can use the shared information to have a dialogue with the medical institution when using the vibration information to improve the disorder or symptom, and can also receive appropriate advice from the medical institution.
  • in the above embodiment, an example has been described in which the diagnostic information is classified into four diagnosis result types F1 to F4 from the two viewpoints of the hardness of the user's facial expression and the degree of facial distortion, and the vibration information is classified into four vibration tactile sensation types V1 to V4 from the two viewpoints of the hardness and roughness of the tactile sensation, but the present invention is not limited to this.
  • for example, as shown in FIG. 8, the diagnostic information may be classified into two diagnosis result types F1′ and F2′ from the viewpoint of the hardness of the user's facial expression, and the vibration information may be classified into two vibration tactile sensation types V1′ and V2′ from the viewpoint of the hardness of the tactile sensation.
  • further, as shown in FIG. 9, the diagnostic information may be classified into two diagnosis result types F1′′ and F2′′ from the viewpoint of the degree of distortion of the user's face, and the vibration information may be classified into two vibration tactile sensation types V1′′ and V2′′ from the viewpoint of the roughness of the tactile sensation.
  • the face image captured by the face image capturing unit 11 may be a moving image.
  • in this case, the image analysis shown in the above embodiment may be executed for each frame, and index values relating to at least one of changes in the degree of facial expression hardness and changes in the degree of facial distortion may be acquired as diagnostic information regarding the user's physical or mental state.
  • FIG. 10 is a block diagram showing a functional configuration example of the user terminal 100 according to the second embodiment.
  • the parts denoted by the same reference numerals as those shown in FIG. 2 have the same functions, and redundant description will be omitted here.
  • the user terminal 100 according to the second embodiment includes, instead of the face image photographing unit 11, the face image recording unit 12, the face image transmitting unit 13, the diagnostic information receiving unit 16, and the diagnostic information recording unit 17 shown in FIG. 2, a medical inquiry execution unit 21, an answer information recording unit 22, an answer information transmission unit 23, a diagnostic information reception unit 26, a diagnostic information recording unit 27, an answer information display unit 29, and an answer information storage unit 20A.
  • the medical inquiry execution unit 21 provides the user with medical inquiry information for performing self-diagnosis on a specific symptom, and acquires answer information indicating the answer input by the user to the medical inquiry information.
  • Particular symptoms are, for example, depression, dementia, insomnia, and the like.
  • rating scales usable for self-diagnosis include, for example, SDS (Self-rating Depression Scale), SDC (Symbol Digit Coding), EQ5D (EuroQol 5 Dimension), QIDS (Quick Inventory of Depressive Symptomatology), BDI (Beck Depression Inventory), CES-D (Center for Epidemiologic Studies Depression Scale), DSM-IV (The Diagnostic and Statistical Manual of Mental Disorders), EPDS (Edinburgh Postnatal Depression Scale), MMSE (Mini Mental State Examination), HDS-R (Revised Hasegawa's Dementia Scale), ADAS-cog (Alzheimer's Disease Assessment Scale-cognitive subscale), and the like.
  • the medical interview execution unit 21 outputs a message prompting a medical interview to the display at a predetermined timing, in accordance with a predetermined cycle appropriate to the self-assessment scale used in the interview (for example, every other day). Then, in accordance with an instruction operation by the user who has seen this message, the medical interview execution unit 21 presents the interview information to the user through a predetermined interview screen, and receives as input the user's answer information for the interview information presented through the interview screen.
  • the answer information recording unit 22 stores the answer information obtained by executing the inquiry by the inquiry execution unit 21 in the answer information storage unit 20A together with the information indicating the date and time of the inquiry. Storing the answer information together with the implementation date/time information includes storing the answer information in a manner in which the implementation date/time information is included as one of the attribute information accompanying the text data (text file) of the answer.
  • the response information transmission unit 23 transmits the response information acquired by the medical inquiry execution unit 21 to the server device 300 together with the date and time information of the medical inquiry and the user ID.
  • the diagnostic information receiving unit 26 receives the diagnostic information provided from the server device 300 in association with the answer information of the inquiry. Diagnosis information and inquiry response information are associated based on the user ID and inquiry date and time information.
  • the diagnostic information recording unit 27 causes the diagnostic information received by the diagnostic information receiving unit 26 to be stored in the answer information storage unit 20A in association with the answer information of the medical interview targeted for diagnosis.
  • Questionnaire response information that is the target of diagnosis is specified based on the date and time information of the inquiry.
  • the answer information display unit 29 causes the display to display the inquiry answer information stored in the answer information storage unit 20A and diagnostic information (symptom degree information described later) associated with the answer information.
  • FIG. 12 is a diagram showing an example of a screen displayed on the display by the answer information display unit 29. As shown in FIG. 12, the answer information display unit 29 lists on the screen one or more sets of information, each set including the answer information of a medical inquiry, information indicating the date and time of the inquiry, and the diagnostic information acquired from the server device 300.
  • FIG. 11 is a block diagram showing a functional configuration example of the server device 300 according to the second embodiment.
  • the same reference numerals as those shown in FIG. 3 have the same functions, and redundant description will be omitted here.
  • the server device 300 according to the second embodiment includes, instead of the diagnostic information acquiring unit 31, the vibration information acquiring unit 32, the diagnostic information providing unit 34, the medical examination information providing unit 35, the face image storage unit 30A, and the vibration information storage unit 30B shown in FIG. 3, a diagnostic information acquisition unit 41, a vibration information acquisition unit 42, a diagnostic information provision unit 44, a medical examination information provision unit 45, an answer information storage unit 40A, and a vibration information storage unit 40B.
  • the server device 300 according to the second embodiment further includes an answer information acquisition unit 46 .
  • the response information acquisition unit 46 acquires the response information transmitted by the response information transmission unit 23 of the user terminal 100, and stores it in the response information storage unit 40A together with the date and time information of the medical interview and the user ID.
  • the diagnostic information acquisition unit 41 analyzes the answer information acquired by the answer information acquisition unit 46, acquires, as diagnostic information, symptom degree information indicating the user's self-diagnosis result regarding the degree of a specific symptom related to the user's physical or mental condition, and stores it in the answer information storage unit 40A in association with the answer information of the inquiry and the user ID.
  • the diagnostic information acquisition unit 41 calculates the score of the self-assessment scale based on the answer information, and classifies the degree of symptoms into a plurality of levels according to the score.
  • the degree of symptoms is classified into two levels of "large” and “small” depending on whether the score of the self-assessment scale is equal to or higher than the threshold.
  • Information indicating any level classified in this way is symptom degree information.
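The two-level classification by a score threshold described above can be sketched as follows; the threshold value of 40 is an assumed placeholder, not a value from the specification, since the appropriate cutoff depends on the self-assessment scale used.

```python
def symptom_level(score, threshold=40):
    """Reduce a self-assessment-scale score to the two-level symptom
    degree information described above: "large" when the score is at or
    above the threshold, otherwise "small". The threshold is assumed."""
    return "large" if score >= threshold else "small"

# Example: a score of 50 is classified as a "large" degree of symptoms
level = symptom_level(50)
```

Classification into more than two levels would simply use a sorted list of cutoffs instead of a single threshold.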
  • the vibration information acquisition unit 42 acquires, from among a plurality of pieces of vibration information related to vibrations having different tactile qualities, vibration information about vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit 41 (the diagnostic information stored in the answer information storage unit 40A in association with the inquiry answer information). That is, the vibration information acquisition unit 42 acquires vibration information regarding vibration having a tactile quality corresponding to the symptom degree information acquired by the diagnostic information acquisition unit 41.
  • the plurality of pieces of vibration information related to vibrations with different tactile qualities are classified in association with the hardness (softness) or roughness (smoothness) of the tactile sensation of the vibrations.
  • the vibration information is pre-stored in the vibration information storage unit 40B in a manner classified into a first type V1′ with a “hard” tactile sensation and a second type V2′ with a “soft” tactile sensation.
  • alternatively, the vibration information is stored in advance in the vibration information storage unit 40B in a manner classified into a first type V1′′ with a "rough" tactile sensation and a second type V2′′ with a "smooth" tactile sensation.
  • when the symptom degree information acquired by the diagnostic information acquisition unit 41 indicates that the degree of the symptom is "large", the vibration information acquisition unit 42 acquires the vibration information of the first type V1′ from the vibration information storage unit 40B in the case of FIG. 8(b), and acquires the vibration information of the first type V1′′ in the case of FIG. 9(b). When the symptom degree information indicates that the degree of the symptom is "small", the vibration information acquisition unit 42 acquires the vibration information of the second type V2′ from the vibration information storage unit 40B in the case of FIG. 8(b), and acquires the vibration information of the second type V2′′ in the case of FIG. 9(b).
  • the diagnostic information providing unit 44 provides the user terminal 100 with the diagnostic information (symptom degree information) acquired by the diagnostic information acquiring unit 41 in association with the answer information acquired by the answer information acquiring unit 46. That is, the diagnostic information providing unit 44 provides the user terminal 100 with the diagnostic information stored in the answer information storage unit 40A in association with the answer information of the medical inquiry, together with the inquiry implementation date/time information that can specify the answer information.
  • the diagnostic information provided by the diagnostic information providing unit 44 together with the implementation date/time information is received by the diagnostic information receiving unit 26 of the user terminal 100, and the diagnostic information recording unit 27 stores it in the answer information storage unit 20A in association with the answer information corresponding to the implementation date/time information.
  • the medical examination information providing unit 45 provides, for example in response to a request from the medical institution terminal 200, the answer information acquired by the answer information acquisition unit 46 and the diagnostic information acquired by the diagnostic information acquisition unit 41 (that is, the answer information stored in the answer information storage unit 40A and the symptom degree information associated therewith) to the medical institution terminal 200 as medical examination information.
  • as described above, in the second embodiment, symptom degree information indicating a self-diagnosis result based on a rating scale corresponding to a specific symptom is acquired as diagnostic information about the user's physical or mental condition, and vibration information having a tactile quality corresponding to the symptom degree information is acquired from among a plurality of pieces of vibration information having different tactile qualities and provided to the user.
  • according to the second embodiment configured in this way, it is possible to provide the user with appropriate vibration information having a tactile quality corresponding to the degree of symptoms (result of self-diagnosis) related to the user's physical or mental condition.
  • for example, for a user with a certain symptom, vibration information that produces an effective tactile sensation from the viewpoint of alleviating or improving the symptom can be provided according to diagnostic information regarding the degree of the symptom.
  • the second embodiment provides a mechanism for sharing the answer information and diagnosis information of the medical interview between the user and the medical institution.
  • the user can use the shared information to have a dialogue with the medical institution when using the vibration information to improve the disorder or symptom, and can also receive appropriate advice from the medical institution.
  • FIG. 13 is a block diagram showing a functional configuration example of the user terminal 100 according to the third embodiment.
  • the components denoted by the same reference numerals as those shown in FIG. 10 have the same functions, and redundant description will be omitted here.
  • FIG. 14 is a block diagram showing a functional configuration example of the server device 300 according to the third embodiment.
  • the same reference numerals as those shown in FIG. 11 have the same functions, and redundant description will be omitted here.
  • FIG. 15 is a block diagram showing a functional configuration example of the medical institution terminal 200 according to the third embodiment.
  • the functional configuration of the medical institution terminal 200 according to the third embodiment includes a medical examination information input section 71, a medical examination information recording section 72, a medical examination information transmitting section 73, a diagnostic information receiving section 76, and a diagnostic information recording section 77. and an examination information display section 79 .
  • the medical institution terminal 200 according to the third embodiment includes a medical examination information storage unit 70A as a storage medium.
  • the medical examination information input unit 71 inputs medical examination information indicating the results of medical examinations performed on the user by a doctor of a medical institution regarding a specific symptom. Medical examinations performed by a doctor include medical examinations by interviewing the user, medical examinations by observing the inside or outside of the user's body using medical equipment, and medical examinations by measuring the user's biometric information using measuring equipment.
  • the medical examination information input unit 71 acquires medical examination information input by a doctor by operating an input device such as a keyboard. Further, the medical examination information input unit 71 may input biological information measured by a measuring device as part of the medical examination information.
  • the medical examination information recording unit 72 stores the medical examination information input by the medical examination information input unit 71 in the medical examination information storage unit 70A together with the medical examination date and time information.
  • the medical examination information transmission unit 73 transmits the medical examination information input by the medical examination information input unit 71 to the server device 300 together with the medical examination date/time information and the user ID.
  • the diagnostic information receiving unit 76 receives diagnostic information (symptom degree information described later) provided from the server device 300 in association with the medical examination information. Diagnosis information and consultation information are associated based on the user ID and consultation date/time information.
  • the diagnostic information recording section 77 stores the diagnostic information received by the diagnostic information receiving section 76 in the medical examination information storage section 70A in association with the medical examination information that is the target of the diagnosis. The examination information targeted for diagnosis is specified based on the examination date and time information.
  • the medical examination information display unit 79 displays the medical examination information stored in the medical examination information storage unit 70A and the diagnostic information (symptom degree information) associated with the medical examination information on the display.
  • the screen displayed on the display by the examination information display unit 79 is, for example, a screen listing one or more sets of information, each set including the medical examination information, the examination date/time information, and the diagnostic information.
  • the diagnosis information may be added to the chart screen displaying the consultation information and the consultation date/time information.
  • the server device 300 according to the third embodiment includes, instead of the answer information acquiring unit 46, the diagnostic information acquiring unit 41, the diagnostic information providing unit 44, the medical examination information providing unit 45, and the answer information storage unit 40A shown in FIG. 11, a medical examination information acquisition unit 66, a diagnostic information acquisition unit 61, a diagnostic information provision unit 64, a diagnosis result information provision unit 65, and a medical examination information storage unit 60A.
  • the medical examination information acquisition unit 66 acquires the medical examination information transmitted by the medical examination information transmission unit 73 of the medical institution terminal 200, and stores it in the medical examination information storage unit 60A together with the examination date/time information and the user ID.
  • the diagnostic information acquisition unit 61 analyzes the medical examination information acquired by the medical examination information acquisition unit 66, acquires symptom degree information indicating the degree of a specific symptom related to the user's physical or mental condition as diagnostic information, and stores it in the medical examination information storage unit 60A in association with the medical examination information and the user ID.
  • the diagnostic information acquisition unit 61 classifies the degree of symptoms into a plurality of levels based on the medical examination information, and acquires information indicating one of the classified levels as symptom degree information.
  • the degree of symptoms is classified into two levels of "large” and "small".
  • Classification according to the degree of symptoms based on medical examination information can be performed, for example, using a classification model generated by machine learning. For example, by performing machine learning using medical examination information obtained from multiple people as learning data, a classification in which neural network parameters are adjusted so that symptom degree information is output when medical examination information is input. It is possible to generate a model. Instead of the neural network model, it is also possible to generate a classification model in any form of a regression model, a tree model, a Bayesian model, a clustering model, and the like.
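As a minimal stand-in for such a machine-learned classification model, the following sketch fits a one-feature decision stump to pairs of an examination feature value and a symptom level. The feature, the sample values, and the learning rule are assumptions for illustration only; an actual implementation could use any of the model forms named above (neural network, regression, tree, Bayesian, clustering).

```python
def fit_stump(samples):
    """samples: list of (feature_value, level) pairs with level in
    {"large", "small"}. Returns a threshold midway between the two
    class means -- a toy stand-in for a trained classification model."""
    large = [x for x, y in samples if y == "large"]
    small = [x for x, y in samples if y == "small"]
    return (sum(large) / len(large) + sum(small) / len(small)) / 2

def predict(threshold, feature_value):
    """Output symptom degree information for a new examination feature."""
    return "large" if feature_value >= threshold else "small"

# Hypothetical training data: (examination feature, symptom level)
data = [(10, "small"), (12, "small"), (30, "large"), (34, "large")]
th = fit_stump(data)  # midway between class means 11.0 and 32.0 -> 21.5
```

A library model such as a decision tree or logistic regression would replace `fit_stump`/`predict` while keeping the same fit-then-predict flow.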
  • the diagnostic information providing unit 64 provides the medical institution terminal 200 with the diagnostic information (symptom degree information) acquired by the diagnostic information acquiring unit 61 in association with the medical examination information acquired by the medical examination information acquiring unit 66 . That is, the diagnostic information providing unit 64 provides the medical institution terminal 200 with the diagnostic information stored in the medical examination information storage unit 60A in association with the medical examination information together with the medical examination date and time information that can specify the medical examination information.
  • The diagnostic information provided by the diagnostic information providing unit 64 together with the examination date/time information is received by the diagnostic information receiving unit 76 of the medical institution terminal 200, and the diagnostic information recording unit 77 stores it in the medical examination information storage unit 70A in association with the examination information corresponding to that examination date/time information.
  • The diagnosis result information providing unit 65 provides the user terminal 100, as diagnosis result information, with the medical examination information acquired by the medical examination information acquisition unit 66 and the diagnostic information acquired by the diagnostic information acquisition unit 61 (that is, the medical examination information stored in the medical examination information storage unit 60A and the symptom degree information associated with it).
  • The user terminal 100 according to the third embodiment does not include the interview execution unit 21, the answer information recording unit 22, or the answer information transmission unit 23 shown in FIG. 10 as functional components. Further, in the user terminal 100 according to the third embodiment, instead of the diagnostic information receiving unit 26, the diagnostic information recording unit 27, the answer information display unit 29, and the answer information storage unit 20A shown in FIG. 10, a diagnosis result information receiving unit 56, a diagnosis result information recording unit 57, a diagnosis result information display unit 59, and a diagnosis result information storage unit 50A are provided.
  • The diagnosis result information receiving unit 56 receives the diagnosis result information provided from the server device 300.
  • The diagnosis result information recording unit 57 stores the diagnosis result information received by the diagnosis result information receiving unit 56 in the diagnosis result information storage unit 50A.
  • The diagnosis result information display unit 59 displays the diagnosis result information stored in the diagnosis result information storage unit 50A on the display.
  • According to the third embodiment configured as described above, it is possible to provide the user with appropriate vibration information having a tactile quality corresponding to the degree of a symptom related to the user's physical condition or mental condition (the result of a medical examination by a doctor). Further, the third embodiment provides a mechanism for sharing, between the user and the medical institution, the examination information indicating the result of the doctor's examination and the diagnostic information (symptom degree information) analyzed based on that result. As a result, when using the vibration information to alleviate a disorder or symptom, the user can use the shared information to engage in dialogue with the medical institution and can also receive appropriate advice from it.
  • In the above embodiments, the vibration waveform (the substance of the vibration) that serves as the vibration source is stored as vibration information in the vibration information storage units 30B and 40B, and is transmitted from the server device 300 to the user terminal 100; however, the present invention is not limited to this.
  • For example, the vibration information storage units 30B and 40B may store, as vibration information, vibration identification information for specifying the substance of the vibration, and this vibration identification information may be provided from the server device 300 to the user terminal 100.
  • In this case, the user terminal 100 accesses the server device 300 or another server device and downloads the substance of the vibration by making a request using the vibration identification information.
  • The vibration identification information may be, for example, a URL (Uniform Resource Locator) for accessing the download destination.
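The identification-based delivery just described might be sketched as follows. The download host, ID scheme, and helper names are hypothetical, and the HTTP request is constructed but not actually sent.

```python
# Sketch of ID-based delivery: the server provides only a vibration ID; the
# terminal resolves it to a download URL. Host and ID scheme are hypothetical.
from urllib.parse import urljoin
from urllib.request import Request

BASE_URL = "https://vibration.example.com/waveforms/"  # hypothetical host

def resolve_vibration_url(vibration_id: str) -> str:
    """Resolve a vibration identification string to its download URL."""
    return urljoin(BASE_URL, vibration_id)

def build_download_request(vibration_id: str) -> Request:
    """Build (but do not send) the HTTP request the terminal would issue."""
    return Request(resolve_vibration_url(vibration_id), method="GET")

req = build_download_request("vib-0042")
print(req.full_url)  # https://vibration.example.com/waveforms/vib-0042
```

Sending the request (and caching the downloaded waveform on the terminal) would be handled by the vibration-related application in whatever way its platform provides.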
  • In the above embodiments, one piece of vibration information is stored in the vibration information storage units 30B and 40B for each vibration tactile sensation type, and the one piece of vibration information belonging to the vibration tactile sensation type associated with the diagnosis result type is acquired from the vibration information storage units 30B and 40B; however, the present invention is not limited to this.
  • For example, a plurality of pieces of vibration information may be stored in advance in the vibration information storage units 30B and 40B for one vibration tactile sensation type, and any one of the plurality of pieces of vibration information belonging to the vibration tactile sensation type associated with the diagnosis result type may be selected and acquired from the vibration information storage units 30B and 40B. The selection in this case may be performed automatically by the vibration information acquisition units 32 and 42, or manually by the user.
  • When the vibration information acquisition units 32 and 42 select automatically, the selection may, for example, be made randomly. Alternatively, the selection may be made so that the same vibration information is not provided to the same user consecutively each time a photographed face image or answer information to a medical interview is sent from that user. For example, if the diagnostic information is repeatedly classified into the same diagnosis result type, vibration information different from the previously selected vibration information is acquired from among the plurality of pieces of vibration information belonging to the vibration tactile sensation type corresponding to that diagnosis result type.
  • When the user selects manually, the user terminal 100 is presented with vibration identification information representing the plurality of pieces of vibration information belonging to the vibration tactile sensation type associated with the diagnosis result type, and the vibration information (the substance of the vibration) corresponding to the vibration identification information selected by the user is acquired. Along with the vibration identification information, comment information describing the tactile sensation of each vibration or the like may be presented to the user.
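The automatic selection policy described above (random choice, but never the same vibration information twice in a row for the same user) could be sketched as follows; the store contents, ID scheme, and names are hypothetical.

```python
# Sketch of the selection policy: choose randomly among the candidates of the
# matched tactile type, but avoid repeating the previous choice for that user.
import random

VIBRATION_STORE = {  # vibration tactile sensation type -> candidate vibration IDs
    "soft-slow": ["vib-01", "vib-02", "vib-03"],
    "hard-fast": ["vib-10", "vib-11"],
}

_last_selected = {}  # user ID -> vibration ID provided last time

def select_vibration(user_id: str, tactile_type: str) -> str:
    """Pick one vibration ID, never the same one twice in a row per user."""
    candidates = VIBRATION_STORE[tactile_type]
    previous = _last_selected.get(user_id)
    pool = [v for v in candidates if v != previous] or candidates
    choice = random.choice(pool)
    _last_selected[user_id] = choice
    return choice

print(select_vibration("user-001", "soft-slow"))
```

The `or candidates` fallback keeps the sketch working even when a tactile type has only one candidate, in which case repetition is unavoidable.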
  • In the above embodiments, examples have been described in which the hardness information indicating the degree of facial expression hardness, the distortion information indicating the degree of facial distortion, and the symptom degree information indicating a diagnosis result (by the user or by a doctor) regarding the degree of a specific symptom are used as diagnostic information concerning the user's physical condition or mental condition; however, the present invention is not limited to this.
  • For example, deviation degree information indicating the degree of deviation of the user's measured biometric information from a standard or normal value may be used as diagnostic information, and vibration information having a tactile quality corresponding to the deviation degree information may be provided.
  • Alternatively, symptom degree information or the like may be acquired as diagnostic information by analyzing the user's physical condition or mental condition based on audio information of the user's voice.
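The deviation-degree variant described above might be sketched as follows; the normal ranges and the mapping from deviation degree to tactile type are hypothetical illustrations, not medically validated values.

```python
# Sketch of deviation-degree diagnostics: measure how far a biometric reading
# falls outside a normal range, then map that degree to a tactile type.
# Ranges and the mapping rule are hypothetical, not medically validated.

NORMAL_RANGES = {  # biometric item -> hypothetical (low, high) normal range
    "pulse_bpm": (60.0, 100.0),
    "body_temp_c": (36.0, 37.2),
}

def deviation_degree(item: str, value: float) -> float:
    """Return 0.0 inside the normal range, else the relative overshoot."""
    low, high = NORMAL_RANGES[item]
    if value < low:
        return (low - value) / low
    if value > high:
        return (value - high) / high
    return 0.0

def tactile_type_for(degree: float) -> str:
    """Hypothetical rule: a larger deviation selects a softer, slower type."""
    return "soft-slow" if degree > 0.1 else "neutral"

print(deviation_degree("pulse_bpm", 120.0))  # (120 - 100) / 100 = 0.2
print(tactile_type_for(0.2))                 # soft-slow
```

A real system would of course derive both the ranges and the degree-to-tactile mapping from the classification schemes the embodiments describe.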
  • In the above embodiments, configurations have been described in which the server device 300 includes the diagnostic information acquisition units 31 and 41, the vibration information acquisition units 32 and 42, and the vibration information providing unit 33; however, the present invention is not limited to this.
  • For example, the user terminal 100 may include the diagnostic information acquisition units 31 and 41 and transmit the diagnostic information from the user terminal 100 to the server device 300.
  • Alternatively, the user terminal 100 may be provided with all of the functional components of the server device 300.
  • In this case, the vibration information providing unit 33 may present the vibration identification information to the user, and the user terminal 100 may access the server device 300 or another server device and download the substance of the vibration by making a request using the vibration identification information.
  • In the above embodiments, configurations including the medical institution terminal 200 have been described, but the first and second embodiments may be configured without the medical institution terminal 200.
REFERENCE SIGNS LIST

Diagnosis information storage unit
10A Face image storage unit
10B Vibration information storage unit
11 Face image photographing unit
12 Face image recording unit
13 Face image transmission unit
14 Vibration information receiving unit
15 Vibration information recording unit
16 Diagnostic information receiving unit
17 Diagnostic information recording unit
18 Vibration imparting unit
19 Face image display unit
20A Answer information storage unit
21 Interview execution unit
22 Answer information recording unit
23 Answer information transmission unit
26 Diagnostic information receiving unit
27 Diagnostic information recording unit
29 Answer information display unit
30A Face image storage unit
30B Vibration information storage unit
31 Diagnostic information acquisition unit
31a Face image acquisition unit
31b Image analysis unit
32 Vibration information acquisition unit
33 Vibration information providing unit
34 Diagnostic information providing unit
35 Diagnostic information providing unit
40A Answer information storage unit
40B Vibration information storage unit
41 Diagnostic information acquisition unit
42 Vibration information acquisition unit
44 Diagnostic information providing unit
45 Diagnostic information providing unit
46 Answer information acquisition unit
50A Diagnosis result information storage unit
56 Diagnosis result information receiving unit
57 Diagnosis result information recording unit
59 Diagnosis result information display unit

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Animal Behavior & Ethology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Educational Technology (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Medical Treatment And Welfare Office Work (AREA)

Abstract

A vibration information providing system includes a diagnostic information acquisition unit 31 that acquires diagnostic information concerning the physical condition or mental state of a user; a vibration information acquisition unit 32 that acquires, from among a plurality of pieces of vibration information concerning vibrations having different tactile qualities (the properties of a vibration that cause a specific tactile sensation), vibration information concerning a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit 31; and a vibration information providing unit 33 that provides the vibration information acquired by the vibration information acquisition unit 32 to the user. Vibration information having the tactile quality corresponding to the diagnostic information concerning the user's physical condition or mental state is thus acquired from among the plurality of pieces of vibration information with different tactile qualities and provided to the user. In this way, for example, a user who has certain symptoms can be provided, according to diagnostic information concerning the physical condition or mental state caused by those symptoms, with vibration information concerning vibrations that produce a specific tactile sensation selected from the perspective of alleviating or improving the symptoms.

Description

Vibration Information Providing System, Vibration Information Providing Server Device, Vibration Information Acquiring Device, Vibration Information Providing Method, Vibration Information Providing Program, and Vibration Information Acquiring Program
TECHNICAL FIELD

The present invention relates to a vibration information providing system, a vibration information providing server device, a vibration information acquiring device, a vibration information providing method, a vibration information providing program, and a vibration information acquiring program, and particularly to a technique for providing vibration information having a predetermined tactile quality.
Conventionally, a system is known that analyzes target information such as voice information, image information, text information, heartbeat, body movement, and blood flow to identify a tactile-quality mode, and supplies vibration information corresponding to that mode (tactile content generated so as to make the user perceive a predetermined tactile sensation) to a content-using device (see, for example, Patent Document 1). According to the technique described in Patent Document 1, tactile content that produces a specific tactile sensation through vibration can be appropriately selected and provided according to the tactile quality of the target information the user is using or of target information concerning the user's body.
The technique described in Patent Document 1 supplies, to a content-using device, tactile content that matches the tactile quality analyzed from the target information. As examples of content-using devices, it lists massagers, beauty instruments, in-vehicle equipment, home appliances, audio equipment, virtual-experience devices, medical equipment, welfare equipment, toys, game machines and their controllers, switches and buttons, vibration alarms, and touch panels with vibration feedback. However, it does not specifically describe how tactile content would be used in the medical field.
On the other hand, a device is known that, while a medical worker performs work on a subject using tools such as an endoscope, scalpel, forceps, or operating handle, detects information about the subject's body tissue with a sensor, generates a stimulation signal based on the detection result, and applies vibration stimulation to the muscle tendons of each of the fingers of the worker holding the tool (see, for example, Patent Document 2).
The tactile/force-sense information presentation device described in Patent Document 2 presents information tactilely to a worker who is gripping a dedicated tool and whose vision is concentrated on the work, without degrading workability or operability. As an example, the contact state between the head of an endoscope and the subject is detected by a sensor, a stimulation signal whose magnitude corresponds to the degree of contact is generated, and vibration stimulation is applied to the muscle tendons of each of the fingers of the worker gripping the endoscope.
Therapeutic devices that generate vibrations according to symptoms are also known (see, for example, Patent Documents 3 to 5). In the sound-wave therapy device described in Patent Document 3, a bone-conduction speaker is used as the vibrator of a massager, and a signal with a frequency and output suited to the symptom is sent to the bone-conduction speaker to make it vibrate. In the massage machine described in Patent Document 4, a plurality of massage modes (vibration forms) are displayed on a massage selection screen, and a vibrator is operated based on the massage mode selected by the user, thereby providing a massage effect corresponding to the symptom of stiffness.
The massage device described in Patent Document 5 is configured so that an artificial intelligence unit makes a diagnosis suited to the user, based on the user's voice information input through dialogue and on data concerning a doctor's knowledge and know-how, and recommends a suitable massage course based on the diagnosis result. Patent Document 5 also discloses making the diagnosis in consideration of physical conditions such as the user's pulse rate, body weight, body fat percentage, and stress index.
However, the tactile/force-sense information presentation device described in Patent Document 2 conveys information useful for the work, in the form of vibration stimulation, to the worker performing the medical procedure; it does not apply vibration stimulation to the subject receiving the procedure. On the other hand, in the therapeutic devices described in Patent Documents 3 to 5, vibrations corresponding to symptoms are given to the user, but nothing is said about selecting those vibrations from the viewpoint of tactile sensation.
Patent Document 1: Japanese Patent No. 6644293
Patent Document 2: JP 2013-52046 A
Patent Document 3: JP 2012-101018 A
Patent Document 4: JP 2002-84344 A
Patent Document 5: JP 2018-183474 A
An object of the present invention is to make it possible to provide a user with vibration information concerning vibration that produces an appropriate tactile sensation according to the user's physical or mental state.
To solve the above problem, the present invention acquires diagnostic information concerning a user's physical or mental state and, from among a plurality of pieces of vibration information concerning vibrations with different tactile qualities (the properties of a vibration that cause a specific tactile sensation), acquires vibration information concerning a vibration having a tactile quality corresponding to the diagnostic information and provides it to the user.
According to the present invention configured as described above, it is possible to provide a user with vibration information concerning vibration that produces an appropriate tactile sensation according to the user's physical or mental state. For example, a user who has a certain symptom can be provided with vibration information concerning vibration that produces a specific tactile sensation selected from the viewpoint of alleviating or improving that symptom, according to diagnostic information concerning the physical or mental state caused by the symptom.
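As a rough, non-limiting illustration of the selection flow just summarized, the following sketch maps a diagnosis-result type to a vibration tactile type and then to one piece of vibration information. All names, types, and mappings here are hypothetical stand-ins, not values disclosed in this application.

```python
# Hypothetical end-to-end sketch: diagnostic information is classified into a
# diagnosis-result type, each result type is associated with a vibration
# tactile type, and one piece of vibration information of that type is chosen.

RESULT_TYPE_TO_TACTILE = {  # diagnosis-result type -> vibration tactile type
    "tension-high": "soft-slow",
    "tension-low": "hard-fast",
}

VIBRATION_INFO = {  # vibration tactile type -> vibration information (ID)
    "soft-slow": "waveform-A",
    "hard-fast": "waveform-B",
}

def provide_vibration_info(diagnosis_result_type: str) -> str:
    """Look up the vibration information matching the diagnosis result."""
    tactile_type = RESULT_TYPE_TO_TACTILE[diagnosis_result_type]
    return VIBRATION_INFO[tactile_type]

print(provide_vibration_info("tension-high"))  # waveform-A
```

The embodiments below fill in what the sketch leaves abstract: how diagnostic information is obtained (face image analysis, interviews, medical examinations) and how the type-to-vibration associations are defined.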
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a diagram showing an overall configuration example of the vibration information providing system according to the first embodiment.
FIG. 2 is a block diagram showing a functional configuration example of the user terminal according to the first embodiment.
FIG. 3 is a block diagram showing a functional configuration example of the server device according to the first embodiment.
FIG. 4 is a diagram showing an example of a screen displayed on the display by the face image display unit.
FIG. 5 is a diagram showing an example of classification of diagnostic information and vibration information.
FIG. 6 is a diagram for explaining two tactile-quality parameters.
FIG. 7 is a diagram for explaining another example of section division.
FIG. 8 is a diagram showing an example of classification of diagnostic information and vibration information.
FIG. 9 is a diagram showing an example of classification of diagnostic information and vibration information.
FIG. 10 is a block diagram showing a functional configuration example of the user terminal according to the second embodiment.
FIG. 11 is a block diagram showing a functional configuration example of the server device according to the second embodiment.
FIG. 12 is a diagram showing an example of a screen displayed on the display by the answer information display unit.
FIG. 13 is a block diagram showing a functional configuration example of the user terminal according to the third embodiment.
FIG. 14 is a block diagram showing a functional configuration example of the medical institution terminal according to the third embodiment.
FIG. 15 is a block diagram showing a functional configuration example of the server device according to the third embodiment.
(First Embodiment)

A first embodiment of the present invention will be described below with reference to the drawings. FIG. 1 is a diagram showing an overall configuration example of the vibration information providing system according to the first embodiment. As shown in FIG. 1, the vibration information providing system according to the first embodiment includes a user terminal 100 used by a user, a medical institution terminal 200 used at a medical institution, and a server device 300 that receives access from the user terminal 100 and the medical institution terminal 200 and provides information. The user terminal 100 and the server device 300, and the medical institution terminal 200 and the server device 300, are connected via a communication network 500 such as the Internet or a mobile phone network.
The user terminal 100 is, for example, a personal computer, tablet, or smartphone. The medical institution terminal 200 is likewise, for example, a personal computer, tablet, or smartphone. The user terminal 100 has a camera for capturing images (the user terminal 100 itself may be equipped with a camera, or it may be configured so that a camera can be connected to it). The user terminal 100 and the medical institution terminal 200 each have a display for displaying images and the like.
In the first embodiment, the user terminal 100 functions as a vibration information acquiring device, and the server device 300 functions as a vibration information providing server device. That is, in the first embodiment, the user terminal 100 captures an image of the user's face with its camera and transmits the captured image to the server device 300. The server device 300 analyzes the captured face image transmitted from the user terminal 100, selects vibration information according to the diagnosis result based on the face image, and provides it to the user terminal 100.
The vibration information provided here is selected according to the result of a diagnosis, analyzed from the captured face image, concerning the user's physical or mental state, and concerns a vibration having a tactile quality that causes a specific tactile sensation. The vibration information is waveform information (either analog or digital) that serves as the vibration source for generating vibration with a vibrator or the like. By applying vibration generated by a vibrator from this vibration information to his or her own face, the user can seek to alleviate or improve disorders and symptoms related to his or her physical or mental state.
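As a hypothetical illustration of vibration information as digital waveform data (not a waveform disclosed in this application), the following sketch generates a sample sequence that a vibrator driver could play back; the frequency, duration, and envelope are example parameters through which a tactile quality could be encoded.

```python
# Hypothetical sketch of vibration information as digital waveform data:
# a sine burst with a linear fade-out, as samples a vibrator driver could play.
import math

def make_vibration_waveform(freq_hz=120.0, duration_s=0.5, sample_rate=8000):
    """Return a list of samples in [-1, 1] encoding one vibration burst."""
    n = int(duration_s * sample_rate)
    samples = []
    for i in range(n):
        t = i / sample_rate
        envelope = 1.0 - (i / n)  # linear fade-out over the burst
        samples.append(envelope * math.sin(2.0 * math.pi * freq_hz * t))
    return samples

wave = make_vibration_waveform()
print(len(wave))  # 4000 samples at 8 kHz for 0.5 s
```

In the system described here, such waveform data would be stored server-side per tactile type and delivered to the terminal, where the vibration imparting unit 18 drives the vibrator with it.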
In association with the captured face image transmitted from the user terminal 100, the server device 300 also provides the user terminal 100 with diagnostic information indicating the result of the diagnosis the server device 300 performed using that image. The server device 300 further provides the medical institution terminal 200 with the captured face image transmitted from the user terminal 100 and the diagnostic information the server device 300 generated using it. The captured face image and the diagnostic information are thereby shared between the user and the medical institution. When using the vibration information to improve a disorder or symptom, the user can use the shared information to engage in dialogue with the medical institution and can also receive appropriate advice from it.
As described above, the user terminal 100 has installed on it an application (hereinafter, the vibration-related application) with functions such as transmitting the captured face image to the server device 300, receiving vibration information and diagnostic information from the server device 300, and displaying the captured face image and the diagnostic information on the display.
FIG. 2 is a block diagram showing a functional configuration example of the user terminal 100 according to the first embodiment. As shown in FIG. 2, the user terminal 100 includes, as functional components, a face image photographing unit 11, a face image recording unit 12, a face image transmission unit 13, a vibration information receiving unit 14, a vibration information recording unit 15, a diagnostic information receiving unit 16, a diagnostic information recording unit 17, a vibration imparting unit 18, and a face image display unit 19. The user terminal 100 also includes, as storage media, a face image storage unit 10A and a vibration information storage unit 10B.
Each of the functional blocks 11 to 19 can be implemented in hardware, as a DSP (Digital Signal Processor), or in software. For example, when implemented in software, each of the functional blocks 11 to 19 is actually configured with a computer's CPU, RAM, ROM, and so on, and is realized by running the program of the vibration-related application stored in a storage medium such as RAM, ROM, a hard disk, or semiconductor memory. In particular, the functions of the face image photographing unit 11, the face image transmission unit 13, the vibration information receiving unit 14, and the diagnostic information receiving unit 16 are realized by the vibration information acquisition program.
The face image photographing unit 11 photographs the user's face using the camera of the user terminal 100. In practice, the face image photographing unit 11 captures an image that includes the user's face and the background around it. The face image recording unit 12 stores the captured face image obtained by the face image photographing unit 11 in the face image storage unit 10A together with information indicating the date and time of capture. Storing the captured image together with the capture date/time information includes storing it in a form in which the capture date/time information is included as one of the attributes attached to the image data (image file).
The face image transmission unit 13 transmits the captured face image obtained by the face image photographing unit 11 to the server device 300. Here, the face image transmission unit 13 extracts only the user's face from the captured image including the background and transmits the image of the extracted face portion to the server device 300. Alternatively, the captured face image including the background may be transmitted to the server device 300 as-is, and the server device 300 may extract only the user's face from it. Or the server device 300 may recognize the face portion in the captured image including the background and perform the diagnostic analysis on the recognized face portion. In the following description, a captured image of a face, with or without background, may simply be called a face image.
 For example, when a face image is captured by the face image capturing unit 11, the face image transmitting unit 13 transmits the captured image to the server device 300. Alternatively, after the face image obtained by the face image capturing unit 11 has been stored in the face image storage unit 10A by the face image recording unit 12, the face image transmitting unit 13 may read the captured image from the face image storage unit 10A and transmit it to the server device 300 at an arbitrary timing designated by the user.
 Here, the face image transmitting unit 13 transmits the captured image of the user's face to the server device 300 together with the capture date/time information and a user ID. The user ID is, for example, identification information set for the vibration-related application when the vibration-related application is installed on the user terminal 100. The capture date/time information is used as information for identifying the user's face image, and the user ID is used as information for identifying the user.
 The vibration information receiving unit 14 receives vibration information provided from the server device 300. Details of the function by which the server device 300 provides vibration information will be described later. The vibration information recording unit 15 stores the vibration information received by the vibration information receiving unit 14 in the vibration information storage unit 10B.
 The diagnostic information receiving unit 16 receives diagnostic information provided from the server device 300 in association with the captured face image. As will be described later, the association between the diagnostic information and the captured face image is made on the basis of the user ID and the capture date/time information. Details of the function by which the server device 300 provides diagnostic information will be described later.
 The diagnostic information recording unit 17 stores the diagnostic information received by the diagnostic information receiving unit 16 in the face image storage unit 10A in association with the captured image of the face that was the subject of the diagnosis. The captured face image that was the subject of the diagnosis is identified by the capture date/time information.
 The vibration applying unit 18 performs processing for applying vibration to the user's face by supplying the vibration information stored in the vibration information storage unit 10B to a vibrator that generates the vibration. The vibrator may be one provided in the user terminal 100, or one provided in a vibration supplier connected to the user terminal 100 by wire or wirelessly. The vibration supplier may be, for example, a sheet or mask placed over the user's face, or a pad affixed to the user's face. The vibration supplier may also be a speaker unit that reproduces low frequencies, such as a woofer; instead of applying vibration directly to the user, low-frequency vibration may be emitted into the space where the user is present.
 When the user terminal 100 is equipped with a vibrator, the user operates, for example, a user interface for instructing vibration generation provided by the vibration-related application, so that the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator to generate vibration, and then brings the user terminal 100 into contact with the user's face. When a vibration supplier is connected to the user terminal 100, on the other hand, the user brings the vibration supplier into contact with the user's face and operates the user interface for instructing vibration generation, whereby the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator to vibrate the vibration supplier. When the vibration supplier is a speaker unit, the speaker unit is installed at an arbitrary location in the space, and the vibration information stored in the vibration information storage unit 10B is supplied to the vibrator, thereby emitting vibration into the space.
 The face image display unit 19 causes the display to show the user's face image stored in the face image storage unit 10A and the diagnostic information associated with that face image. FIG. 4 is a diagram showing an example of a screen displayed on the display by the face image display unit 19. As shown in FIG. 4, the face image display unit 19 displays a list of one or more sets of information on the screen, each set consisting of a captured face image, information indicating its capture date and time, and the diagnostic information acquired from the server device 300.
 The user captures a face image at a certain date and time and transmits the captured image to the server device 300. As a result, the user terminal 100 acquires from the server device 300 the vibration information and the diagnostic information obtained by analyzing the captured image. The captured face image, capture date/time information, and diagnostic information obtained at this time constitute one set of information and are stored in the face image storage unit 10A in association with one another. Using the vibration information acquired at this time, the user causes the vibrator to generate vibration and applies the vibration to the face.
 Thereafter, the user captures another face image at a different date and time and transmits the captured image to the server device 300. As a result, the user terminal 100 again acquires from the server device 300 the vibration information and the diagnostic information obtained by analyzing that captured image. The captured face image, capture date/time information, and diagnostic information obtained at this time constitute another set of information and are stored in the face image storage unit 10A in association with one another. The user applies vibration to the face using the vibration information acquired at this time.
 The one or more sets of information thus accumulated in sequence in the face image storage unit 10A are displayed as a list by the face image display unit 19, as shown in FIG. 4. This list display allows the user to confirm the changes in the face image and in the diagnostic information that result from applying the vibration.
 FIG. 3 is a block diagram showing an example of the functional configuration of the server device 300 according to the first embodiment. As shown in FIG. 3, the server device 300 of this embodiment includes, as its functional configuration, a diagnostic information acquiring unit 31, a vibration information acquiring unit 32, a vibration information providing unit 33, a diagnostic information providing unit 34, and a medical examination information providing unit 35. The diagnostic information acquiring unit 31 includes, as a more specific functional configuration, a face image acquiring unit 31a and an image analysis unit 31b. The server device 300 also includes, as storage media, a face image storage unit 30A and a vibration information storage unit 30B.
 Each of the functional blocks 31 to 35 can be implemented in hardware, in a DSP, or in software. When implemented in software, for example, each of the functional blocks 31 to 35 is in practice configured with a computer's CPU, RAM, ROM, and the like, and is realized by running a vibration information providing program stored in a storage medium such as the RAM, the ROM, a hard disk, or a semiconductor memory.
 The diagnostic information acquiring unit 31 acquires diagnostic information relating to the user's physical or mental state. Specifically, the diagnostic information acquiring unit 31 acquires the diagnostic information by means of the face image acquiring unit 31a and the image analysis unit 31b. The face image acquiring unit 31a acquires the captured image of the user's face transmitted by the face image transmitting unit 13 of the user terminal 100 and stores it in the face image storage unit 30A together with the capture date/time information and the user ID.
 The image analysis unit 31b acquires diagnostic information by analyzing the user's physical or mental state on the basis of the captured face image acquired by the face image acquiring unit 31a, and stores it in the face image storage unit 30A in association with the captured face image (capture date/time information) and the user ID. In the first embodiment, as an example, the image analysis unit 31b analyzes the captured face image acquired by the face image acquiring unit 31a to acquire, as the diagnostic information, hardness information indicating the degree of hardness of the facial expression and distortion information indicating the degree of distortion of the face.
 Here, the hardness information indicating the degree of hardness of the facial expression is information that classifies the degree of hardness of the facial expression into a plurality of levels. In this embodiment, as the simplest example, the degree of hardness of the facial expression is classified into two levels. In this case, the expression hardness information indicates whether the facial expression is "hard" or "soft". The number of classification levels is not limited to two.
 Similarly, the distortion information indicating the degree of distortion of the face is information that classifies the degree of facial distortion into a plurality of levels. In this embodiment, as the simplest example, the degree of facial distortion is classified into two levels. In this case, the facial distortion information indicates whether the distortion is "large" or "small". The number of classification levels is not limited to two.
 In general, when the facial muscles that act when a person moves the skin, eyebrows, eyes, lips, cheeks, and so on to create a facial expression become tense and stiff, the facial expression becomes hard. It is known that the facial muscles stiffen when a person is tense, tired, uncomfortable, worried, or stressed. The hardness of the facial expression can therefore be used as information suggesting the degree of a person's tension, fatigue, discomfort, worry, or stress.
 As one example, the analysis of expression hardness based on the captured face image can be performed using a classification model generated by machine learning. For example, by performing machine learning using a plurality of face images captured of a plurality of people as training data, it is possible to generate a classification model in which the parameters of a neural network are adjusted so that expression hardness information is output when a face image is input. The training data used here consist of captured face images of a person when the person feels tense, tired, uncomfortable, worried, or stressed, and captured face images of the same person when the person feels none of these, each labeled with information indicating whether the expression is "hard" or "soft". The form of the generated classification model is not limited to a neural network model; for example, it may be a regression model, a tree model, a Bayesian model, a clustering model, or the like.
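 As an informal sketch of such a two-class "hard"/"soft" classifier (not part of the embodiment), the following uses a nearest-centroid model as a simple stand-in for the neural network described above; the feature vectors, which might represent per-region muscle-tension scores, and all numeric values are hypothetical.

```python
# Minimal "hard"/"soft" expression classifier sketch. A nearest-centroid
# model stands in for the neural network in the text; feature vectors and
# labels are hypothetical training data.

def train_centroids(samples):
    """samples: list of (feature_vector, label) pairs -> {label: centroid}."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is nearest (squared Euclidean distance)."""
    def sqdist(c):
        return sum((a - b) ** 2 for a, b in zip(c, features))
    return min(centroids, key=lambda label: sqdist(centroids[label]))

# Hypothetical training data: features when the subject reports stress
# ("hard" expression) and when relaxed ("soft" expression).
training = [
    ([0.9, 0.8], "hard"), ([0.8, 0.9], "hard"),
    ([0.2, 0.1], "soft"), ([0.1, 0.2], "soft"),
]
model = train_centroids(training)
print(classify(model, [0.85, 0.8]))  # prints "hard"
```

In the embodiment the input would be image-derived features and the model a trained neural network; the lookup structure of the result (one of two labels) is the same.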
 In addition, it is generally known that the face becomes distorted in a left-right asymmetric way when the muscles of the face and head weaken with age, or when the facial muscles stiffen because of tension, fatigue, discomfort, worry, or stress. The image analysis unit 31b therefore acquires the distortion information indicating the degree of facial distortion by analyzing the left-right symmetry of the face.
 For example, the image analysis unit 31b sets a vertical virtual line passing through the center point of the nose and analyzes the magnitude of deviation between each facial part on the left side of the virtual line and the corresponding part on the right side. The distortion can then be determined to be "large" when the magnitude of the deviation is greater than or equal to a threshold, and "small" when it is less than the threshold. The magnitude of deviation of each part can be detected, for example, by generating a line-symmetric mirror image from the face image on the left side of the virtual line, superimposing this mirror image on the face image on the right side of the virtual line, and measuring, for each part, how far apart the pixel positions corresponding to its contour are.
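 The mirror-image comparison above can be sketched on a toy binary image (1 = contour pixel); here the virtual line is simply the vertical center of the image, the deviation is measured as the fraction of differing pixels rather than per-part contour distances, and the threshold value is a hypothetical choice.

```python
# Sketch of the left-right symmetry check: mirror the left half about the
# virtual center line, compare it with the right half, and threshold the
# fraction of differing pixels. Images and threshold are hypothetical.

def asymmetry(image):
    """Fraction of pixels where the mirrored left half differs from the right half."""
    width = len(image[0])
    half = width // 2
    diffs = total = 0
    for row in image:
        mirrored = row[:half][::-1]       # left half, mirrored about the line
        right = row[width - half:]        # right half (center column skipped if odd)
        for a, b in zip(mirrored, right):
            total += 1
            if a != b:
                diffs += 1
    return diffs / total

def distortion_label(image, threshold=0.2):
    return "large" if asymmetry(image) >= threshold else "small"

symmetric = [
    [0, 1, 1, 0],
    [1, 0, 0, 1],
]
skewed = [
    [0, 1, 0, 0],
    [1, 0, 1, 1],
]
print(distortion_label(symmetric), distortion_label(skewed))  # prints "small large"
```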
 Alternatively, the magnitude of deviation may be detected and scored for each part, such as the eyebrows, eyes, lips, and cheeks, a total score may be calculated by weighting and adding the scores obtained for the respective parts, and the facial distortion may be classified as either "large" or "small" depending on whether this total score is greater than or equal to a threshold.
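 The weighted scoring just described amounts to a weighted sum followed by a threshold test; in the following sketch the part names, weights, per-part scores, and threshold are all hypothetical values.

```python
# Sketch of the per-part weighted distortion scoring described above.

def total_distortion_score(scores, weights):
    """Weighted sum of per-part deviation scores."""
    return sum(weights[part] * score for part, score in scores.items())

# Hypothetical weights and per-part deviation scores.
weights = {"eyebrows": 0.2, "eyes": 0.3, "lips": 0.3, "cheeks": 0.2}
scores = {"eyebrows": 0.5, "eyes": 0.8, "lips": 0.4, "cheeks": 0.1}

total = total_distortion_score(scores, weights)
label = "large" if total >= 0.5 else "small"   # hypothetical threshold 0.5
print(round(total, 2), label)
```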
 By combining the expression hardness information and the facial distortion information acquired as described above, the user's face image can be classified into one of four types. That is, as shown in FIG. 5(a), the face image can be classified into a first type F1 in which the expression is "hard" and the distortion is "large", a second type F2 in which the expression is "soft" and the distortion is "large", a third type F3 in which the expression is "soft" and the distortion is "small", or a fourth type F4 in which the expression is "hard" and the distortion is "small". These types F1 to F4 are hereinafter referred to as diagnostic result types.
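 The four-way classification of FIG. 5(a) reduces to a lookup on the (hardness, distortion) pair, which can be sketched as:

```python
# The diagnostic result types F1-F4 of FIG. 5(a) as a lookup table
# keyed by (expression hardness, facial distortion).

DIAGNOSTIC_TYPES = {
    ("hard", "large"): "F1",
    ("soft", "large"): "F2",
    ("soft", "small"): "F3",
    ("hard", "small"): "F4",
}

def diagnostic_result_type(hardness, distortion):
    return DIAGNOSTIC_TYPES[(hardness, distortion)]

print(diagnostic_result_type("hard", "small"))  # prints "F4"
```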
 The vibration information acquiring unit 32 acquires, from among a plurality of pieces of vibration information relating to vibrations of different tactile qualities (properties of vibration that produce a specific tactile sensation), the vibration information relating to a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquiring unit 31 (the diagnostic information stored in the face image storage unit 30A in association with the captured face image). That is, the vibration information acquiring unit 32 acquires vibration information relating to a vibration having a tactile quality corresponding to the hardness information and distortion information of the user's face analyzed by the image analysis unit 31b.
 Here, the plurality of pieces of vibration information relating to vibrations of different tactile qualities are stored in the vibration information storage unit 30B in advance in a form classified in association with the hardness (softness) and roughness (smoothness) of the tactile sensation the vibration produces. That is, as shown in FIG. 5(b), the vibration information is classified, by the combination of the hardness and roughness of the tactile sensation, into a first type V1 whose tactile sensation is "hard" and "rough", a second type V2 whose tactile sensation is "soft" and "rough", a third type V3 whose tactile sensation is "soft" and "smooth", and a fourth type V4 whose tactile sensation is "hard" and "smooth". These types V1 to V4 are hereinafter referred to as vibration tactile sensation types.
 For example, one piece of vibration information per type is stored in advance in the vibration information storage unit 30B, classified into the four vibration tactile sensation types V1 to V4 in association with the hardness and roughness of the tactile sensation. The vibration information is preferably low-frequency vibration information of 40 Hz or less (preferably 20 Hz or less, more preferably 10 Hz or less, still more preferably 5 Hz or less). Vibration information obtained by mixing low-frequency vibration information with an audible sound such as music may also be used.
 The vibration information acquiring unit 32 acquires, from the vibration information storage unit 30B, the vibration information of the tactile quality corresponding to the diagnostic information, on the basis of the correspondence between the diagnostic result types F1 to F4, which are specified by the combination of the user's expression hardness information and facial distortion information, and the vibration tactile sensation types V1 to V4, which are specified by the combination of the hardness (softness) and roughness (smoothness) of the tactile sensation the vibration produces. Details of the vibration information stored in the vibration information storage unit 30B and of the operation of the vibration information acquiring unit 32 will be described later.
 The vibration information providing unit 33 provides the user terminal 100 with the vibration information acquired by the vibration information acquiring unit 32. The vibration information provided by the vibration information providing unit 33 is received by the vibration information receiving unit 14 of the user terminal 100 and stored in the vibration information storage unit 10B by the vibration information recording unit 15.
 The diagnostic information providing unit 34 provides the user terminal 100 with the diagnostic information acquired by the image analysis unit 31b, in association with the captured face image acquired by the face image acquiring unit 31a. That is, the diagnostic information providing unit 34 provides the user terminal 100 with the diagnostic information stored in the face image storage unit 30A in association with the captured face image, together with the capture date/time information that makes it possible to identify that captured face image. The diagnostic information thus provided together with the capture date/time information is received by the diagnostic information receiving unit 16 of the user terminal 100 and stored in the face image storage unit 10A by the diagnostic information recording unit 17 in association with the face image corresponding to the capture date/time information.
 The medical examination information providing unit 35 provides the medical institution terminal 200, for example in response to a request from the medical institution terminal 200, with the captured images of the user's face acquired by the face image acquiring unit 31a and the diagnostic information acquired by the image analysis unit 31b (the captured face images stored in the face image storage unit 30A and the diagnostic information associated with them) as medical examination information.
 For example, the medical institution terminal 200 accesses the server device 300, designates a user ID provided by the user, and requests the acquisition of medical examination information. In response to this request, the medical examination information providing unit 35 generates a screen containing a list of the captured images of the user's face and the diagnostic information stored in the face image storage unit 30A in association with that user ID, and provides it to the medical institution terminal 200. As a result, a screen containing the same kind of information as the screen example shown in FIG. 4 is also displayed on the display of the medical institution terminal 200.
 The details of the vibration information stored in the vibration information storage unit 30B will now be described. As described above, the plurality of pieces of vibration information of different tactile qualities are classified in association with the hardness (softness) and roughness (smoothness) of the tactile sensation they produce. That is, the vibration information stored in the vibration information storage unit 30B has an inherent tactile effect with respect to the hardness and roughness of the tactile sensation. The tactile effect of a piece of vibration information is specified by tactile quality parameters inherent to that vibration information.
 A tactile quality parameter is a parameter representing the degree of a pair of opposing tactile qualities (hereinafter referred to as a tactile quality pair), such as <hard-soft> or <rough-smooth>, and constitutes one element of the tactile sensation. For example, the intensity of the vibration waveform (a first tactile quality parameter h) can be used as the tactile quality parameter for the <hard-soft> pair: the larger the intensity value, the harder the sensation, and the smaller the value, the softer. Likewise, the length of the divided sections of the vibration waveform (a second tactile quality parameter t) can be used as the tactile quality parameter for the <rough-smooth> pair: the larger the value representing the section length, the smoother the sensation, and the smaller the value, the rougher.
 FIG. 6 is a diagram for explaining the two tactile quality parameters h and t, and schematically shows the envelope waveform of the vibration information (hereinafter referred to as the vibration waveform). As shown in FIG. 6, the vibration waveform is divided into a plurality of sections along the time axis. In FIG. 6, as an example, the waveform is divided at each time at which its amplitude reaches a local minimum: the interval from the start of the vibration waveform to the first local minimum is a first divided section T1, the interval from the first local minimum to the second local minimum is a second divided section T2, the interval from the second local minimum to the third local minimum is a third divided section T3, and so on.
 The way in which the vibration waveform is divided is not limited to the example shown in FIG. 6. For example, the vibration waveform may be divided into a plurality of sections at each time at which the amplitude reaches a local maximum. Alternatively, for vibration information having both positive and negative amplitudes, the vibration waveform may be divided into a plurality of sections at each time at which the amplitude value becomes zero.
 Alternatively, as shown in FIG. 7(a), a plurality of characteristic points distinguishable from other points in the vibration waveform may be extracted, and the vibration waveform may be divided into a plurality of sections at each such characteristic point. As one example, points at which the amplitude value increases by a predetermined amount or more within a predetermined time (for example, 0.1 seconds), that is, points at which the amplitude of the vibration waveform increases sharply, may be extracted as characteristic points F1, F2, F3, ..., and a plurality of divided sections T1, T2, T3, ... may be set at each of the characteristic points F1, F2, F3, ....
 As another example, as shown in FIG. 7(b), the vibration information may be composed of a pattern in which the same vibration waveform is repeated a plurality of times, and a plurality of divided sections T1, T2, T3, ... may be set in units of the repeated waveform.
 Then, from each of the divided sections T1, T2, T3, ..., representative amplitudes h1, h2, h3, ... are specified as the first tactile quality parameter, and the time lengths t1, t2, t3, ... of the divided sections are specified as the second tactile quality parameter. Here, each representative amplitude h1, h2, h3, ... is the difference between the larger of the local minimum at the start point and the local minimum at the end point of the corresponding divided section T1, T2, T3, ... and the local maximum within that section.
 That is, for the divided section T1 there is only one local minimum, so the difference between this local minimum and the local maximum is the representative amplitude h1. For the divided section T2, the local minimum at the start point of the section is larger than the local minimum at the end point, so the difference between the local minimum at the start point and the local maximum is the representative amplitude h2. For the divided section T3, the local minimum at the end point is larger than the local minimum at the start point, so the difference between the local minimum at the end point and the local maximum is the representative amplitude h3.
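 The segmentation of FIG. 6 and the representative-amplitude rule above can be sketched on a sampled envelope: split at interior local minima, then take for each section the difference between its peak and the larger of its two boundary values, along with the section length. The envelope samples and sampling interval are hypothetical, and the waveform start is treated as a section boundary in the same way as a local minimum.

```python
# Sketch: split an envelope waveform at its local minima and compute,
# per section, h_i = (local maximum) - (larger boundary minimum) and
# the section length t_i. Sample values are hypothetical.

def split_at_minima(env):
    """Indices of interior local minima of the envelope."""
    return [i for i in range(1, len(env) - 1)
            if env[i - 1] > env[i] < env[i + 1]]

def section_parameters(env, dt=1.0):
    """List of (h_i, t_i) per divided section; dt is the sampling interval."""
    bounds = [0] + split_at_minima(env) + [len(env) - 1]
    params = []
    for start, end in zip(bounds, bounds[1:]):
        section = env[start:end + 1]
        peak = max(section)
        boundary = max(section[0], section[-1])  # larger of the boundary minima
        params.append((peak - boundary, (end - start) * dt))
    return params

# Hypothetical envelope with local minima at indices 3 and 6.
env = [0.0, 2.0, 5.0, 1.0, 4.0, 6.0, 2.0, 3.0, 0.5]
for h_i, t_i in section_parameters(env):
    print(h_i, t_i)
```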
 The method of specifying the representative amplitude shown here is only an example, and the invention is not limited to it. For example, the representative amplitude may instead be specified as the difference between the smaller of the local minimum at the start point and the local minimum at the end point of each divided section T1, T2, T3, ... and the local maximum within that section.
 Further, when vibration information having both positive and negative amplitude values is divided at each point in time where the amplitude value becomes zero, the positive local maximum value or the negative local minimum value in each divided section may be specified as the representative amplitude for the first tactile parameter h. For a negative local minimum value, its absolute value may be specified as the representative amplitude for the first tactile parameter h.
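For illustration only, the representative-amplitude computation described above can be sketched as follows (hypothetical Python code; the function name and argument names are assumptions for illustration and not part of the embodiment):

```python
# Hypothetical sketch: for each divided section, the representative
# amplitude is the difference between the local maximum value and the
# larger of the local minima at the section's start and end points.

def representative_amplitude(start_min, local_max, end_min=None):
    """end_min is None when the section has only one local minimum,
    as in the divided section T1 described in the text."""
    if end_min is None:
        return local_max - start_min
    return local_max - max(start_min, end_min)
```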
 The first tactile parameter h is, for example, the average, maximum, minimum or median of the representative amplitudes h1, h2, h3, ... specified for the plurality of divided sections as described above. Similarly, the second tactile parameter t is the average, maximum, minimum or median of the time lengths t1, t2, t3, ... specified for the plurality of divided sections.
 As another example, the number of representative amplitudes h1, h2, h3, ... that exceed a threshold, or their ratio to the total number, may be used as the first tactile parameter h, and the number of section time lengths t1, t2, t3, ... that exceed a threshold, or their ratio to the total number, may be used as the second tactile parameter t.
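As one illustrative sketch of the computation described above (hypothetical Python code; the choice of the average as the summary statistic is one of the options named in the text, and the function name is an assumption):

```python
# Hypothetical sketch: compute the tactile parameters h and t from the
# per-section representative amplitudes and section time lengths,
# using the average (one of: average, maximum, minimum or median).

def tactile_parameters(amplitudes, durations):
    """amplitudes: representative amplitudes h1, h2, h3, ...
    durations:  section time lengths t1, t2, t3, ..."""
    h = sum(amplitudes) / len(amplitudes)  # first tactile parameter h
    t = sum(durations) / len(durations)    # second tactile parameter t
    return h, t
```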
 In this way, the vibration information can be characterized by the first tactile parameter h (intensity of the vibration waveform), which represents the hardness of the tactile sensation of the vibration, and the second tactile parameter t (length of the divided sections), which represents the roughness of the tactile sensation, and the combination of the first tactile parameter h and the second tactile parameter t makes it possible to classify the vibration information into one of four vibrotactile sensation types V1 to V4.
 For example, the vibration information is classified into four types: a first type V1 in which the first tactile parameter h is equal to or greater than a first threshold Th1 and the reciprocal (1/t) of the second tactile parameter t is equal to or greater than a second threshold Th2; a second type V2 in which h is less than Th1 and 1/t is equal to or greater than Th2; a third type V3 in which h is less than Th1 and 1/t is less than Th2; and a fourth type V4 in which h is equal to or greater than Th1 and 1/t is less than Th2.
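The four-way classification just described can be sketched as follows (hypothetical Python code; the threshold values Th1 and Th2 shown here are illustrative assumptions):

```python
# Hypothetical sketch of the four-way classification: hardness is
# judged by h >= Th1, and the roughness axis by 1/t >= Th2.

def classify_vibration(h, t, th1=0.5, th2=10.0):
    """Return the vibrotactile sensation type V1-V4 for the tactile
    parameters h (amplitude) and t (divided-section length)."""
    hard = h >= th1          # first tactile parameter vs. Th1
    fine = (1.0 / t) >= th2  # reciprocal of second parameter vs. Th2
    if hard and fine:
        return "V1"
    if not hard and fine:
        return "V2"
    if not hard and not fine:
        return "V3"
    return "V4"              # hard and not fine
```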
 As described above, the vibration information acquisition unit 32 acquires, from the vibration information storage unit 30B, vibration information having a tactile quality corresponding to the diagnostic information on the user's face, based on the correspondence between the diagnosis result types F1 to F4, which are specified by the combination of the hardness information on the user's facial expression and the facial distortion information, and the vibrotactile sensation types V1 to V4, which are specified by the combination of the hardness and roughness of the tactile sensation of the vibration information.
 For example, the vibration information acquisition unit 32 acquires from the vibration information storage unit 30B vibration information belonging to the vibrotactile sensation types V1 to V4 associated with the diagnosis result types F1 to F4, so that both the hardness of the user's facial expression and the facial distortion approach a moderate state (the state at the position indicated by Fc in FIG. 5(a)). Here, the diagnosis result types F1 to F4 and the vibrotactile sensation types V1 to V4 are associated as <F1-V3>, <F2-V4>, <F3-V1> and <F4-V2>.
 That is, when the diagnostic information is the first type F1, the vibration information of the third type V3 on the opposite pole is provided; when the diagnostic information is the second type F2, the vibration information of the fourth type V4 on the opposite pole is provided; when the diagnostic information is the third type F3, the vibration information of the first type V1 on the opposite pole is provided; and when the diagnostic information is the fourth type F4, the vibration information of the second type V2 on the opposite pole is provided.
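This opposite-pole correspondence can be sketched as a simple lookup (hypothetical Python code; the table directly encodes the associations <F1-V3>, <F2-V4>, <F3-V1> and <F4-V2> stated in the text, while the names are assumptions):

```python
# Hypothetical sketch: each diagnosis result type is mapped to the
# vibrotactile sensation type on the opposite pole.

OPPOSITE_TYPE = {"F1": "V3", "F2": "V4", "F3": "V1", "F4": "V2"}

def vibration_type_for(diagnosis_type):
    """Return the vibrotactile sensation type to provide for the
    given diagnosis result type."""
    return OPPOSITE_TYPE[diagnosis_type]
```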
 As a result, when the diagnostic information is of the first type F1, that is, when the image analysis unit 31b has diagnosed that the user's facial expression is "hard" and the facial distortion is "large", vibration information of the third type V3, that is, vibration information having a "soft" and "smooth" tactile sensation, is provided to the user. This vibration information acts to soften the hard expression of the user's face and to smooth out and reduce the distortion of the user's face. When the user actually applies this vibration to the face, the state of the face belonging to the first type F1 is expected to approach the moderate state Fc.
 As described above in detail, in the first embodiment, hardness information indicating the degree of hardness of the user's facial expression and distortion information indicating the degree of facial distortion are acquired as diagnostic information on the user's physical or mental state, and vibration information having a tactile quality corresponding to the diagnostic information is acquired from among a plurality of pieces of vibration information having different tactile qualities and provided to the user.
 According to the first embodiment configured in this way, it is possible to provide the user with vibration information concerning a vibration that produces an appropriate tactile sensation according to the user's physical or mental state. For example, for a user having a certain symptom, it is possible to provide vibration information that produces a tactile sensation effective from the viewpoint of alleviating or improving that symptom, according to the diagnostic information on the physical or mental state that appears on the face due to the symptom.
 In addition, the first embodiment provides a mechanism for sharing the photographed image of the user's face and the diagnostic information between the user and a medical institution. As a result, when using the vibration information to improve a disorder or symptom, the user can use the shared information to have a dialogue with the medical institution, and can also receive appropriate advice from the medical institution.
 In the first embodiment described above, an example has been described in which the diagnostic information is classified into four diagnosis result types F1 to F4 from the two viewpoints of the hardness of the user's facial expression and the degree of facial distortion, and the vibration information is classified into four vibrotactile sensation types V1 to V4 from the two viewpoints of the hardness and roughness of the tactile sensation; however, the present invention is not limited to this. For example, as shown in FIG. 8, the diagnostic information may be classified into two diagnosis result types F1' and F2' from the viewpoint of the hardness of the user's facial expression, and the vibration information may be classified into two vibrotactile sensation types V1' and V2' from the viewpoint of the hardness of the tactile sensation. Also, as shown in FIG. 9, the diagnostic information may be classified into two diagnosis result types F1'' and F2'' from the viewpoint of the degree of facial distortion, and the vibration information may be classified into two vibrotactile sensation types V1'' and V2'' from the viewpoint of the roughness of the tactile sensation.
 Also, in the first embodiment, the face image photographed by the face image photographing unit 11 may be a moving image. In the case of a moving image, for example, the image analysis described in the above embodiment may be executed for each frame, and an index value relating to at least one of the change in the degree of hardness of the facial expression and the change in the degree of facial distortion may be acquired as the diagnostic information on the user's physical or mental state.
(Second Embodiment)
 Next, a second embodiment of the present invention will be described with reference to the drawings. The overall configuration of the vibration information providing system according to the second embodiment is the same as in FIG. 1. FIG. 10 is a block diagram showing a functional configuration example of the user terminal 100 according to the second embodiment. In FIG. 10, components denoted by the same reference numerals as those shown in FIG. 2 have the same functions, and redundant description is omitted here.
 As shown in FIG. 10, the user terminal 100 according to the second embodiment includes, in place of the face image photographing unit 11, face image recording unit 12, face image transmitting unit 13, diagnostic information receiving unit 16, diagnostic information recording unit 17, face image display unit 19 and face image storage unit 10A shown in FIG. 2, an inquiry execution unit 21, an answer information recording unit 22, an answer information transmitting unit 23, a diagnostic information receiving unit 26, a diagnostic information recording unit 27, an answer information display unit 29 and an answer information storage unit 20A.
 The inquiry execution unit 21 provides the user with inquiry information for performing a self-diagnosis regarding a specific symptom, and acquires answer information indicating the answers input by the user in response to the inquiry information. The specific symptom is, for example, depression, dementia or insomnia. The self-diagnosis is performed using, for example, self-rating scales for depressive symptoms such as SDS (Self-rating Depression Scale), SDC (Symbol Digit Coding), EQ5D (EuroQol 5 Dimension), QIDS (Quick Inventory of Depressive Symptomatology), BDI (Beck Depression Inventory), CES-D (Center for Epidemiologic Studies Depression Scale), DSM-IV (The Diagnostic and Statistical Manual of Mental Disorders) and EPDS (Edinburgh Postnatal Depression Scale); self-rating scales for dementia such as MMSE (Mini Mental State Examination), HDS-R (Hasegawa's Dementia Scale-Revised), ADAS-cog (Alzheimer's Disease Assessment Scale-cognitive subscale), CDR (Clinical Dementia Rating), CDT (Clock Drawing Test), COGNISTAT (Neurobehavioral Cognitive Status Examination) and the Seven Minute Screening; and self-rating scales for insomnia such as the Athens Insomnia Scale and the Epworth Sleepiness Scale (ESS).
 For example, the inquiry execution unit 21 outputs a message prompting the user to undergo the inquiry to the display at predetermined timings, in accordance with a predetermined cycle appropriate for the self-rating scale used in the inquiry (for example, an every-other-day cycle). Then, in response to an instruction operation by the user who has seen this message, the inquiry execution unit 21 presents the inquiry information to the user through a predetermined inquiry screen. The inquiry execution unit 21 inputs the user's answer information in response to the inquiry information presented through the inquiry screen.
 The answer information recording unit 22 stores the answer information obtained by the execution of the inquiry by the inquiry execution unit 21 in the answer information storage unit 20A together with information indicating the date and time the inquiry was performed. Storing the answer information together with the date and time information includes storing the answer information in a manner in which the date and time information is included as one piece of attribute information accompanying the text data (text file) of the answers. The answer information transmitting unit 23 transmits the answer information acquired by the inquiry execution unit 21 to the server device 300 together with the inquiry date and time information and the user ID.
 The diagnostic information receiving unit 26 receives diagnostic information provided by the server device 300 in association with the answer information of the inquiry. The association between the diagnostic information and the answer information is made based on the user ID and the inquiry date and time information.
 The diagnostic information recording unit 27 stores the diagnostic information received by the diagnostic information receiving unit 26 in the answer information storage unit 20A in association with the answer information of the inquiry that was the target of the diagnosis. The answer information that was the target of the diagnosis is identified based on the inquiry date and time information.
 The answer information display unit 29 causes the display to show the inquiry answer information stored in the answer information storage unit 20A and the diagnostic information (symptom degree information described later) associated with that answer information. FIG. 12 is a diagram showing an example of a screen displayed by the answer information display unit 29. As shown in FIG. 12, the answer information display unit 29 displays, as a list on the screen, one or more pieces of set information, each set consisting of the inquiry answer information, the information indicating the date and time of the inquiry, and the diagnostic information acquired from the server device 300.
 FIG. 11 is a block diagram showing a functional configuration example of the server device 300 according to the second embodiment. In FIG. 11, components denoted by the same reference numerals as those shown in FIG. 3 have the same functions, and redundant description is omitted here.
 As shown in FIG. 11, the server device 300 according to the second embodiment includes, in place of the diagnostic information acquisition unit 31, vibration information acquisition unit 32, diagnostic information providing unit 34, examination information providing unit 35, face image storage unit 30A and vibration information storage unit 30B shown in FIG. 3, a diagnostic information acquisition unit 41, a vibration information acquisition unit 42, a diagnostic information providing unit 44, an examination information providing unit 45, an answer information storage unit 40A and a vibration information storage unit 40B. The server device 300 according to the second embodiment further includes an answer information acquisition unit 46.
 The answer information acquisition unit 46 acquires the answer information transmitted by the answer information transmitting unit 23 of the user terminal 100, and stores it in the answer information storage unit 40A together with the inquiry date and time information and the user ID.
 The diagnostic information acquisition unit 41 analyzes the answer information acquired by the answer information acquisition unit 46 to acquire, as diagnostic information, symptom degree information indicating the result of the user's self-diagnosis regarding the degree of a specific symptom related to the user's physical or mental state, and stores it in the answer information storage unit 40A in association with the inquiry answer information and the user ID.
 That is, the diagnostic information acquisition unit 41 calculates a score on the self-rating scale based on the answer information, and classifies the degree of the symptom into a plurality of levels according to the score. In this embodiment, as the simplest example, the degree of the symptom is classified into two levels, "large" and "small", depending on whether the score on the self-rating scale is equal to or greater than a threshold. Information indicating one of the levels classified in this way is the symptom degree information.
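The two-level classification just described can be sketched as follows (hypothetical Python code; the threshold value shown is an illustrative assumption, since the actual value depends on the self-rating scale in use):

```python
# Hypothetical sketch: classify the self-rating scale score into the
# two symptom-degree levels "large" and "small" by a threshold.

def symptom_degree(score, threshold=40):
    """Return "large" when the score is at or above the threshold,
    and "small" otherwise."""
    return "large" if score >= threshold else "small"
```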
 The vibration information acquisition unit 42 acquires, from among a plurality of pieces of vibration information concerning vibrations having different tactile qualities, vibration information concerning a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit 41 (the diagnostic information stored in the answer information storage unit 40A in association with the inquiry answer information). That is, the vibration information acquisition unit 42 acquires vibration information concerning a vibration having a tactile quality corresponding to the symptom degree information acquired by the diagnostic information acquisition unit 41.
 Here, the plurality of pieces of vibration information concerning vibrations having different tactile qualities are stored in advance in the vibration information storage unit 40B in a manner classified in association with either the hardness (softness) or the roughness (smoothness) of the tactile sensation of the vibrations. For example, as in FIG. 8(b), the vibration information is stored in advance in the vibration information storage unit 40B classified into a first type V1' having a "hard" tactile sensation and a second type V2' having a "soft" tactile sensation. Alternatively, as in FIG. 9(b), the vibration information is stored in advance in the vibration information storage unit 40B classified into a first type V1'' having a "rough" tactile sensation and a second type V2'' having a "smooth" tactile sensation.
 When the symptom degree information acquired by the diagnostic information acquisition unit 41 indicates that the degree of the symptom is "large", the vibration information acquisition unit 42 acquires from the vibration information storage unit 40B the vibration information of the first type V1' in the case of FIG. 8(b), or of the first type V1'' in the case of FIG. 9(b), in order to give the user a relatively strong stimulus. On the other hand, when the symptom degree information indicates that the degree of the symptom is "small", the vibration information acquisition unit 42 acquires from the vibration information storage unit 40B the vibration information of the second type V2' in the case of FIG. 8(b), or of the second type V2'' in the case of FIG. 9(b), in order to give the user a relatively weak stimulus.
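For the case of FIG. 8(b), the selection just described can be sketched as follows (hypothetical Python code; the function name is an assumption, and the type labels follow the classification in the text):

```python
# Hypothetical sketch: select the vibrotactile sensation type from
# the symptom degree level, for the FIG. 8(b) classification.

def select_vibration_type(degree):
    """"large" -> first type V1' (relatively strong stimulus),
    "small" -> second type V2' (relatively weak stimulus)."""
    return "V1'" if degree == "large" else "V2'"
```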
 The diagnostic information providing unit 44 provides the diagnostic information (symptom degree information) acquired by the diagnostic information acquisition unit 41 to the user terminal 100 in association with the answer information acquired by the answer information acquisition unit 46. That is, the diagnostic information providing unit 44 provides the user terminal 100 with the diagnostic information stored in the answer information storage unit 40A in association with the inquiry answer information, together with the inquiry date and time information that makes it possible to identify that answer information. The diagnostic information thus provided together with the date and time information is received by the diagnostic information receiving unit 26 of the user terminal 100, and is stored in the answer information storage unit 20A by the diagnostic information recording unit 27 in association with the answer information corresponding to the date and time information.
 The examination information providing unit 45 provides the medical institution terminal 200 with the answer information acquired by the answer information acquisition unit 46 and the diagnostic information acquired by the diagnostic information acquisition unit 41 (the answer information stored in the answer information storage unit 40A and the symptom degree information associated with it) as examination information, for example, in response to a request from the medical institution terminal 200.
 As described above in detail, in the second embodiment, symptom degree information indicating the result of a self-diagnosis based on a rating scale corresponding to a specific symptom is acquired as diagnostic information on the user's physical or mental state, and vibration information having a tactile quality corresponding to the symptom degree information is acquired from among a plurality of pieces of vibration information having different tactile qualities and provided to the user.
 According to the second embodiment configured in this way, it is possible to provide the user with appropriate vibration information having a tactile quality corresponding to the degree of a symptom related to the user's physical or mental state (the result of the self-diagnosis). For example, for a user having a certain symptom, it is possible to provide vibration information that produces a tactile sensation effective from the viewpoint of alleviating or improving that symptom, according to the diagnostic information on the degree of the symptom.
 In addition, the second embodiment provides a mechanism for sharing the inquiry answer information and the diagnostic information between the user and a medical institution. As a result, when using the vibration information to improve a disorder or symptom, the user can use the shared information to have a dialogue with the medical institution, and can also receive appropriate advice from the medical institution.
(Third Embodiment)
 Next, a third embodiment of the present invention will be described with reference to the drawings. In the second embodiment described above, an example has been described in which symptom degree information indicating the result of the user's self-diagnosis for a specific symptom related to a physical or mental state is acquired. In contrast, in the third embodiment, symptom degree information indicating the result of a diagnosis by a doctor is acquired as the diagnostic information.
 The overall configuration of the vibration information providing system according to the third embodiment is the same as in FIG. 1. FIG. 13 is a block diagram showing a functional configuration example of the user terminal 100 according to the third embodiment. In FIG. 13, components denoted by the same reference numerals as those shown in FIG. 10 have the same functions, and redundant description is omitted here. FIG. 14 is a block diagram showing a functional configuration example of the server device 300 according to the third embodiment. In FIG. 14, components denoted by the same reference numerals as those shown in FIG. 11 have the same functions, and redundant description is omitted here. FIG. 15 is a block diagram showing a functional configuration example of the medical institution terminal 200 according to the third embodiment.
 As shown in FIG. 15, the medical institution terminal 200 according to the third embodiment includes, as its functional configuration, an examination information input unit 71, an examination information recording unit 72, an examination information transmitting unit 73, a diagnostic information receiving unit 76, a diagnostic information recording unit 77 and an examination information display unit 79. The medical institution terminal 200 according to the third embodiment also includes an examination information storage unit 70A as a storage medium.
 The examination information input unit 71 inputs examination information indicating the results of an examination performed on the user by a doctor of a medical institution regarding a specific symptom. Examinations performed by a doctor include an examination by interviewing the user, an examination performed by observing the inside or outside of the user's body using medical equipment, and an examination performed by measuring the user's biological information using measuring equipment. The examination information input unit 71 acquires the examination information input by the doctor operating an input device such as a keyboard. The examination information input unit 71 may also input biological information measured by measuring equipment as part of the examination information.
 The medical examination information recording unit 72 stores the examination information input via the medical examination information input unit 71 in the medical examination information storage unit 70A together with examination date/time information. The medical examination information transmission unit 73 transmits the examination information input via the medical examination information input unit 71 to the server device 300 together with the examination date/time information and the user ID.
 The diagnostic information receiving unit 76 receives diagnostic information (symptom degree information, described later) provided from the server device 300 in association with the examination information. The association between the diagnostic information and the examination information is made on the basis of the user ID and the examination date/time information. The diagnostic information recording unit 77 stores the diagnostic information received by the diagnostic information receiving unit 76 in the medical examination information storage unit 70A in association with the examination information that was the target of the diagnosis. The examination information targeted by the diagnosis is identified on the basis of the examination date/time information.
 The medical examination information display unit 79 displays, on a display, the examination information stored in the medical examination information storage unit 70A and the diagnostic information (symptom degree information) associated with that examination information. The screen displayed by the medical examination information display unit 79 is, for example, a list of one or more sets of information, each set consisting of examination information, examination date/time information, and diagnostic information. Alternatively, the diagnostic information may be added to a medical chart screen that displays the examination information and the examination date/time information.
 As shown in FIG. 14, the server device 300 according to the third embodiment includes, in place of the answer information acquisition unit 46, diagnostic information acquisition unit 41, diagnostic information providing unit 44, examination-use information providing unit 45, and answer information storage unit 40A shown in FIG. 11, a medical examination information acquisition unit 66, a diagnostic information acquisition unit 61, a diagnostic information providing unit 64, a diagnosis result information providing unit 65, and a medical examination information storage unit 60A.
 The medical examination information acquisition unit 66 acquires the examination information transmitted by the medical examination information transmission unit 73 of the medical institution terminal 200, and stores it in the medical examination information storage unit 60A together with the examination date/time information and the user ID.
 The diagnostic information acquisition unit 61 analyzes the examination information acquired by the medical examination information acquisition unit 66 to acquire, as diagnostic information, symptom degree information indicating the degree of a specific symptom relating to the user's physical or mental state, and stores it in the medical examination information storage unit 60A in association with the examination information and the user ID. For example, the diagnostic information acquisition unit 61 classifies the degree of the symptom into one of a plurality of levels on the basis of the examination information, and acquires information indicating the classified level as the symptom degree information. In this embodiment, as the simplest example, the degree of the symptom is classified into two levels, "large" and "small".
 Classification according to the degree of the symptom based on the examination information can be performed, for example, using a classification model generated by machine learning. For example, by performing machine learning using examination information obtained from a plurality of people as training data, it is possible to generate a classification model in which the parameters of a neural network are adjusted so that symptom degree information is output when examination information is input. Instead of a neural network model, the classification model may also be generated in any other form, such as a regression model, a tree model, a Bayesian model, or a clustering model.
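The classification step above can be sketched in miniature. The following is an illustrative sketch only, not part of the disclosed embodiment: in place of the neural network or other models mentioned, it trains a simple nearest-centroid classifier on hypothetical examination feature vectors (the feature choices and values are assumptions) and outputs one of the two symptom levels.

```python
# Illustrative sketch: classify examination feature vectors into the two
# symptom levels ("large" / "small") with a nearest-centroid model trained
# from labelled examples. All feature names and values are hypothetical.

def train_centroids(samples):
    """samples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is closest in squared distance."""
    def sqdist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: sqdist(centroids[label]))

# Hypothetical training data: (heart-rate deviation, reported pain 0-10)
training = [
    ([5.0, 2.0], "small"), ([8.0, 3.0], "small"),
    ([25.0, 8.0], "large"), ([30.0, 9.0], "large"),
]
model = train_centroids(training)
print(classify(model, [27.0, 7.0]))  # falls nearest the "large" centroid
```

A production system would of course use a trained model of the kinds named in the text; the point here is only the input/output contract of the classification step.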
 The diagnostic information providing unit 64 provides the diagnostic information (symptom degree information) acquired by the diagnostic information acquisition unit 61 to the medical institution terminal 200 in association with the examination information acquired by the medical examination information acquisition unit 66. That is, the diagnostic information providing unit 64 provides the medical institution terminal 200 with the diagnostic information stored in the medical examination information storage unit 60A in association with the examination information, together with examination date/time information that makes it possible to identify that examination information. The diagnostic information thus provided together with the examination date/time information is received by the diagnostic information receiving unit 76 of the medical institution terminal 200, and is stored by the diagnostic information recording unit 77 in the medical examination information storage unit 70A in association with the examination information corresponding to the examination date/time information.
 The diagnosis result information providing unit 65 provides the user terminal 100, as diagnosis result information, with the examination information acquired by the medical examination information acquisition unit 66 and the diagnostic information acquired by the diagnostic information acquisition unit 61 (that is, the examination information stored in the medical examination information storage unit 60A and the symptom degree information associated with it).
 As shown in FIG. 13, the user terminal 100 according to the third embodiment does not include, as functional components, the interview execution unit 21, the answer information recording unit 22, or the answer information transmission unit 23 shown in FIG. 10. Further, in place of the diagnostic information receiving unit 26, diagnostic information recording unit 27, answer information display unit 29, and answer information storage unit 20A shown in FIG. 10, the user terminal 100 according to the third embodiment includes a diagnosis result information receiving unit 56, a diagnosis result information recording unit 57, a diagnosis result information display unit 59, and a diagnosis result information storage unit 50A.
 The diagnosis result information receiving unit 56 receives the diagnosis result information provided from the server device 300. The diagnosis result information recording unit 57 stores the diagnosis result information received by the diagnosis result information receiving unit 56 in the diagnosis result information storage unit 50A. The diagnosis result information display unit 59 displays, on a display, the diagnosis result information stored in the diagnosis result information storage unit 50A.
 In the third embodiment configured as described above as well, the user can be provided with appropriate vibration information having a tactile quality corresponding to the degree of a symptom relating to the user's physical or mental state (the result of an examination by a doctor). The third embodiment also provides a mechanism by which the user and the medical institution share the examination information indicating the result of the doctor's examination and the diagnostic information (symptom degree information) obtained by analyzing that result. This allows the user, when using the vibration information to improve a disorder or symptom, to communicate with the medical institution on the basis of the shared information and to receive appropriate advice from the medical institution.
 In the first to third embodiments described above, the vibration information storage units 30B and 40B store, as vibration information, the vibration waveform (the vibration entity) that serves as the source of the vibration, and the vibration entity is provided from the server device 300 to the user terminal 100; however, the present invention is not limited to this. For example, the vibration information storage units 30B and 40B may store, as vibration information, vibration identification information for identifying the vibration entity, and the vibration identification information may be provided from the server device 300 to the user terminal 100. In this case, the user terminal 100 downloads the vibration entity by accessing the server device 300 or another server device and making a request using the vibration identification information. The vibration identification information may be a URL (Uniform Resource Locator) for accessing the download location.
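The indirection just described (identification information first, waveform entity fetched on demand) can be sketched as follows. This is an illustrative sketch only; the ID scheme, catalog, and URLs are assumptions, not part of the embodiment.

```python
# Illustrative sketch: resolve vibration identification information to the
# location of the vibration entity (waveform data). The catalog contents
# and URL layout are hypothetical; example.com is a placeholder host.

VIBRATION_CATALOG = {
    "vib-soft-001": "https://example.com/vibrations/vib-soft-001.wav",
    "vib-hard-002": "https://example.com/vibrations/vib-hard-002.wav",
}

def resolve_vibration(vibration_id):
    """Return the download URL for a vibration ID. As the text allows,
    the ID may itself be a URL, in which case it is used directly."""
    if vibration_id.startswith(("http://", "https://")):
        return vibration_id
    url = VIBRATION_CATALOG.get(vibration_id)
    if url is None:
        raise KeyError(f"unknown vibration id: {vibration_id}")
    return url
```

The user terminal would then issue an ordinary download request against the resolved URL to obtain the waveform itself.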
 Further, in the first to third embodiments described above, for each of a plurality of vibration tactile types classified in association with at least one of hardness and roughness of tactile sensation, one piece of vibration information per type is stored in advance in the vibration information storage units 30B and 40B, and one piece of vibration information belonging to the vibration tactile type associated with the diagnosis result type is acquired from the vibration information storage units 30B and 40B; however, the present invention is not limited to this.
 For example, a plurality of pieces of vibration information per vibration tactile type may be stored in advance in the vibration information storage units 30B and 40B, and any one of the plurality of pieces of vibration information belonging to the vibration tactile type associated with the diagnosis result type may be selected and acquired from the vibration information storage units 30B and 40B. The selection in this case may be performed automatically by the vibration information acquisition units 32 and 42, or manually by the user.
 When the vibration information acquisition units 32 and 42 perform the selection automatically, the selection may be made at random, for example. Alternatively, each time a photographed face image or interview answer information is sent from a given user, the selection may be made so that the same vibration information is not provided to that user consecutively. For example, when diagnostic information is consecutively classified into the same diagnosis result type, vibration information different from the previously selected vibration information is acquired from among the plurality of pieces of vibration information belonging to the vibration tactile type corresponding to that diagnosis result type.
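The non-repeating random selection described above amounts to excluding the previously served item from the candidate pool before drawing. A minimal sketch, with hypothetical candidate IDs:

```python
# Illustrative sketch: pick one vibration information item from the
# candidates of the matched vibration tactile type, avoiding the item
# served to the same user last time (unless it is the only candidate).
import random

def select_vibration(candidates, last_served):
    """candidates: vibration IDs for the matched tactile type.
    last_served: the ID previously provided to this user, or None."""
    pool = [c for c in candidates if c != last_served] or list(candidates)
    return random.choice(pool)
```

When the pool would become empty (a single-candidate type), the sketch falls back to repeating the sole item, which matches the premise that the rule only matters when several candidates exist.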
 On the other hand, when the user selects the vibration information manually, for example, vibration identification information representing the plurality of pieces of vibration information belonging to the vibration tactile type associated with the diagnosis result type may be presented on the user terminal 100, and the vibration information (vibration entity) corresponding to the vibration identification information selected by the user from among them may be selected. Comment information describing the tactile feel of the vibration and the like may be presented to the user together with the vibration identification information.
 Further, in the first to third embodiments described above, examples were described in which hardness information indicating the degree of stiffness of facial expression and distortion information indicating the degree of facial distortion, and symptom degree information indicating the result of a diagnosis, made by the user or by a doctor, of the degree of a specific symptom, are used as diagnostic information relating to the user's physical or mental state; however, the present invention is not limited to this. For example, with regard to the user's biological information measured in relation to the user's physical or mental state, deviation degree information indicating the degree of deviation of the biological information from a standard or normal value may be used as the diagnostic information, and vibration information having a tactile quality corresponding to the deviation degree information may be provided. Alternatively, symptom degree information or the like may be acquired as the diagnostic information by analyzing the user's physical or mental state on the basis of audio information of the user's voice.
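The deviation-degree variant can be sketched as a simple normalization against a standard value. This is an illustrative sketch only; the example vital sign, standard value, normal-range width, and two-level threshold are all assumptions.

```python
# Illustrative sketch: compute a deviation degree for one item of biological
# information relative to a standard value, then map it onto the two-level
# symptom scheme used in the text. All numbers are hypothetical.

def deviation_degree(measured, standard, normal_width):
    """Absolute deviation from the standard value, normalized by the
    width of the normal range."""
    return abs(measured - standard) / normal_width

def classify_deviation(degree, threshold=1.0):
    """Degrees beyond the normal range count as a 'large' deviation."""
    return "large" if degree > threshold else "small"

# Example: resting heart rate, assumed standard 65 bpm, normal width 15 bpm
d = deviation_degree(95, 65, 15)  # 30 / 15 = 2.0
print(classify_deviation(d))
```

A real system would combine several such measurements; the sketch shows only the shape of the deviation-to-diagnostic-information mapping.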
 Further, in the first to third embodiments described above, the server device 300 includes the diagnostic information acquisition units 31 and 41, the vibration information acquisition units 32 and 42, and the vibration information providing unit 33; however, the present invention is not limited to this. For example, the user terminal 100 may include the diagnostic information acquisition units 31 and 41 and transmit the diagnostic information from the user terminal 100 to the server device 300. Alternatively, the user terminal 100 may include all of the functional components of the server device 300. In this case, the vibration information providing unit 33 may present the vibration identification information to the user, and the vibration entity may be downloaded by accessing the server device 300 or another server device from the user terminal 100 and making a request using the vibration identification information.
 Further, although the first to third embodiments described above include the medical institution terminal 200, the first and second embodiments may be configured without the medical institution terminal 200.
 The first to third embodiments described above are all merely examples of concrete implementations of the present invention, and the technical scope of the present invention must not be construed as being limited by them. That is, the present invention can be carried out in various forms without departing from its gist or principal features.
 10A face image storage unit
 10B vibration information storage unit
 11 face image photographing unit
 12 face image recording unit
 13 face image transmission unit
 14 vibration information receiving unit
 15 vibration information recording unit
 16 diagnostic information receiving unit
 17 diagnostic information recording unit
 18 vibration applying unit
 19 face image display unit
 20A answer information storage unit
 21 interview execution unit
 22 answer information recording unit
 23 answer information transmission unit
 26 diagnostic information receiving unit
 27 diagnostic information recording unit
 29 answer information display unit
 30A face image storage unit
 30B vibration information storage unit
 31 diagnostic information acquisition unit
 31a face image acquisition unit
 31b image analysis unit
 32 vibration information acquisition unit
 33 vibration information providing unit
 34 diagnostic information providing unit
 35 examination-use information providing unit
 40A answer information storage unit
 40B vibration information storage unit
 41 diagnostic information acquisition unit
 42 vibration information acquisition unit
 44 diagnostic information providing unit
 45 examination-use information providing unit
 46 answer information acquisition unit
 50A diagnosis result information storage unit
 56 diagnosis result information receiving unit
 57 diagnosis result information recording unit
 59 diagnosis result information display unit
 60A medical examination information storage unit
 61 diagnostic information acquisition unit
 64 diagnostic information providing unit
 65 diagnosis result information providing unit
 66 medical examination information acquisition unit
 70A medical examination information storage unit
 71 medical examination information input unit
 72 medical examination information recording unit
 73 medical examination information transmission unit
 76 diagnostic information receiving unit
 77 diagnostic information recording unit
 79 medical examination information display unit
 100 user terminal
 200 medical institution terminal
 300 server device

Claims (26)

  1.  A vibration information providing system comprising:
     a diagnostic information acquisition unit that acquires diagnostic information relating to a physical state or a mental state of a user;
     a vibration information acquisition unit that acquires, from among a plurality of pieces of vibration information relating to vibrations that differ in tactile quality, which is a property of vibration that produces a specific tactile sensation, vibration information relating to a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit; and
     a vibration information providing unit that provides the user with the vibration information acquired by the vibration information acquisition unit.
  2.  The vibration information providing system according to claim 1, wherein the diagnostic information acquisition unit comprises:
     a face image acquisition unit that acquires a photographed image of the user's face; and
     an image analysis unit that obtains the diagnostic information by analyzing the physical state or the mental state of the user on the basis of the photographed image of the face acquired by the face image acquisition unit.
  3.  The vibration information providing system according to claim 2, wherein
     the image analysis unit acquires, as the diagnostic information, at least one of hardness information indicating a degree of stiffness of facial expression and distortion information indicating a degree of facial distortion by performing image analysis on the photographed image of the face acquired by the face image acquisition unit, and
     the vibration information acquisition unit acquires vibration information relating to a vibration having a tactile quality corresponding to at least one of the hardness information and the distortion information.
  4.  The vibration information providing system according to claim 1, wherein
     the diagnostic information acquisition unit acquires, as the diagnostic information, symptom degree information indicating a result of a diagnosis, made by the user or by a doctor, of a degree of a specific symptom relating to the physical state or the mental state, and
     the vibration information acquisition unit acquires vibration information relating to a vibration having a tactile quality corresponding to the symptom degree information.
  5.  The vibration information providing system according to any one of claims 1 to 4, comprising a user terminal used by the user and a server device that provides information in response to access from the user terminal, wherein
     the server device comprises the diagnostic information acquisition unit, the vibration information acquisition unit, and the vibration information providing unit, and
     the vibration information providing unit provides the user terminal with the vibration information acquired by the vibration information acquisition unit.
  6.  The vibration information providing system according to claim 2 or 3, comprising a user terminal used by the user and a server device that provides information in response to access from the user terminal, wherein
     the user terminal comprises:
     a face image photographing unit that photographs the user's face;
     a face image transmission unit that transmits a photographed image of the face obtained by the face image photographing unit to the server device; and
     a vibration information receiving unit that receives the vibration information provided from the server device,
     the server device comprises the face image acquisition unit, the image analysis unit, the vibration information acquisition unit, and the vibration information providing unit,
     the face image acquisition unit acquires the photographed image of the face transmitted by the face image transmission unit of the user terminal, and
     the vibration information providing unit provides the user terminal with the vibration information acquired by the vibration information acquisition unit.
  7.  The vibration information providing system according to claim 6, wherein the server device further comprises a diagnostic information providing unit that provides the user terminal with the diagnostic information acquired by the image analysis unit in association with the photographed image of the face acquired by the face image acquisition unit.
  8.  The vibration information providing system according to claim 6 or 7, further comprising a medical institution terminal used at a medical institution, wherein
     the server device further comprises an examination-use information providing unit that provides the medical institution terminal, as information for medical examination, with the photographed image of the face acquired by the face image acquisition unit and the diagnostic information acquired by the image analysis unit.
  9.  The vibration information providing system according to claim 4, comprising a user terminal used by the user and a server device that provides information in response to access from the user terminal, wherein
     the user terminal comprises:
     an interview execution unit that provides the user with interview information for self-diagnosis of the specific symptom and acquires answer information indicating answers input by the user in response to the interview information;
     an answer information transmission unit that transmits the answer information acquired by the interview execution unit to the server device; and
     a vibration information receiving unit that receives the vibration information provided from the server device,
     the server device comprises:
     an answer information acquisition unit that acquires the answer information transmitted by the answer information transmission unit; and
     the diagnostic information acquisition unit, the vibration information acquisition unit, and the vibration information providing unit,
     the diagnostic information acquisition unit acquires the symptom degree information by analyzing the answer information acquired by the answer information acquisition unit, and
     the vibration information providing unit provides the user terminal with the vibration information acquired by the vibration information acquisition unit.
  10.  The vibration information providing system according to claim 9, wherein the server device further comprises a diagnostic information providing unit that provides the user terminal with the diagnostic information acquired by the diagnostic information acquisition unit in association with the answer information acquired by the answer information acquisition unit.
  11.  The vibration information providing system according to claim 9 or 10, further comprising a medical institution terminal used at a medical institution, wherein
     the server device further comprises an examination-use information providing unit that provides the medical institution terminal, as information for medical examination, with the answer information acquired by the answer information acquisition unit and the diagnostic information acquired by the diagnostic information acquisition unit.
  12.  The vibration information providing system according to claim 4, comprising a user terminal used by the user, a medical institution terminal used at a medical institution, and a server device that provides information in response to access from the user terminal and the medical institution terminal, wherein
     the user terminal comprises a vibration information receiving unit that receives the vibration information provided from the server device,
     the medical institution terminal comprises:
     a medical examination information input unit that inputs medical examination information indicating a result of a medical examination performed on the user by a doctor with respect to the specific symptom; and
     a medical examination information transmission unit that transmits the medical examination information input via the medical examination information input unit to the server device,
     the server device comprises:
     a medical examination information acquisition unit that acquires the medical examination information transmitted by the medical examination information transmission unit; and
     the diagnostic information acquisition unit, the vibration information acquisition unit, and the vibration information providing unit,
     the diagnostic information acquisition unit acquires the symptom degree information by analyzing the medical examination information acquired by the medical examination information acquisition unit, and
     the vibration information providing unit provides the user terminal with the vibration information acquired by the vibration information acquisition unit.
  13.  The vibration information providing system according to claim 12, wherein the server device further comprises a diagnostic information providing unit that provides the medical institution terminal with the diagnostic information acquired by the diagnostic information acquisition unit in association with the medical examination information acquired by the medical examination information acquisition unit.
  14.  The vibration information providing system according to claim 12 or 13, wherein the server device further comprises a diagnosis result information providing unit that provides the user terminal with the medical examination information acquired by the medical examination information acquisition unit and the diagnostic information acquired by the diagnostic information acquisition unit as diagnosis result information.
  15.  A vibration information providing server device comprising:
     a diagnostic information acquisition unit that acquires diagnostic information on a user's physical state or mental state;
     a vibration information acquisition unit that acquires, from among a plurality of pieces of vibration information on vibrations differing in tactile quality, which is a property of vibration that produces a specific tactile sensation, vibration information on a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit; and
     a vibration information providing unit that provides the user with the vibration information acquired by the vibration information acquisition unit.
  16.  The vibration information providing server device according to claim 15, wherein the diagnostic information acquisition unit comprises:
     a face image acquisition unit that acquires a photographed image of the user's face; and
     an image analysis unit that obtains the diagnostic information by analyzing the user's physical state or mental state on the basis of the photographed image of the face acquired by the face image acquisition unit.
  17.  The vibration information providing server device according to claim 16, wherein the image analysis unit acquires, as the diagnostic information, at least one of hardness information indicating the degree of hardness of the facial expression and distortion information indicating the degree of distortion of the face by analyzing the photographed image of the face acquired by the face image acquisition unit, and
     the vibration information acquisition unit acquires vibration information having a tactile quality corresponding to at least one of the hardness information and the distortion information.
  18.  The vibration information providing server device according to claim 15, wherein the diagnostic information acquisition unit acquires, as the diagnostic information, symptom degree information indicating a result of self-diagnosis or diagnosis by a doctor regarding the degree of a specific symptom related to the physical state or the mental state, and
     the vibration information acquisition unit acquires vibration information on a vibration having a tactile quality corresponding to the symptom degree information.
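The selection recited in claims 15 to 18 can be illustrated with a minimal sketch: a tactile quality is derived from diagnostic information (here, the hardness and distortion scores of claim 17) and used to pick one of several stored vibration entries. All names, score ranges, and the combination rule are hypothetical; the claims deliberately leave the analysis and the tactile-quality correspondence unspecified.

```python
# Hypothetical sketch of the server-side selection of claims 15-18.
# The scoring rule below is an illustrative assumption, not the claimed method.

# Stored vibration information, keyed by tactile quality.
VIBRATION_LIBRARY = {
    "soft":   {"waveform": "slow_sine", "amplitude": 0.3},
    "medium": {"waveform": "pulse",     "amplitude": 0.6},
    "hard":   {"waveform": "sharp_saw", "amplitude": 0.9},
}

def select_vibration(hardness: float, distortion: float) -> dict:
    """Map face-analysis scores (each in 0.0-1.0) to a stored vibration entry."""
    score = max(hardness, distortion)  # assumed rule: the worse indicator dominates
    if score < 0.33:
        quality = "soft"
    elif score < 0.66:
        quality = "medium"
    else:
        quality = "hard"
    return VIBRATION_LIBRARY[quality]
```

A relaxed face (low hardness and distortion) thus maps to a soft vibration entry, while a strained or distorted expression maps to a harder one.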
  19.  A vibration information acquisition device implemented in the user terminal in the vibration information providing system according to claim 6, comprising the face image capturing unit, the face image transmitting unit, and the vibration information receiving unit.
  20.  A vibration information acquisition device implemented in the user terminal in the vibration information providing system according to claim 7, comprising:
     the face image capturing unit;
     the face image transmitting unit;
     the vibration information receiving unit; and
     a diagnostic information receiving unit that receives the diagnostic information provided from the server device by the diagnostic information providing unit in association with the photographed image of the face.
  21.  A vibration information acquisition device implemented in the user terminal in the vibration information providing system according to claim 9, comprising the medical interview execution unit, the answer information transmitting unit, and the vibration information receiving unit.
  22.  A vibration information acquisition device implemented in the user terminal in the vibration information providing system according to claim 10, comprising:
     the medical interview execution unit;
     the answer information transmitting unit;
     the vibration information receiving unit; and
     a diagnostic information receiving unit that receives the diagnostic information provided from the server device by the diagnostic information providing unit in association with the answer information.
  23.  A vibration information providing method comprising:
     a first step in which a diagnostic information acquisition unit of a computer acquires diagnostic information on a user's physical state or mental state;
     a second step in which a vibration information acquisition unit of the computer acquires, from among a plurality of pieces of vibration information on vibrations differing in tactile quality, which is a property of vibration that produces a specific tactile sensation, vibration information on a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition unit; and
     a third step in which a vibration information providing unit of the computer provides the user with the vibration information acquired by the vibration information acquisition unit.
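The three steps of the method of claim 23 can be sketched as a single pipeline. The helper functions below stand in for the acquisition and providing units; their names, the stubbed data, and the degree-to-quality table are all hypothetical, since the claim recites only the step structure.

```python
# Hypothetical end-to-end sketch of the three-step method of claim 23.

def acquire_diagnostic_info(user_id: str) -> dict:
    """First step: obtain diagnostic information (stubbed self-reported degree)."""
    return {"user": user_id, "symptom_degree": 2}

def acquire_vibration_info(diag: dict) -> dict:
    """Second step: pick vibration info whose tactile quality matches the diagnosis."""
    by_degree = {0: "soft", 1: "soft", 2: "medium", 3: "hard"}  # assumed table
    return {"tactile_quality": by_degree[diag["symptom_degree"]]}

def provide_vibration_info(user_id: str, vib: dict) -> str:
    """Third step: deliver the vibration information (stubbed as a message)."""
    return f"sent {vib['tactile_quality']} vibration to {user_id}"

def run_method(user_id: str) -> str:
    diag = acquire_diagnostic_info(user_id)      # first step
    vib = acquire_vibration_info(diag)           # second step
    return provide_vibration_info(user_id, vib)  # third step
```

In a real deployment each step would be carried out by the corresponding unit of the server device rather than by local stubs.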
  24.  A vibration information providing program for causing a computer to function as:
     diagnostic information acquisition means for acquiring diagnostic information on a user's physical state or mental state;
     vibration information acquisition means for acquiring, from among a plurality of pieces of vibration information on vibrations differing in tactile quality, which is a property of vibration that produces a specific tactile sensation, vibration information on a vibration having a tactile quality corresponding to the diagnostic information acquired by the diagnostic information acquisition means; and
     vibration information providing means for providing the user with the vibration information acquired by the vibration information acquisition means.
  25.  A vibration information acquisition program installed in the user terminal in the vibration information providing system according to claim 6, the program causing a computer to function as:
     face image capturing means that provides the function of the face image capturing unit;
     face image transmitting means that provides the function of the face image transmitting unit; and
     vibration information receiving means that provides the function of the vibration information receiving unit.
  26.  A vibration information acquisition program installed in the user terminal in the vibration information providing system according to claim 9, the program causing a computer to function as:
     medical interview execution means that provides the function of the medical interview execution unit;
     answer information transmitting means that provides the function of the answer information transmitting unit; and
     vibration information receiving means that provides the function of the vibration information receiving unit.
PCT/JP2021/009158 2021-03-09 2021-03-09 Vibration information-providing system, vibration information-providing server device, vibration information acquisition device, vibration information-providing method, program for vibration information provision and program for vibration information acquisition WO2022190187A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2021/009158 WO2022190187A1 (en) 2021-03-09 2021-03-09 Vibration information-providing system, vibration information-providing server device, vibration information acquisition device, vibration information-providing method, program for vibration information provision and program for vibration information acquisition
JP2023504902A JP7554515B2 (en) 2021-03-09 2021-03-09 Vibration information providing system, vibration information providing server device, vibration information acquiring device, vibration information providing method, vibration information providing program, and vibration information acquiring program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/009158 WO2022190187A1 (en) 2021-03-09 2021-03-09 Vibration information-providing system, vibration information-providing server device, vibration information acquisition device, vibration information-providing method, program for vibration information provision and program for vibration information acquisition

Publications (1)

Publication Number Publication Date
WO2022190187A1 (en)

Family

ID=83227539

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/009158 WO2022190187A1 (en) 2021-03-09 2021-03-09 Vibration information-providing system, vibration information-providing server device, vibration information acquisition device, vibration information-providing method, program for vibration information provision and program for vibration information acquisition

Country Status (2)

Country Link
JP (1) JP7554515B2 (en)
WO (1) WO2022190187A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004357821A (en) * 2003-06-03 2004-12-24 Tanita Corp Stress diagnosis system
JP2010530281A * 2007-06-21 2010-09-09 Immersion Corporation Health monitoring using tactile feedback
JP2016523139A * 2013-06-06 2016-08-08 Tricord Holdings, L.L.C. Modular physiological monitoring system, kit, and method
JP2020120908A * 2019-01-30 2020-08-13 Panasonic IP Management Co., Ltd. Mental state estimation system, mental state estimation method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8004391B2 (en) * 2008-11-19 2011-08-23 Immersion Corporation Method and apparatus for generating mood-based haptic feedback
US20160262690A1 (en) * 2015-03-12 2016-09-15 Mediatek Inc. Method for managing sleep quality and apparatus utilizing the same
JP6742196B2 (en) * 2016-08-24 2020-08-19 Cyberdyne株式会社 Life activity detection device and life activity detection system


Also Published As

Publication number Publication date
JP7554515B2 (en) 2024-09-20
JPWO2022190187A1 (en) 2022-09-15

Similar Documents

Publication Publication Date Title
Gürkök et al. Brain–computer interfaces for multimodal interaction: a survey and principles
US10475351B2 (en) Systems, computer medium and methods for management training systems
US20170352283A1 (en) Self-administered evaluation and training method to improve mental state
Kim A SWOT analysis of the field of virtual reality rehabilitation and therapy
US9814423B2 (en) Method and system for monitoring pain of users immersed in virtual reality environment
JP2019513516A (en) Methods and systems for acquiring, aggregating and analyzing visual data to assess human visual performance
US11779275B2 (en) Multi-sensory, assistive wearable technology, and method of providing sensory relief using same
Ramadan et al. Unraveling the potential of brain-computer interface technology in medical diagnostics and rehabilitation: A comprehensive literature review
Meusel Exploring mental effort and nausea via electrodermal activity within scenario-based tasks
Fortin et al. Laughter and tickles: Toward novel approaches for emotion and behavior elicitation
Akdağ et al. Measuring tactile sensitivity and mixed-reality-assisted exercise for carpal tunnel syndrome by ultrasound mid-air haptics
WO2022190187A1 (en) Vibration information-providing system, vibration information-providing server device, vibration information acquisition device, vibration information-providing method, program for vibration information provision and program for vibration information acquisition
Elor Development and evaluation of intelligent immersive virtual reality games to assist physical rehabilitation
Antunes et al. Modeling serious games design towards engaging children with special needs in therapy
Rosenkranz et al. Sound perception in realistic surgery scenarios: Towards EEG-based auditory work strain measures for medical personnel
KR20160109101A (en) System diagnosing adhd using touchscreen
Şahinol Collecting Data and the Status of the Research Subject in Brain-Machine Interface Research in Chronic Stroke Rehabilitation
Dinh et al. NeuResonance: Exploring Feedback Experiences for Fostering the Inter-brain Synchronization
Teruel et al. Exploiting awareness for the development of collaborative rehabilitation systems
Grant Behavioural and neurophysiological measures of haptic feedback during a drilling simulation
JP2021083654A (en) Attention function training system, attention function training method, training processing apparatus, and computer program
Mohamad et al. Study on elicitation and detection of emotional states with disabled users
Templeton Towards a Comprehensive Mobile-Based Neurocognitive Digital Health Assessment System (NDHAS)
Miri et al. Emotion Regulation in the Wild: The WEHAB Approach
McDowell et al. OPEN-SOURCE SOFTWARE FOR DATA ACQUISITION AND CONTROL IN HUMAN MOTION: A CASE STUDY ON THE ACTIVE BALANCE BOARD

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21929431

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023504902

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21929431

Country of ref document: EP

Kind code of ref document: A1