
GB2639263A - A booth - Google Patents

A booth

Info

Publication number
GB2639263A
GB2639263A (application GB2403713.7A; also published as GB202403713D0)
Authority
GB
United Kingdom
Prior art keywords
booth
occupant
health state
medical data
physical attributes
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2403713.7A
Other versions
GB202403713D0 (en)
Inventor
Moodley Chane
Moodley Devan
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Health Connect Global Ltd
Original Assignee
Health Connect Global Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Health Connect Global Ltd filed Critical Health Connect Global Ltd
Priority to GB2403713.7A priority Critical patent/GB2639263A/en
Publication of GB202403713D0 publication Critical patent/GB202403713D0/en
Publication of GB2639263A publication Critical patent/GB2639263A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6888 Cabins
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 involving training the classification device
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H 40/60 for the operation of medical equipment or devices
    • G16H 40/63 for local operation
    • G16H 40/67 for remote operation
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/30 for calculating health indices; for individual health risk assessment
    • G16H 50/70 for mining of medical data, e.g. analysing previous cases of other patients

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Pathology (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Surgery (AREA)
  • Artificial Intelligence (AREA)
  • Business, Economics & Management (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Databases & Information Systems (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • General Business, Economics & Management (AREA)
  • Evolutionary Computation (AREA)
  • Fuzzy Systems (AREA)
  • Mathematical Physics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physiology (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Medical Treatment And Welfare Office Work (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A medical booth 10 for accommodating an occupant, comprising sensors 14 for sensing physical attributes of the occupant; an information retrieval module (26, figure 3) configured to retrieve medical data from a medical data repository based on the physical attributes; and a user interface 12 configured to display the retrieved medical data for the occupant to view. The sensors may include a weighing scale, a finger touch electrocardiograph, a laser for height measurement, a sphygmomanometer, and a camera. A neural network may be used for machine learning. A health state detection module (224, figure 6) includes a mathematical rules-based model (28, figure 6), a comparison module (30, figure 6), and a classification module (34, figure 6). A machine learning model may be trained to classify the occupant's health state using the physical attributes and the classification from the classification module (34, figure 6) as inputs.

Description

A BOOTH
FIELD
[1] The field relates to booths, and in particular to booths dimensioned to accommodate a human occupant wishing to retrieve medical data.
BACKGROUND
[2] An individual who currently wishes to seek medical information has various options, including seeking medical advice from a medical professional and seeking information themselves, for example by searching the internet. The former usually happens only when the individual believes something is wrong and has symptoms, and it is having such symptoms that can lead an individual to seek medical information from the internet.
[3] However, the information retrieved is only as good as the information supplied to the medical professional or entered into the internet search.
[4] In addition, one set of information is not relevant for all patients. For example, an Asian patient who is obese may require different information to a Caucasian patient with obesity.
[5] It is an aim of the subject-matter of the present disclosure to improve on the prior art by alleviating such issues.
SUMMARY
[6] The system and method disclosed in this invention aim to provide a novel, user-centred information and education system with a uniquely defined human-computer interaction (HCI) interface. Embodiments of this invention describe a novel, individually tailored approach based on artificial intelligence in a uniquely developed intelligent information booth, or pod, setting. As such, the terms booth and pod may be used interchangeably throughout this disclosure.
[7] According to an aspect of the present disclosure, there is provided a booth for accommodating an occupant, the booth comprising: a sensor for sensing physical attributes of the occupant; an information retrieval module configured to retrieve medical data from a medical data repository based on the physical attributes; and a user interface configured to display the retrieved medical data for the occupant to view.
[8] In an embodiment, the sensor comprises one or more sensors selected from a list of sensors including a camera, a laser, a pressure sensor, a weighing scale, a touch electrocardiograph, and a sphygmomanometer.
[9] In an embodiment, the physical attributes are selected from a list of physical attributes including height, weight, body mass index, skin colour, blood pressure, pulse rate, heart rhythm, and breathing pattern.
[10] In an embodiment, the booth further comprises an identification module including a biometrics sensor to sense one or more biometric features, and a permission module configured to grant access to the booth if a user is known and refuse permission to the booth if the user is unknown.
[11] In an embodiment, the biometric features include one or more of a face, an eye, and a fingerprint.
[12] In an embodiment, the booth further comprises: a health state detection module configured to detect a health state of the occupant based on the sensed physical attributes, wherein the information retrieval module is configured to retrieve medical data relating to the health state from a medical data repository.
[13] In an embodiment, the health state detection module comprises a mathematical rules-based model configured to receive the sensed physical attributes and compute a health state of the occupant from the physical attributes, a comparison module configured to compare the computed health state to one or more health state thresholds, and a classification module configured to classify the occupant's health state based on the comparison.
[14] In an embodiment, the health state detection module comprises a machine learning model trained to classify the occupant's health state using the sensed physical attributes as inputs.
[15] In an embodiment, the machine learning model is trained to classify the occupant's health state using the sensed physical attributes and the occupant's health state classified by the classification module as inputs.
[16] In an embodiment, the machine learning model comprises a neural network.
[17] In an embodiment, the information retrieval module is configured to retrieve the medical data by parsing one or more of text in the medical data, text labelling the medical data, or metadata associated with the medical data.
[18] In an embodiment, the medical data comprises medical journals, medical videos, medical articles, patient notes, or patient medical images.
[19] In an embodiment, the user interface comprises a user input module for receiving user inputs to interact with the medical data.
[20] In an embodiment, the booth further comprises a communication module for enabling a remote user to access the user interface and/or sensors in the booth for communicating with the occupant during use.
[21] In an embodiment, the booth further comprises a plurality of walls forming an enclosure from an external environment, wherein the sensor and the user interface are mounted to one or more of the plurality of walls, and wherein one of the plurality of walls is a door enabling access to the occupant.
BRIEF DESCRIPTION OF DRAWINGS
[22] The subject-matter of the present disclosure is best described with reference to the accompanying figures, in which:
[23] Figure 1 shows a schematic of a booth according to one or more embodiments;
[24] Figure 2 shows an interior of the booth of Figure 1;
[25] Figure 3 shows a block diagram representing functional elements of the booth of Figure 1;
[26] Figure 4 shows a block diagram of a health state detection module forming one of the functional elements of the block diagram of Figure 3;
[27] Figure 5 shows a block diagram of an alternative health state detection module to that of Figure 4; and
[28] Figure 6 shows a block diagram of a further health state detection module to those of Figure 4 and Figure 5.
DESCRIPTION OF EMBODIMENTS
[29] The embodiments described herein are embodied as sets of instructions stored as electronic data in one or more storage media. Specifically, the instructions may be provided on a transitory or non-transitory computer-readable medium. When the instructions are executed by a processor, the processor is configured to perform the various methods described in the following embodiments. In this way, the methods may be computer-implemented methods.
[30] Whilst the following embodiments provide specific illustrative examples, those illustrative examples should not be taken as limiting, and the scope of protection is defined by the claims. Features from specific embodiments may be used in combination with features from other embodiments without extending the subject-matter beyond the content of the present disclosure.
[31] Before describing the embodiments in detail, the present disclosure can be summarised as follows.
[32] The invention being disclosed consists of a system and method for individually tailored information sharing, retrieval and knowledge acquisition. The disclosure pertains to healthcare and wellbeing, while providing a novel system and method for healthcare training and education purposes. In this disclosure the system is composed of many known electronic components working coherently to deliver a unique and novel experience to the user, while the method allows for practical information dissemination. Additionally, through the use of artificial intelligence algorithms such as facial recognition and general computer vision tailored for health assessment, the user can gain immediate access to information by simply walking into the intelligent information pod. The intelligent information pod is fully self-controlled and requires no external staff members to operate it. Remote access can, however, be gained by authorised personnel through a series of functional APIs incorporated into a secure native application. A user will walk into the intelligent information pod, which consists of scanning systems and walls built entirely of screens, and several scanning systems will initialise and commence scanning. The scanning systems comprise equipment such as finger touch electrocardiography, scales for weight, lasers for height, sphygmomanometers, and cameras for computer vision algorithms, breath sensing, skin paleness detection, and several other detection mechanisms detailed in the description and embodiments sections. Upon completion of these scans, the data will be cross-referenced by the intelligent algorithm system to display useful information to the user based on any anomalies detected; this is detailed further in the embodiments section. The system and method work simultaneously to feed into each other, enhancing the novelty of the disclosure.
[33] With reference to Figures 1 and 2, the intelligent information pod, or booth, although considered as a system and a method, comprises a physical pod, booth, or even room, made up of a plurality of walls forming an enclosure to enclose a human occupant from an external environment. Interior faces of the booth walls may include user interfaces 12 and sensors 14. The user interfaces 12 and sensors 14 may be mounted to one or more of the plurality of walls. One of the walls may be a door 16 having a hinged connection with an adjacent wall to enable the occupant to enter and leave the booth.
[34] Whilst not depicted in this way in Figure 2, in some embodiments, the room may be made up entirely of user interfaces as the walls.
[35] The sensors 14 may be for sensing physical attributes of the occupant. The sensors 14 may be selected from a list of sensors including a camera, a laser, a pressure sensor, a weighing scale, a touch electrocardiograph, a sphygmomanometer, and a blood pressure monitor. As above, the camera may be linked to a computer vision model trained to detect various visual parameters of the occupant from images captured by the camera. The visual parameters include skin colour (e.g. race, paleness, colour distributions across skin, etc.) and breathing pattern by breath sensing. The laser may be used to detect a height of the occupant. The pressure sensor may be used to detect the weight of the occupant, which pressure sensor may also be called a scale, or weighing scale. The touch electrocardiograph may be a finger touch electrocardiograph, which may be used to detect a pulse rate of the occupant and/or a heart rhythm. The sphygmomanometer may be used to detect a blood pressure of the occupant.
[36] In this way, the physical attributes of the occupant that the sensor can detect may be selected from a list of physical attributes including height, weight, body mass index, skin colour, blood pressure, pulse rate, heart rhythm, and breathing pattern.
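Of the listed attributes, body mass index is derived rather than sensed directly. A minimal sketch of computing it from the sensed weight and height might look as follows (the function name is illustrative, not taken from the disclosure):

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """Compute BMI from the weight and height sensed in the booth."""
    if height_m <= 0:
        raise ValueError("height must be positive")
    return weight_kg / (height_m ** 2)

# Example: an occupant weighing 80 kg at 1.75 m tall has a BMI of about 26.1.
bmi = body_mass_index(80.0, 1.75)
```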
[37] With reference specifically to Figure 1, the booth 10 may also include an identification module 20. The identification module 20 may be positioned on an exterior of one of the walls, for instance the door of the booth. The identification module may include a biometrics sensor to sense one or more biometric features of someone trying to occupy the booth. The biometrics sensor may include a camera, for example. The biometrics features may include one or more of a face of the individual, an eye of the individual, and a fingerprint of the individual. The identification module may also include a permission module configured to grant access to the booth if the individual is known and refuse permission to the booth if the user is unknown. Access to the booth is enabled by unlocking the door so the individual can open it and become an occupant within the booth. In other words, a handle 22 of the booth may be used to open the booth when the door is unlocked.
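The permission logic of the identification module can be sketched as a simple lookup of the sensed biometric against known users; everything here (identifiers, names, the matcher) is a hypothetical illustration, as the disclosure does not specify how matching is performed:

```python
# Hypothetical registry mapping a matched biometric template id to a user.
KNOWN_USERS = {"a1f3": "Alice", "9c2e": "Bob"}

def grant_access(biometric_id: str) -> bool:
    """Unlock the door if the user is known; keep it locked otherwise."""
    return biometric_id in KNOWN_USERS
```

In the booth, a positive result would release the door lock so the handle 22 can open the door.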
[38] With reference to Figure 3, the booth may be considered as a system comprising various functional elements including the sensor, or sensors, 14, a health state detection module 24, an information retrieval module 26, and the user interface, or user interfaces, 12. The health state detection module 24 is configured to detect a health state of the occupant based on the sensed physical attributes. The information retrieval module is configured to retrieve medical data relating to the health state from a medical data repository. The user interface is configured to display the retrieved medical data for the user, or occupant, to view.
[39] Three different health state detection modules are shown in Figures 4 to 6, respectively.
[40] With reference to Figure 4, the health state detection module 24 may comprise a mathematical rules-based model 28, comparison module 30, a repository of health state thresholds 32, and a classification module 34.
[41] The mathematical rules-based model 28 is configured to receive the sensed physical attributes and compute a health state of the occupant from the physical attributes.
[42] The comparison module 30 may compare the computed health state to one or more health state thresholds contained within the repository of health state thresholds 32. For instance, the health state thresholds may include values denoting certain health states such as hypertension, e.g. a blood pressure threshold of 120/80.
[43] The classification module 34 may classify the occupant's health state based on the comparison. The health state classification may be one or more of overweight, obese, underweight, hypertensive, hypotensive, tachycardic, bradycardic, or jaundiced, among others.
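The rules-based pipeline of modules 28, 30, 32, and 34 can be sketched as computing attributes, comparing them against a threshold repository, and emitting the matching classifications. This is an illustrative sketch only; the threshold values below are common clinical reference values, not figures taken from the disclosure:

```python
# Repository of health state thresholds (module 32), expressed as rules.
# Values are illustrative clinical reference points, not patent figures.
HEALTH_STATE_THRESHOLDS = {
    "hypertensive": lambda a: a["systolic"] >= 140 or a["diastolic"] >= 90,
    "hypotensive":  lambda a: a["systolic"] < 90 and a["diastolic"] < 60,
    "tachycardic":  lambda a: a["pulse_rate"] > 100,
    "bradycardic":  lambda a: a["pulse_rate"] < 60,
    "overweight":   lambda a: 25 <= a["bmi"] < 30,
    "obese":        lambda a: a["bmi"] >= 30,
    "underweight":  lambda a: a["bmi"] < 18.5,
}

def classify_health_state(attributes: dict) -> list[str]:
    """Comparison and classification (modules 30 and 34): return every
    health state whose threshold rule matches the computed attributes."""
    return [state for state, rule in HEALTH_STATE_THRESHOLDS.items()
            if rule(attributes)]

attrs = {"systolic": 150, "diastolic": 95, "pulse_rate": 110, "bmi": 31.0}
print(classify_health_state(attrs))  # ['hypertensive', 'tachycardic', 'obese']
```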
[44] With reference to Figure 5, the health state detection module 124 may comprise a machine learning model 136. The machine learning model may be trained to classify the occupant's health state using the sensed physical attributes as inputs. The machine learning model may be a supervised machine learning model. The supervised machine learning model may be a neural network.
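As a toy illustration of the machine learning model 136, a single logistic unit mapping normalised physical attributes to a probability of an at-risk health state is sketched below. The weights and feature ordering are invented for illustration; a trained neural network would learn these from labelled data:

```python
import math

# Illustrative (untrained) parameters: one weight per normalised attribute,
# e.g. BMI, systolic blood pressure, pulse rate. Not values from the patent.
WEIGHTS = [0.8, 0.6, 0.4]
BIAS = -1.0

def predict_at_risk(features: list[float]) -> float:
    """Forward pass of a single logistic unit: weighted sum plus bias,
    squashed through a sigmoid to give a probability in (0, 1)."""
    z = BIAS + sum(w * x for w, x in zip(WEIGHTS, features))
    return 1.0 / (1.0 + math.exp(-z))

p = predict_at_risk([1.2, 1.1, 0.9])
label = "at risk" if p >= 0.5 else "healthy"
```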
[45] With reference to Figure 6, the health state detection module 224 may comprise the health state detection module 24 from Figure 4 combined with the health state detection module 124 from Figure 5.
[46] For instance, the health state detection module 224 includes a mathematical rules-based model 28, the comparison module 30, the repository of health state thresholds 32, and the classification module 34. The classification by the classification module 34 may be used as an input to the machine learning model 136 in addition to the sensed physical attributes. The machine learning model may be trained to classify the occupant's health state using the sensed physical attributes and the classification from the classification module 34 as inputs.
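One plausible way to feed the classification from module 34 into the machine learning model 136, alongside the raw attributes, is to append it as a one-hot encoding. The label set and feature layout below are assumptions for illustration:

```python
# Hypothetical label vocabulary for the rule-based classification output.
RULE_LABELS = ["overweight", "obese", "hypertensive", "tachycardic"]

def build_model_inputs(attributes: list[float],
                       rule_classes: list[str]) -> list[float]:
    """Concatenate the sensed attributes with a one-hot encoding of the
    rule-based classification, forming the input vector for model 136."""
    one_hot = [1.0 if label in rule_classes else 0.0 for label in RULE_LABELS]
    return attributes + one_hot

x = build_model_inputs([26.1, 150.0, 110.0], ["hypertensive", "tachycardic"])
# x -> [26.1, 150.0, 110.0, 0.0, 0.0, 1.0, 1.0]
```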
[47] With further reference to Figure 3, the information retrieval module 26 is configured to retrieve medical data from a medical data repository 40. The information retrieval module 26 is configured to retrieve the medical data by parsing one or more of text in the medical data, text labelling the medical data, or metadata associated with the medical data. The medical data may include medical journals, medical videos (preferably whose narration has been transcribed to text), medical articles, patient notes, or patient medical images, e.g. X-rays, MRI scans, etc.
[48] The user interface of Figure 1 may include a user input module, e.g. a touch screen, for receiving user inputs to interact with the medical data. For example, the touch screen may recognise gestures of the occupant such as a swipe for scrolling, a tap for selection, separating two hand digits, e.g. a finger and a thumb, to zoom in, and closing the two hand digits to zoom out. The user input module may also include one or more cameras and a computer vision model to sense and recognise non-contact gestures, e.g. waving.
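The retrieval-by-parsing behaviour of module 26 can be sketched as matching detected health states against the text, labels, and metadata of each repository entry. The repository contents and field names below are invented for illustration:

```python
# Hypothetical medical data repository 40: each entry carries text,
# labelling tags, and a title that the retrieval module can parse.
REPOSITORY = [
    {"title": "Managing hypertension",
     "tags": ["hypertension", "blood pressure"],
     "text": "Lifestyle guidance for hypertensive patients."},
    {"title": "Healthy sleep habits",
     "tags": ["sleep"],
     "text": "General advice on sleep hygiene."},
]

def retrieve(health_states: list[str]) -> list[str]:
    """Return titles of entries whose title, text, or tags mention any
    of the detected health states (case-insensitive substring match)."""
    hits = []
    for doc in REPOSITORY:
        haystack = " ".join([doc["title"], doc["text"], *doc["tags"]]).lower()
        if any(state.lower() in haystack for state in health_states):
            hits.append(doc["title"])
    return hits

print(retrieve(["hypertension"]))  # ['Managing hypertension']
```

A production system would likely use a proper search index rather than substring matching, but the filtering principle is the same.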
[49] The booth may also include a communication module (not shown) and a functional application programming interface (API) allowing a remote user to access and communicate with an occupant in the booth. The remote user may be a medical professional, a tutor, or an occupant of another booth, for example.
[50] The operation of the booth may be summarised as follows.
[51] A user will walk up to the intelligent information pod's door, where cameras and facial recognition algorithms of the identification module determine whether this is a pre-authorised, or known, user. Depending on the outcome, the door to the pod will either open or remain closed. Once inside the pod, the user will undergo several non-invasive scans: non-destructive lasers will measure the user's height, pressure sensors the weight, and several other systems will retrieve information about the user in a non-invasive manner using the sensors.
[52] The information retrieved through this non-invasive scanning process is then relayed to the algorithms of the health state detection module. Once the system has received the information, it passes the information to the artificial intelligence algorithms of the information retrieval module, which cross-reference the data against that stored within the medical data repository 40. User data, post-cross-referencing, is then converted to scientifically accepted and approved information. For instance, the cross-referencing may act as a filter to provide only information to the user related to their physical attributes. For example, if a user has dark skin and is overweight, the information will be specific to that set of attributes, whereas when a user is Caucasian and is not overweight, the cross-referencing will generate a different set of information. This is useful because people of different ethnic backgrounds are typically predisposed to different pathologies. Using the intelligent algorithms, the information is displayed to the user on the user interfaces on the screen-walls of the pod. The user can then fully interact with the information by way of hand gestures made anywhere near the motion sensors linked to the screens. The motion sensors, e.g. using cameras, pick up hand gestures such as air scrolling, section swiping and pinch zooming to relay information pertinent to the user's gestures.
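The gesture handling could be organised as a simple dispatch table from recognised gestures to interface actions. The gesture names and actions below are assumptions sketching the idea, not terms from the disclosure:

```python
# Hypothetical mapping from recognised gestures to user-interface actions.
GESTURE_ACTIONS = {
    "air_scroll":    "scroll",
    "section_swipe": "next_section",
    "pinch_open":    "zoom_in",
    "pinch_close":   "zoom_out",
    "tap":           "select",
}

def dispatch(gesture: str) -> str:
    """Return the UI action for a recognised gesture; ignore unknown ones."""
    return GESTURE_ACTIONS.get(gesture, "ignore")
```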
[53] Whilst the foregoing embodiments have been described to illustrate the subject-matter of the present disclosure, the features of the embodiments are not to be taken as limiting the scope of protection. For the avoidance of doubt, the scope of protection is defined by the following claims.

Claims (15)

  1. A booth for accommodating an occupant, the booth comprising: a sensor for sensing physical attributes of the occupant; an information retrieval module configured to retrieve medical data from a medical data repository based on the physical attributes; and a user interface configured to display the retrieved medical data for the occupant to view.
  2. The booth of Claim 1, wherein the sensor comprises one or more sensors selected from a list of sensors including a camera, a laser, a pressure sensor, a weighing scale, a touch electrocardiograph, and a sphygmomanometer.
  3. The booth of Claim 1 or Claim 2, wherein the physical attributes are selected from a list of physical attributes including height, weight, body mass index, skin colour, blood pressure, pulse rate, heart rhythm, and breathing pattern.
  4. The booth of any preceding claim, further comprising an identification module including a biometrics sensor to sense one or more biometric features, and a permission module configured to grant access to the booth if a user is known and refuse permission to the booth if the user is unknown.
  5. The booth of Claim 4, wherein the biometric features include one or more of a face, an eye, and a fingerprint.
  6. The booth of any preceding claim, further comprising: a health state detection module configured to detect a health state of the occupant based on the sensed physical attributes, wherein the information retrieval module is configured to retrieve medical data relating to the health state from a medical data repository.
  7. The booth of Claim 6, wherein the health state detection module comprises a mathematical rules-based model configured to receive the sensed physical attributes and compute a health state of the occupant from the physical attributes, a comparison module configured to compare the computed health state to one or more health state thresholds, and a classification module configured to classify the occupant's health state based on the comparison.
  8. The booth of Claim 6 or Claim 7, wherein the health state detection module comprises a machine learning model trained to classify the occupant's health state using the sensed physical attributes as inputs.
  9. The booth of Claim 8 when dependent upon Claim 7, wherein the machine learning model is trained to classify the occupant's health state using the sensed physical attributes and the occupant's health state classified by the classification module as inputs.
  10. The booth of Claim 8 or Claim 9, wherein the machine learning model comprises a neural network.
  11. The booth of any preceding claim, wherein the information retrieval module is configured to retrieve the medical data by parsing one or more of text in the medical data, text labelling the medical data, or metadata associated with the medical data.
  12. The booth of any preceding claim, wherein the medical data comprises medical journals, medical videos, medical articles, patient notes, or patient medical images.
  13. The booth of any preceding claim, wherein the user interface comprises a user input module for receiving user inputs to interact with the medical data.
  14. The booth of any preceding claim, further comprising a communication module for enabling a remote user to access the user interface and/or sensors in the booth for communicating with the occupant during use.
  15. The booth of any preceding claim, comprising a plurality of walls forming an enclosure from an external environment, wherein the sensor and the user interface are mounted to one or more of the plurality of walls, and wherein one of the plurality of walls is a door enabling access to the occupant.
GB2403713.7A 2024-03-14 2024-03-14 A booth Pending GB2639263A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2403713.7A GB2639263A (en) 2024-03-14 2024-03-14 A booth


Publications (2)

Publication Number Publication Date
GB202403713D0 GB202403713D0 (en) 2024-05-01
GB2639263A true GB2639263A (en) 2025-09-17

Family

ID=90826101

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2403713.7A Pending GB2639263A (en) 2024-03-14 2024-03-14 A booth

Country Status (1)

Country Link
GB (1) GB2639263A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060241970A1 (en) * 2005-04-21 2006-10-26 Aleksander Winiarski Process and device for health orientation
US20090137882A1 (en) * 2007-09-11 2009-05-28 Franck Baudino System and a method for detecting an endemic or an epidemic disease
US20090143652A1 (en) * 2007-11-30 2009-06-04 Ziehm Medical Llc Apparatus and Method for Measuring, Recording and Transmitting Primary Health Indicators
WO2013119646A1 (en) * 2012-02-07 2013-08-15 HealthSpot Inc. Medical kiosk and method of use
US20200168329A1 (en) * 2012-03-02 2020-05-28 Leonard Solie Medical services kiosk


Also Published As

Publication number Publication date
GB202403713D0 (en) 2024-05-01
