
AU2020102378A4 - MT-Family Member Activities and Location: FAMILY MEMBER ACTIVITIES AND LOCATION MANAGEMENT TECHNOLOGY - Google Patents

Info

Publication number
AU2020102378A4
Authority
AU
Australia
Prior art keywords
user
information
server
portable device
triggering event
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2020102378A
Inventor
S. B. Chordiya
Ashish Dudhale
Sandeep Kumar Gupta
Arpit Jain
Archana Shirbhate
Prachi Labde
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shirbhate Archana Dr
Original Assignee
Shirbhate Archana Dr
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shirbhate Archana Dr filed Critical Shirbhate Archana Dr
Priority to AU2020102378A priority Critical patent/AU2020102378A4/en
Application granted granted Critical
Publication of AU2020102378A4 publication Critical patent/AU2020102378A4/en
Ceased legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B31/00Predictive alarm systems characterised by extrapolation or other computation using updated historic data
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00Machine learning
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02Alarms for ensuring the safety of persons
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • G08B21/182Level alarms, e.g. alarms responsive to variables exceeding a threshold
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B5/00Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied
    • G08B5/22Visible signalling systems, e.g. personal calling systems, remote indication of seats occupied using electric transmission; using electromagnetic transmission
    • G08B5/222Personal calling arrangements or devices, i.e. paging systems
    • G08B5/223Personal calling arrangements or devices, i.e. paging systems using wireless transmission
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W64/00Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • H04W64/003Locating users or terminals or network equipment for network management purposes, e.g. mobility management locating network equipment

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Software Systems (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Theoretical Computer Science (AREA)
  • Computing Systems (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Signal Processing (AREA)
  • Telephonic Communication Services (AREA)

Abstract

Our invention, "MT-Family Member Activities and Location", provides a device and system for notifying a user contact of the status of a user of a portable device, including location, distance, audio, and video information. The relevant status is determined by the portable device collecting user provided information and device collected information about the user of the portable device. The portable device then transmits the device collected information and the user provided information to a server, which in turn analyzes this information to determine whether a triggering event has occurred. If a triggering event is determined to have occurred, the local or global server sends a status update regarding the user of the portable device to preset user contacts. The triggering event is determined to have occurred based on preset user conditions, algorithms, and artificial intelligence executed at the server, and an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, invoking external services when appropriate to obtain information or perform various actions. The invented technology can be implemented on any of a number of different platforms, such as the web, email, and smartphones. The system is based on sets of interrelated domains and tasks, and employs additional functionality powered by external services with which the system can interact.

Description

FIG. 1: A system block diagram in accordance with an embodiment of the disclosure.
MT-Family Member Activities and Location: FAMILY MEMBER ACTIVITIES AND LOCATION MANAGEMENT TECHNOLOGY
FIELD OF THE INVENTION
Our Invention "MT-Family Member Activities and Location" is related to family member activities and location management technology.
BACKGROUND OF THE INVENTION
Many people rely on portable communication devices such as smart phones for voice communication and accessing information. These devices can also send and receive text messages, download content from the internet, and interface with various third party software applications. Some communication devices have become portable enough to be worn by a user. For example, some wrist watches now incorporate microprocessors, graphic interfaces, and network connectivity allowing the watch to wirelessly communicate with other devices. Additionally, some other devices incorporate sensors for monitoring an environment surrounding the portable device and/or the user carrying the portable device. As such, the portable device collects a variety of information about the user and the environment of the user.
However, the information collected by the portable device may not always be efficiently used to the benefit of the user. For instance, the user may have certain medical conditions that require health monitoring for a heart condition, or the user may travel frequently and need to update concerned persons regarding their location and status. In the case of the user with the medical condition requiring monitoring, concerned individuals may have to make special phone calls or visits to the user to make sure they are okay. And, in the case of the frequent traveler, the user may have to make phone calls or type out a detailed text message to ensure that any concerned individuals are aware of their location and status. Each of these activities is time consuming and inefficient.
Today's electronic devices are able to access a large, growing, and diverse quantity of functions, services, and information, both via the Internet and from other sources. Functionality for such devices is increasing rapidly, as many consumer devices, smartphones, tablet computers, and the like, are able to run software applications to perform various tasks and provide different types of information. Often, each application, function, website, or feature has its own user interface and its own operational paradigms, many of which can be burdensome to learn or overwhelming for users. In addition, many users may have difficulty even discovering what functionality and/or information is available on their electronic devices or on various websites; thus, such users may become frustrated or overwhelmed, or may simply be unable to use the resources available to them in an effective manner.
In particular, novice users, or individuals who are impaired or disabled in some manner, and/or are elderly, busy, distracted, and/or operating a vehicle may have difficulty interfacing with their electronic devices effectively, and/or engaging online services effectively. Such users are particularly likely to have difficulty with the large number of diverse and inconsistent functions, applications, and websites that may be available for their use.
Accordingly, existing systems are often difficult to use and to navigate, and often present users with inconsistent and overwhelming interfaces that often prevent the users from making effective use of the technology.
PRIOR ART SEARCH
    • JP2001250183A (2000-03-07 / 2001-09-14), Denso Corp: Emergency notification system for vehicle.
    • US6340928B1 (2000-06-22 / 2002-01-22), TRW Inc.: Emergency assistance system using Bluetooth technology.
    • WO2002035869A1 (2000-10-27 / 2002-05-02), Christian Kazumasa: Smart communication interface device, applications and enhanced mobile productivity features.
    • US6509830B1 (2000-06-02 / 2003-01-2), BBNT Solutions LLC: Systems and methods for providing customizable geo-location tracking services.
    • US20030218539A1* (2002-05-22 / 2003-11-27), Hight Myra R.: Location tracking apparatus, system, and method.
    • US20040152441A1 (2002-07-10 / 2004-08-05), Wong Wai-See Candy: Wireless handset emergency location provisioning system (wireless HELPS).
    • US20040203622A1 (2002-12-03 / 2004-10-14), Brian Esque: Automatic notification of personal emergency contacts from a wireless communications device.
    • US20050064887A1 (1999-08-12 / 2005-03-24), Henrik Bengtsson: System and method for sending multimedia attachments to text messages in radiocommunication systems.
    • US20060033615A1 (2004-08-12 / 2006-02-16), Seong Taeg Nou: Emergency safety service system and method using telematics system.
    • US20060068753A1 (2004-09-22 / 2006-03-30), Jim Karen: Emergency call handling system.
    • US20070087726A1 (2005-08-17 / 2007-04-19), McGary Faith: System and method for providing emergency notification services via enhanced directory assistance.
    • US9374554B1 (2014-03-25 / 2016-06-21), Amazon Technologies, Inc.: Display selection for video conferencing.
    • US9659003B2* (2014-03-26 / 2017-05-23), Lenovo (Singapore) Pte. Ltd.: Hybrid language processing.
    • US9373318B1 (2014-03-27 / 2016-06-21), Amazon Technologies, Inc.: Signal rate synchronization for remote acoustic echo cancellation.
    • US9529794B2 (2014-03-27 / 2016-12-27), Microsoft Technology Licensing, LLC: Flexible schema for language model customization.
    • US9336767B1 (2014-03-28 / 2016-05-10), Amazon Technologies, Inc.: Detecting device proximities.
    • US9317873B2 (2014-03-28 / 2016-04-19), Google Inc.: Automatic verification of advertiser identifier in advertisements.
    • US9607207B1 (2014-03-31 / 2017-03-28), Amazon Technologies, Inc.: Plane-fitting edge detection.
OBJECTIVES OF THE INVENTION
1) The objective of the invention is to provide a device and system for notifying a user contact of the status of a user of a portable device, including location, distance, audio, and video information. The relevant status is determined by the portable device collecting user provided information and device collected information about the user of the portable device.
2) The other objective of the invention is that the portable device may then transmit the device collected information and the user provided information to a server, which in turn analyzes this information to determine whether a triggering event has occurred.
3) The other objective of the invention is that, if a triggering event is determined to have occurred, the local or global server will send a status update regarding the user of the portable device to preset user contacts.
4) The other objective of the invention is that the triggering event is determined to have occurred based on preset user conditions, algorithms, and artificial intelligence executed at the server, and that an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog, invoking external services when appropriate to obtain information or perform various actions.
5) The other objective of the invention is that the invented technology can be implemented on any of a number of different platforms, such as the web, email, and smartphones; the system is based on sets of interrelated domains and tasks, and employs additional functionality powered by external services with which the system can interact.
6) The other objective of the invention is that the memory further comprises instructions for causing the processor to perform the further step of communicating the one or more images to the user contact, and that the one or more sensors comprise a microphone which, upon the occurrence of the triggering event, captures an audio signal in the environment around the portable device.
7) The other objective of the invention is that the memory further comprises instructions for causing the processor to perform the further step of communicating the audio signal to the user contact.
8) The other objective of the invention is that the memory further comprises instructions for causing the processor to perform the further step of causing a device tethered to the portable device to communicate one or more images captured by an image sensor of the tethered device, or to communicate an audio signal captured by a microphone of the tethered device.
9) The other objective of the invention is that the memory further comprises instructions for causing the processor to perform the further step of indicating that the status of the user has been communicated to the user contact.
SUMMARY OF THE INVENTION
The invention provides a portable electronic device including a processor, and a network interface for communicating with a wireless network. The portable device further includes an input device for accepting user provided information from a user of the portable device and one or more sensors for acquiring device collected information of the user of the portable device. The portable device also includes a memory comprising instructions for causing the processor to perform the steps of: collecting at least one of the user provided information and the device collected information; and transmitting the at least one of the user provided information and the device collected information to a server for communicating a status to a user contact upon occurrence of a triggering event. In this embodiment, the content of the status is based on at least one of the user provided information and the device collected information.
The invention also provides a system for monitoring a user of a portable device. The system includes a portable device associated with the user of the portable device and a server communicatively coupled to the portable device through a wireless network. The portable device includes a processor and a network interface for communicating with the wireless network. The portable device further includes an input device for accepting user provided information of the user of the portable device and one or more sensors for collecting device collected information from the user of the portable device. The portable device also includes a memory comprising instructions for causing the processor to perform the steps of: collecting at least one of the user provided information and the device collected information; and transmitting, by the network interface over the wireless network, at least one of the user provided information and the device collected information to the server.
The invention further provides a method of reporting a status of a user of a portable device to a user contact. The method receives at least one of user provided information and device collected information of the user collected by at least one of an input device and one or more sensors associated with the portable device. The method also determines, based on the at least one of the user provided information and the device collected information, whether a triggering event has occurred and conditionally communicates the status of the user of the portable device to the user contact when the triggering event has occurred.
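The triggering-event logic described above can be illustrated with a brief sketch. All names, fields, and threshold values here are illustrative assumptions for clarity, not taken from the disclosure; the actual conditions are preset by the user and evaluated at the server.

```python
# Hypothetical sketch of the server-side triggering-event check: the server
# receives user provided and device collected information, evaluates preset
# conditions, and conditionally notifies preset user contacts.
from dataclasses import dataclass

@dataclass
class DeviceReport:
    heart_rate: int      # device collected information (beats per minute)
    location: tuple      # device collected (latitude, longitude)
    user_ok: bool = True # user provided "I'm OK" indicator

def triggering_event(report: DeviceReport, max_heart_rate: int = 150) -> bool:
    """Return True when a preset user condition indicates an event."""
    if not report.user_ok:
        return True                            # explicit distress from the user
    return report.heart_rate > max_heart_rate  # threshold-type alarm condition

def report_status(report: DeviceReport, contacts: list) -> list:
    """Conditionally communicate the user's status to preset contacts."""
    if not triggering_event(report):
        return []  # no triggering event: no status update is sent
    msg = f"Status alert at {report.location}: heart rate {report.heart_rate} bpm"
    return [(contact, msg) for contact in contacts]
```

In practice the conditions would combine many signals (location, motion, user responses) and could be learned by the machine-learning components classified above; the fixed threshold here only stands in for that analysis.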
The invention, an intelligent automated assistant is implemented on an electronic device, to facilitate user interaction with a device, and to help the user more effectively engage with local and/or remote services. In various embodiments, the intelligent automated assistant engages with the user in an integrated, conversational manner using natural language dialog, and invokes external services when appropriate to obtain information or perform various actions.
The intelligent automated assistant integrates a variety of capabilities provided by different software components (e.g., for supporting natural language recognition and dialog, multimodal input, personal information management, task flow management, orchestrating distributed services, and the like). Furthermore, to offer intelligent interfaces and useful functionality to users, the intelligent automated assistant of the present invention may, in at least some embodiments, coordinate these components and services. The conversation interface, and the ability to obtain information and perform follow-on tasks, are implemented, in at least some embodiments, by coordinating various components such as language components, dialog components, task management components, information management components and/or a plurality of external services.
According to various embodiments of the present invention, intelligent automated assistant systems may be configured, designed, and/or operable to provide various different types of operations, functionalities, and/or features, and/or to combine a plurality of features, operations, and applications of an electronic device on which it is installed. In some embodiments, the intelligent automated assistant systems of the present invention can perform any or all of: actively eliciting input from a user, interpreting user intent, disambiguating among competing interpretations, requesting and receiving clarifying information as needed, and performing (or initiating) actions based on the discerned intent. Actions can be performed, for example, by activating and/or interfacing with any applications or services that may be available on an electronic device, as well as services that are available over an electronic network such as the Internet.
In various embodiments, such activation of external services can be performed via APIs or by any other suitable mechanism. In this manner, the intelligent automated assistant systems of various embodiments of the present invention can unify, simplify, and improve the user's experience with respect to many different applications and functions of an electronic device, and with respect to services that may be available over the Internet. The user can thereby be relieved of the burden of learning what functionality may be available on the device and on web-connected services, how to interface with such services to get what he or she wants, and how to interpret the output received from such services; rather, the assistant of the present invention can act as a go-between between the user and such diverse services.
The present invention provides a conversational interface that the user may find more intuitive and less burdensome than conventional graphical user interfaces. The user can engage in a form of conversational dialog with the assistant using any of a number of available input and output mechanisms, such as for example speech, graphical user interfaces (buttons and links), text entry, and the like. The system can be implemented using any of a number of different platforms, such as device APIs, the web, email, and the like, or any combination thereof. Requests for additional input can be presented to the user in the context of such a conversation. Short and long term memory can be engaged so that user input can be interpreted in proper context given previous events and communications within a given session, as well as historical and profile information about the user.
The context information derived from user interaction with a feature, operation, or application on a device can be used to streamline the operation of other features, operations, or applications on the device or on other devices. For example, the intelligent automated assistant can use the context of a phone call (such as the person called) to streamline the initiation of a text message (for example to determine that the text message should be sent to the same person, without the user having to explicitly specify the recipient of the text message). The intelligent automated assistant of the present invention can thereby interpret instructions such as "send him a text message", wherein the "him" is interpreted according to context information derived from a current phone call, and/or from any feature, operation, or application on the device. In various embodiments, the intelligent automated assistant takes into account various types of available context data to determine which address book contact to use, which contact data to use, which telephone number to use for the contact, and the like, so that the user need not re-specify such information manually.
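The pronoun-resolution behavior described above can be sketched minimally as a lookup against recent interaction context. The context store, field names, and pronoun set below are assumptions made for illustration only.

```python
# Illustrative sketch: resolving "him" in "send him a text message" against
# the contact of the current phone call, so the user need not re-specify
# the recipient manually.
from typing import Optional

# Hypothetical context derived from device features (here, an active call).
context = {"current_call": {"name": "Arpit", "number": "+61-400-000-000"}}

def resolve_recipient(utterance: str, ctx: dict) -> Optional[dict]:
    """Map a pronoun in the utterance to the most recent call contact."""
    pronouns = {"him", "her", "them"}
    if pronouns & set(utterance.lower().split()):
        return ctx.get("current_call")
    return None  # no pronoun: the recipient must be stated explicitly
```

A real assistant would weigh many context sources (address book, prior dialog turns, profile history) rather than a single call record, but the lookup pattern is the same.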
The assistant can also take into account external events and respond accordingly, for example, to initiate action, initiate communication with the user, provide alerts, and/or modify previously initiated action in view of the external events. If input is required from the user, a conversational interface can again be used.
The system is based on sets of interrelated domains and tasks, and employs additional functionality powered by external services with which the system can interact. In various embodiments, these external services include web-enabled services, as well as functionality related to the hardware device itself. For example, in an embodiment where the intelligent automated assistant is implemented on a smartphone, personal digital assistant, tablet computer, or other device, the assistant can control many operations and functions of the device, such as to dial a telephone number, send a text message, set reminders, add events to a calendar, and the like.
The system of the present invention can be implemented to provide assistance in any of a number of different domains. Examples include:
• Local Services (including location- and time-specific services such as restaurants, movies, automated teller machines (ATMs), events, and places to meet); • Personal and Social Memory Services (including action items, notes, calendar events, shared links, and the like);
• E-commerce (including online purchases of items such as books, DVDs, music, and the like); • Travel Services (including flights, hotels, attractions, and the like).
One skilled in the art will recognize that the above list of domains is merely exemplary. In addition, the system of the present invention can be implemented in any combination of domains.
The intelligent automated assistant systems disclosed herein may be configured or designed to include functionality for automating the application of data and services available over the Internet to discover, find, choose among, purchase, reserve, or order products and services. In addition to automating the process of using these data and services, at least one intelligent automated assistant system embodiment disclosed herein may also enable the combined use of several sources of data and services at once. For example, it may combine information about products from several review sites, check prices and availability from multiple distributors, and check their locations and time constraints, and help a user find a personalized solution to their problem.
Additionally, at least one intelligent automated assistant system embodiment disclosed herein may be configured or designed to include functionality for automating the use of data and services available over the Internet to discover, investigate, select among, reserve, and otherwise learn about things to do (including but not limited to movies, events, performances, exhibits, shows and attractions); places to go (including but not limited to travel destinations, hotels and other places to stay, landmarks and other sites of interest, etc.); places to eat or drink (such as restaurants and bars), times and places to meet others, and any other source of entertainment or social interaction which may be found on the Internet.
The intelligent automated assistant system embodiment disclosed herein may be configured or designed to include functionality for enabling the operation of applications and services via natural language dialog that may be otherwise provided by dedicated applications with graphical user interfaces including search (including location-based search); navigation (maps and directions); database lookup (such as finding businesses or people by name or other properties); getting weather conditions and forecasts, checking the price of market items or status of financial transactions; monitoring traffic or the status of flights; accessing and updating calendars and schedules; managing reminders, alerts, tasks and projects; communicating over email or other messaging platforms.
Operating devices locally or remotely (e.g., dialing telephones, controlling light and temperature, controlling home security devices, playing music or video, etc.) is also supported. Further, at least one intelligent automated assistant system embodiment disclosed herein may be configured or designed to include functionality for identifying, generating, and/or providing personalized recommendations for activities, products, services, sources of entertainment, time management, or any other kind of recommendation service that benefits from an interactive dialog in natural language and automated access to data and services.
The intelligent automated assistant of the present invention can control many features and operations of an electronic device. For example, the intelligent automated assistant can call services that interface with functionality and applications on a device via APIs or by other means, to perform functions and operations that might otherwise be initiated using a conventional user interface on the device. Such functions and operations may include, for example, setting an alarm, making a telephone call, sending a text message or email message, adding a calendar event, and the like. Such functions and operations may be performed as add-on functions in the context of a conversational dialog between a user and the assistant. Such functions and operations can be specified by the user in the context of such a dialog, or they may be automatically performed based on the context of the dialog. One skilled in the art will recognize that the assistant can thereby be used as a control mechanism for initiating and controlling various operations on the electronic device, which may be used as an alternative to conventional mechanisms such as buttons or graphical user interfaces.
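One common way to realize such a control mechanism is a dispatch table mapping interpreted intents to device operations. The intent names and handler functions below are hypothetical stand-ins for the device APIs the assistant would call.

```python
# Hedged sketch: dispatching assistant intents to device operations via a
# function table, as an alternative to buttons or graphical interfaces.
def set_alarm(time: str) -> str:
    return f"alarm set for {time}"

def send_text(to: str, body: str) -> str:
    return f"text to {to}: {body}"

# Each interpreted user intent maps to one device operation handler.
INTENT_HANDLERS = {"set_alarm": set_alarm, "send_text": send_text}

def dispatch(intent: str, **slots):
    """Invoke the device operation matching the discerned user intent."""
    handler = INTENT_HANDLERS.get(intent)
    if handler is None:
        raise ValueError(f"unknown intent: {intent}")
    return handler(**slots)
```

The slots (`time`, `to`, `body`) would be filled by the natural language dialog components; only the dispatch step is shown here.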
BRIEF DESCRIPTION OF THE DIAGRAM
FIG. 1: is a system block diagram in accordance with an embodiment of the disclosure.
FIG. 1A: is a system block diagram in accordance with an embodiment of the disclosure.
FIG. 1B: is a system block diagram in accordance with an embodiment of the disclosure.
FIG. 1C: is a system block diagram in accordance with an embodiment of the disclosure.
FIG. 2: is a block diagram illustrating components of the portable device shown in FIG. 1, according to one embodiment;
FIG. 3: is a block diagram illustrating functional components of the server shown in FIG. 1.
FIG. 4: is a diagram illustrating examples of various modes according to one embodiment.
FIG. 5: is a flowchart illustrating the MedicWatch mode according to one embodiment;
FIG. 6: is a flowchart illustrating the SOS mode according to one embodiment;
FIG. 7: is a flowchart illustrating the Going-Out mode according to one embodiment;
DESCRIPTION OF THE INVENTION
Many people rely on portable communication devices such as wearable devices and smart phones for voice communication and accessing information. These devices can also send and receive text messages, download content from the internet, and interface with various third party software applications. Some communication devices have become portable enough to be worn by a user. For example, some wrist watches now incorporate microprocessors, graphic interfaces, and network connectivity allowing the watch to wirelessly communicate with other devices, directly or through direct communication to a wireless network. Additionally, some devices incorporate sensors for monitoring an environment surrounding the portable device and/or the wearable device. As such, the portable device collects a variety of information about the user and the environment of the user.
However, the information collected by the portable device may not always be efficiently used to the benefit of the user. For instance, the user may have certain medical conditions that require health monitoring for a heart condition, or the user may travel frequently and need to update concerned persons regarding their location and status. In the case of the user with the medical condition requiring monitoring, concerned individuals may have to make special phone calls or visits to the user to make sure they are okay. And, in the case of the frequent traveler, the user may have to make phone calls or type out a detailed text message to ensure that any concerned individuals are aware of their location and status. Each of these activities is time consuming and inefficient.
To increase the efficiency at which information about a user of the portable device is shared with individuals concerned with the user's wellbeing or safety, the information collected by the portable device can be shared with these concerned individuals such that specialized messages or visits are not needed. In certain embodiments, the portable device associated with the user is configured with an application that will collect information about the user and provide it to a server hosting a service that determines when to contact the concerned individuals based on the information collected from the portable device. The contact initiated by the server may report a variety of information about the user, such as diagnostic information of the user, location information and general wellbeing. These and other features of the disclosure will now be discussed in relation to the figures.
FIG. 1: illustrates a system level block diagram, in accordance with an embodiment of the disclosure. The system of FIG. 1 shows a user monitoring and assistance system 100 that includes a user device 102, a server 106, a third party assistance provider 108, and a user monitoring provider 112.
The user device 102 is generally a portable device of a user, such as a mobile, wearable and/or embedded digital device(s), a watch with a computer operating system, a smart phone, a tablet computer, a laptop computer, a personal digital assistant (PDA), a video game console, or any one of a number of additional devices capable of being transported by a user. The user device 102 is capable of executing an application configured to provide user monitoring and assistance. In an embodiment, the user device 102 executing the application configured for user monitoring and assistance is capable of receiving data or information from at least one input device and is configured to communicate with at least one external device using a network interface capable of wireless communication.
The user device 102 communicates with both the server 106 and the user monitoring provider 112. In this manner, the user device 102 is able to collect information about the user that can then be transmitted to the server 106 for determination of whether to contact the monitoring provider 112, who can then in turn contact a third party assistance provider 108. Generally, the information collected by the user device falls into one of two categories: (1) user provided information, and (2) device collected information.
User provided information is typically information that is provided by the user to the application via an input device on the user device. For example, the application may prompt the user to provide a status update such as an "I'm OK" indicator, which can then be transmitted to the monitoring provider 112, either directly from the user device or to the server 106 and then to the monitoring provider 112. The user's input indicating the "I'm OK" may be considered user provided information.
Further, the status message may be provided to the monitoring provider 112 in a variety of ways, such as one of a pre-recorded robo-call message, a live phone call, an email, a text message, an application notification message, an operating system notification message, and/or a distress signal. The status message provides the user's status and, in certain embodiments, optionally includes the location of the user device. In certain embodiments, the location of the user device provided along with the status message could further include a web-link to a map providing the user device location and when the status update was provided, which would constitute device collected information.
Device collected information is typically information collected by various sensors and systems residing on or associated with the user device. For example, the user device 102 may be configured to receive location information with the use of a Global Positioning System (GPS) receiver and transmit the location information to a server 106. Further examples of available sensors are a water sensor, thermometer, accelerometer, light sensor, barometer, altimeter, an image sensor and a microphone. The image sensor and the microphone are respectively capable of capturing image data and audio signals in the vicinity of the user device 102. Other examples of sensors included in the user device 102 are sensors for collecting vital sign information from the user such as a heart rate, blood pressure or blood sugar (including a blood glucose level). The above described sensors and systems are not meant to be limiting on the types of information that may be collected by the user device.
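By way of illustration only, the bundling of device collected information for transmission to the server 106 might be sketched as follows in Python; the `DeviceReport` structure, field names, and sensor interface are hypothetical, as the disclosure does not prescribe a data format:

```python
import time
from dataclasses import asdict, dataclass, field

@dataclass
class DeviceReport:
    """Hypothetical bundle of device collected information sent to the server."""
    device_id: str
    timestamp: float = field(default_factory=time.time)
    latitude: float = 0.0
    longitude: float = 0.0
    heart_rate: int = 0             # beats per minute
    blood_pressure: tuple = (0, 0)  # (systolic, diastolic) in mmHg
    blood_glucose: float = 0.0      # mg/dL

def build_report(device_id, sensors):
    """Collect current sensor readings into a transmittable payload.

    `sensors` is assumed to map a sensor name to a zero-argument read
    function, e.g. {"heart_rate": hr_sensor.read}; unknown names are ignored.
    """
    report = DeviceReport(device_id=device_id)
    for name, read in sensors.items():
        if hasattr(report, name):
            setattr(report, name, read())
    return asdict(report)

# Example with stubbed sensor readers standing in for real hardware:
payload = build_report("watch-001", {
    "heart_rate": lambda: 72,
    "latitude": lambda: 18.52,
    "longitude": lambda: 73.86,
})
```

A payload of this kind could then be serialized and sent over the network interface 206 to the server 106.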
The user provided information and the device collected information may be transmitted to the server 106. The server 106 receives, collects and reacts to the transmitted information by performing one or more actions. In this regard, the server 106 is configured to analyze parameters based on information related to the user and react according to pre-determined triggering events. For example, in an embodiment, the server 106 may analyze the location information in combination with other parameters, such as the time of day, a preconfigured setting created by the user, and/or input information obtained from other sensors located on or within the user device 102 to determine whether a triggering event has occurred.
The server 106 hosts a service that functions to receive the information from the application for user monitoring and assistance being executed by the user device 102 and utilizes that information to determine whether to send a notification message to a third party. The service hosted by the server 106 is configured to provide this service to a plurality of subscribers who are users of user devices 102 that include the application for user monitoring and assistance. Each of the subscribers has an account with the service such that unique information regarding that particular subscriber can be provided to help determine system functionality, such as when to send certain notification messages and who or what should receive the messages.
In this regard, the user can set up triggering events that will trigger status updates to be sent in one of a variety of forms to one or more recipients, for example, monitoring provider 112, based on the user provided information and/or the device collected information received from the user device 102. Additionally, the server 106 may use algorithms and artificial intelligence to analyze the user provided information and the device collected information from the user device 102 to determine the occurrence of a triggering event.
The server 106 is generally configured to communicate with external devices via one or more networks. Such networks may include one or more wireless networks, wired networks, fiber optics networks, and other types of networks through which communication between the server 106 and an external device may be established. In certain embodiments, the server 106 may send/receive data to/from the user device 102, the third party assistance provider 108, the user monitoring provider 112, or any combination thereof. The parameter information received by the server 106 generally pertains to the user provided information and the device collected information.
The server 106 is configured to take any number of pre-determined actions based on a particular triggering event. A triggering event is an event that, when noticed by the server 106, causes the server 106 to take an action on behalf of the user of the user device 102. Examples of triggering events may include:
(1) one of a variety of measured vital signs, such as heart rate, blood pressure or blood sugar, exceeds a predetermined threshold;
(2) expiration of a time period or a preset time interval;
(3) leaving a pre-defined geographic space;
(4) receiving a message from the user device 102 that assistance is needed.
In each of these examples, the server 106 would detect the triggering event and proceed to take a specific action in response to the triggering event. The parameters associated with these triggering events may be preset at the server 106 by the user accessing her user account with the service hosted by the server 106. In certain embodiments, this access may be conducted over a web interface with the service.
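The four example triggering events above could be evaluated server-side along the following lines. This is a minimal sketch with hypothetical account and report field names; the disclosure leaves the parameter schema and detection logic to the implementation:

```python
import time

def detect_triggering_event(account, report, now=None):
    """Evaluate the four example triggering events against a device report.

    `account` holds parameters preset by the user through the service;
    `report` is the latest user provided / device collected information.
    Returns a label for the first detected event, or None.
    """
    now = time.time() if now is None else now

    # (1) a measured vital sign exceeds its predetermined threshold
    for vital, limit in account["vital_limits"].items():
        if report.get(vital, 0) > limit:
            return f"vital_threshold:{vital}"

    # (2) a preset time interval has expired without a check-in
    if now - report.get("last_checkin", now) > account["checkin_interval"]:
        return "checkin_expired"

    # (3) the user left a pre-defined geographic space (simple bounding box)
    lat, lon = report.get("latitude"), report.get("longitude")
    s, w, n, e = account["geofence"]  # (south, west, north, east)
    if lat is not None and not (s <= lat <= n and w <= lon <= e):
        return "left_geofence"

    # (4) the user device reported that assistance is needed
    if report.get("assistance_requested"):
        return "assistance_requested"

    return None
```

A real service would likely evaluate such rules continuously as reports arrive, and could layer the algorithmic or artificial-intelligence analysis mentioned above on top of these basic checks.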
The third party assistance provider 108 shown in FIG. 1 may be an emergency dispatch service, such as public "911" dispatch or a similar service provided by a private entity or person. The third party assistance provider may also provide non-emergency services. For example, the third party assistance provider may try calling the user or visiting the user. To assist in providing its service, in an embodiment, the third party assistance provider 108 receives the user's last known location from the server 106. The location may be provided either through a status update message, or the third party assistance provider 108 may be able to access a user account at the server 106 that stores reported locations of the user device 102.
The user monitoring provider 112 functions as a monitoring service for the user and is configured to receive messages from both the user device 102 and the server 106. In one embodiment, the user monitoring provider 112 is a user contact that has been configured within the service hosted by the server. The user contact receives notification messages from the user device 102 and/or the server 106 related to a status of the user of the user device 102 and based on the user provided information and the device collected information. In another embodiment, the user monitoring provider 112 may be an optional service, such as a service requiring a monthly or annual subscription, where employees of the service process the notification messages containing a user status provided from the user device 102 or the server 106. In yet another embodiment, the monitoring provider 112 may include both the subscribed-to service and one of a number of user contacts of the user of the user device 102. Based on information preset by the user at the service hosted at the server 106, the various status updates and notification messages can be directed to one or more of the monitoring providers, whether a subscribed-to service or a user contact.
The user monitoring provider 112 receives status messages containing user provided information and/or device collected information from the server 106 based on the occurrence of a triggering event. For example, the monitoring provider 112 may be informed when a user travels beyond a certain geographic region, thereby allowing the monitoring provider 112 to take further actions for the safety of the user. In an embodiment, the monitoring provider 112 may cause the server 106 to send notification messages to user contacts, which may include a friend or family member of the user. In other embodiments, the monitoring provider 112 may send notifications to the user device 102. In certain embodiments, the server 106 will send a user contact status message back to the user device 102 once the notification message has been sent to the user contact or monitoring provider 112. The user contact status message can be utilized by the user device 102 to display that the notification message has been sent to the user contact.
The user monitoring provider 112 is configured to directly receive information from the user device 102, the server system 106, or any combination thereof. The information may include the same type of information used as parameters for the server 106. In an embodiment, the user monitoring provider 112 may provide a service in which a user can contact the user monitoring provider 112 for assistance. For example, in an emergency situation, the user may use a graphic interface of the user device 102 to contact the user monitoring provider 112 for assistance.
As an aside, FIG. 1 does not illustrate the user contacts as being separate from the monitoring provider 112. Rather, FIG. 1 illustrates the single block monitoring provider 112 as representative of both the subscribed to service monitoring provider 112 discussed above and the preset user contacts. FIG. 1 could be illustrated as replacing the monitoring provider 112 with user contacts or user contacts in parallel with the monitoring provider 112 and also in communication with the server 106.
FIG. 1 is not intended to limit the environments that may be used. For example, while FIG. 1 shows the user device 102 directly communicating with the server system 106 and also directly communicating with the user monitoring provider 112, other system configurations are contemplated herein. For example, as illustrated in FIG. 1A, another configuration may involve the user device 102 communicating with a tethered device 104, whereby the tethered device 104 is then configured to communicate with the user monitoring provider 112 and also with the server 106. In FIGS. 1 and 1A, the user device 102 or tethered device 104 is configured for bidirectional communication between the server system 106 and the user monitoring provider 112 such that data is both sent and received from the user device 102 or tethered device 104.
In other embodiments, as illustrated in FIGS. 1B and 1C, the user device 102 or tethered device 104 is configured for unidirectional communication with the server system 106 and the user monitoring provider 112. The unidirectional communication configuration limits the user device 102 or tethered device 104 from receiving messages back from the server 106 and user monitoring provider 112. In certain embodiments, the bidirectional communication can be converted to unidirectional communication in situations of poor cellular signal strength, when battery power of the user device 102 or tethered device 104 needs to be conserved, or any other circumstances where unidirectional communication is preferred.
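The switch between bidirectional and unidirectional communication described above might be decided as in the following sketch; the signal and battery thresholds are illustrative assumptions only, as the disclosure states the circumstances but not the exact criteria:

```python
def select_comm_mode(signal_dbm, battery_pct, user_pref=None):
    """Choose between bidirectional and unidirectional communication.

    An explicit user preference wins; otherwise poor cellular signal or
    low battery converts the device to send-only operation. The -100 dBm
    and 15% thresholds are assumed values for illustration.
    """
    if user_pref in ("bidirectional", "unidirectional"):
        return user_pref
    if signal_dbm < -100 or battery_pct < 15:
        return "unidirectional"  # conserve power / cope with weak signal
    return "bidirectional"
```

Under this sketch, a device with adequate signal and battery continues to receive messages (such as the user contact status message) from the server 106, while a constrained device only transmits.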
As shown in FIG. 1A, the tethered device 104 runs the application for providing user monitoring and assistance and communicates with the user device 102 as needed to transmit messages to and receive messages from other network entities. The tethered device 104 may be a mobile device such as a wearable or embedded digital device(s), a watch with a computer operating system, a smart phone, a tablet computer, a laptop computer, a personal digital assistant (PDA), a video game console, or any one of a number of additional devices capable of displaying content and being transported by a user. In an embodiment, the tethered device is configured with a network interface capable of wireless communication with the user device 102 and a server system 106. In an example embodiment, the user device 102 is a wearable device such as a watch and the tethered device 104 is a smart phone or tablet computer that is configured to send and receive data with the user device 102.
FIG. 2 shows a block diagram of basic functional components included in one or more of the user device 102 and the tethered device 104, depending on the system configuration. While the discussion below with respect to FIG. 2 discusses components of the user device 102, the discussion is equally applicable to the tethered device 104.
Generally, the user device 102 is configured to perform certain steps in order to enact various modes of the application executed by the user device 102. In enacting the various modes, the user device 102 typically collects user provided information and the device collected information, as discussed above. Based on the device collected information and/or the user provided information, the user device 102 will also transmit that user provided information and/or the device collected information to the server 106 for communicating a status to a user contact upon occurrence of a triggering event.
As illustrated in FIG. 2, the user device 102 includes one or more processors 202, a memory 204, a network interface 206, one or more storage devices 208, a power source 210, one or more output devices 212, one or more input devices 214, and an operating system 216. Each of the components including the processor 202, memory 204, network interface 206, storage device 208, power source 210, output device 212, input device 214, and the operating system 216 is interconnected physically, communicatively, and/or operatively for inter-component communications.
The processor 202 is configured to process instructions for execution within user device 102. In an embodiment, processor 202 executes instructions stored in memory 204 or instructions stored in a storage device 208. The memory 204 may be a non-transient, computer-readable storage medium, and configured to store information within user device 102 during operation. In some embodiments, the memory 204 includes a temporary memory, an area for information not to be maintained when the user device 102 is turned off. For example, the temporary memory may include volatile memories such as random access memories (RAM), dynamic random access memories (DRAM), and static random access memories (SRAM). The memory 204 also maintains instructions for execution by the processor 202.
Storage device 208 also includes one or more non-transient computer-readable storage media. The storage device 208 is generally configured to store larger amounts of information than memory 204. In an embodiment, the storage device 208 may further be configured for long-term storage of information. The storage device 208 may include non-volatile storage elements such as magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
The user device 102 uses network interface 206 to communicate with external devices using one or more wireless networks, and other types of networks through which a communication with the user device 102 may be established. In the illustrated embodiment of FIG. 1, the network interface 206 of the user device 102 may communicate directly with the server 106 and the monitoring provider 112, as described above. In another embodiment, illustrated in FIG. 1A, the network interface 206 of the user device 102 communicates directly with the tethered device 104.
The network interface 206 may be an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information. Non-limiting examples of network interfaces 206 include near field communication (NFC) interfaces, Bluetooth®, 3G and 4G cellular network interfaces, satellite-based communication, WiFi®, and USB interfaces.
The user device 102 includes one or more input devices 214. Input devices 214 can be configured to receive input from an environment surrounding a user or from direct interaction with the user. In this regard, many input devices can be characterized as sensors. Examples of input devices may include a touch-sensitive screen, a keyboard, a microphone, and an image sensor. Other input devices 214 may include a proximity sensor, a light sensor, a water sensor, a thermometer, an altimeter, a barometer and an accelerometer. Embodiments may also include input devices 214 configured to track the user device's 102 location, such as using an antenna configured to receive location information from the Global Positioning System (GPS) or through use of data networks such as WLAN or WAN to triangulate the user device's 102 position based on a measured signal strength received from at least two access points.
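The signal-strength positioning mentioned above can be approximated with a log-distance path-loss model. The sketch below is illustrative only: the 1 m reference power and path-loss exponent are assumed values, and it uses a simple weighted centroid of known access point positions rather than a full trilateration:

```python
def rssi_to_distance(rssi_dbm, tx_power=-40.0, path_loss_exp=2.0):
    """Estimate distance (metres) from received signal strength using a
    log-distance path-loss model; tx_power is the assumed RSSI at 1 m."""
    return 10 ** ((tx_power - rssi_dbm) / (10 * path_loss_exp))

def weighted_centroid(access_points):
    """Rough position fix from (x, y, rssi) tuples for access points at
    known coordinates: a centroid weighted by inverse estimated distance,
    so that nearer (stronger) access points pull the estimate toward them."""
    weights = [(x, y, 1.0 / rssi_to_distance(rssi))
               for x, y, rssi in access_points]
    total = sum(w for _, _, w in weights)
    return (sum(x * w for x, _, w in weights) / total,
            sum(y * w for _, y, w in weights) / total)
```

A production system would calibrate the model per environment or fall back to GPS; this only shows how two or more access point readings can yield a coarse position.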
The input devices 214 may additionally or alternatively include diagnostic sensors configured to gather vital sign, diagnostic and other health-related information from a user. For example, input devices 214 may include a heart rate sensor, a glucose sensor, and a blood pressure sensor. The number and type of sensors is not intended to be limited to any particular quantity or combination.
One or more output devices 212 are also included in user device 102. Output devices 212 are configured to provide output to a user using tactile, audio, and/or video stimuli. Output device 212 may include a liquid crystal display (LCD) screen, a sound card, a video graphics adapter card, or any other type of device for converting a signal into an appropriate form understandable to humans or machines.
The user device 102 includes one or more power sources 210 to provide power to the device. Non-limiting examples of power source 210 include single-use power sources, rechargeable power sources, and/or power sources developed from nickel cadmium, lithium-ion, solar, or other suitable material.
The user device 102 includes an operating system 216. The operating system 216 controls operations of the components of the user device 102. For example, the operating system 216 facilitates the interaction of the processor(s) 202, the memory 204, the network interface 206, storage device(s) 208, input devices 214, output devices 212, and power source 210. Additionally, the operating system 216 may provide a user interface that provides user access to the application for providing user monitoring and assistance.
FIG. 3 shows a block diagram of basic functional components for a server 106 according to one aspect of the disclosure. The server 106 is generally configured to send data to and receive data from the user device 102, the tethered device 104, or both. The server 106 is also configured to send data to and receive data from the monitoring provider 112.
Generally, the server 106 is configured to host the service subscribed to by a user of the user device 102. The service enables the server 106 to receive the user provided information and/or device collected information from the user device 102. The service is then configured to determine, based on the user provided information and/or the device collected information, whether a triggering event has occurred. If it is determined that the triggering event has occurred, then the service will cause the server 106 to communicate a status of a user of the user device 102 to a user contact, such as the monitoring provider 112. In certain embodiments, the service may also cause the server 106 to transmit a user contact status message. The user contact status message is a message sent by the server 106 to the user device 102 indicating that the user's status has been provided to the user contact. In certain embodiments, the user device 102, upon receiving the user contact status message, will update the user that the status has been reported to the user contact via an output device on the user device 102.
In FIG. 3, the server 106 is illustrated as a single entity. However, in other embodiments, the server 106 could be implemented as a plurality of servers configured in a server system or as a cloud server. The server 106 includes one or more processors 302, a memory 304, a network interface 306, one or more storage devices 308 and an operating system 310. In some embodiments, each of the components including the processors 302, memory 304, network interface 306, storage devices 308, and operating system 310 is interconnected physically, communicatively, and/or operatively for inter-component communications.
As illustrated, processors 302 are configured to implement functionality and/or process instructions for execution within server 106. For example, processors 302 execute instructions stored in memory 304 or instructions stored on storage devices 308. The memory 304, which may be a non-transient, computer readable storage medium, is configured to store information within server 106 during operation. In some embodiments, memory 304 includes a temporary memory, i.e. an area for information not to be maintained when the server 106 is turned off. Examples of such temporary memory include volatile memories such as random access memories (RAM), dynamic random access memories (DRAM), and static random access memories (SRAM). Memory 304 also maintains program instructions for execution by the processors 302.
Storage devices 308 also include one or more non-transient computer-readable storage media. Storage devices 308 are generally configured to store larger amounts of information than memory 304. Storage devices 308 may further be configured for long-term storage of information. In some examples, storage devices 308 include non-volatile storage elements. Non-limiting examples of non-volatile storage elements include magnetic hard discs, optical discs, floppy discs, flash memories, or forms of electrically programmable memories (EPROM) or electrically erasable and programmable (EEPROM) memories.
The server 106 uses network interface 306 to communicate with external devices via one or more networks. Such networks may include one or more wireless networks, wired networks, fiber optics networks, and other types of networks through which communication between the server 106 and an external device may be established. Network interface 306 may be a network interface card, such as an Ethernet card, an optical transceiver, a radio frequency transceiver, or any other type of device that can send and receive information. The network interface 306 of the server system 106 may be configured to communicate with the user device 102, the tethered device 104, the third party assistance provider 108, the user monitoring provider 112, or any combination thereof, as described above. The network interface 306 may be used for sending notification messages based on triggering events, as described above. The number, content, and type of notifications and the recipients of the notifications are not intended to be limited to any particular configuration.
The server 106 includes an operating system 310. The operating system 310 controls operations of the components of the server 106. For example, the operating system 310 facilitates the interaction of the processor(s) 302, the memory 304, the network interface 306, and storage device(s) 308.

Turning now to FIG. 4, a non-exhaustive list of various functions or modes that can be performed by embodiments of the systems shown in FIGS. 1-1C is provided. The example modes are not intended to be limiting, and various other modes may be performed in accordance with the present disclosure. For readability and convenience, the various modes described below may be referred to by brand or trade names or similar.
In one mode, referred to as MedicWatch, the user device is configured to collect various diagnostic and health-related information through at least one input device, and the received information is used as parameter information for at least one triggering event, such as contacting third party assistance or a paramedic. In a second mode, referred to as the SOS mode, a countdown timer is initiated, requiring the user to enter a preset personal identification number (PIN) before the countdown timer expires.
If the preset PIN is not entered before the countdown timer expires, then a sequence of notifications may be triggered. In a third mode, referred to as GoingOut, a user can select an activity and an associated time duration, and notification messages may be triggered if the user has not indicated completion of the activity within the time duration. In a fourth mode, referred to as Golden Halo, a series of notification messages may be triggered depending on a user's location and other device collected information. In a fifth mode, referred to as I'm OK, a series of notifications may be triggered based on user input indicating that assistance is needed. In a sixth mode, referred to as LatchKey, a notification message may be triggered when the user enters or exits a preset geographic area.
In a seventh mode, referred to as Check-In, the user can conveniently send notification messages to at least one preset destination, whereby the notification may include location or a configurable message. In an eighth mode, referred to as Breadcrumb, the user device 102 captures its geographical location and transmits it along with a timestamp to the server 106, where the location and time information may then be accessed by the user or a third party with access to the user's private account information stored on the user device 102 and/or at the server 106.
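The Breadcrumb mode's capture of location and timestamp could be sketched as follows; the class and method names are hypothetical, as the disclosure does not specify how the trail is stored or serialized:

```python
import json
import time

class BreadcrumbTrail:
    """Minimal sketch of the Breadcrumb mode: the device appends
    timestamped locations, which can later be uploaded to the server
    and served to parties with access to the user's account."""

    def __init__(self):
        self._trail = []

    def record(self, latitude, longitude, timestamp=None):
        """Capture the device's current geographical location."""
        self._trail.append({
            "latitude": latitude,
            "longitude": longitude,
            "timestamp": time.time() if timestamp is None else timestamp,
        })

    def to_upload(self):
        """Serialize the trail for transmission to the server."""
        return json.dumps(self._trail)

# Example: two breadcrumbs captured with explicit timestamps
trail = BreadcrumbTrail()
trail.record(18.52, 73.86, timestamp=100)
trail.record(18.53, 73.87, timestamp=160)
```

On the server side, the uploaded trail could be stored against the user's account so that the user or an authorized third party can review the location and time history.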
MedicWatch Mode
FIG. 5 shows a flowchart describing the various sequence steps of the MedicWatch mode. In this mode, the user device is configured to collect various diagnostic and health-related information through at least one input device, and the received information is used as parameter information for at least one triggering event, such as contacting third party assistance or a paramedic. This example mode may be implemented by the user device 102 or in combination with the tethered device 104, and the mode may be performed within various system environments, such as those shown in FIGS. 1-1C.
At step 506, the graphic interface displays the various modes that a user can select from, including the MedicWatch mode.

At step 508, a graphic interface illustrating the MedicWatch start screen is shown. In an embodiment, a touch-sensitive button on the graphic interface permits the user to initiate the mode.

At step 510, the graphic interface display permits the user to adjust configuration settings. In general, configuration of settings is optional after an initial setup procedure. As such, each time the user accesses the MedicWatch mode, providing configuration settings for this mode will not be necessary. The configuration settings generally relate to user provided information and device collected information, and in particular relate to diagnostic or vital sign information of the user. Some non-limiting examples of user provided information relevant to MedicWatch include the user's age, weight, health conditions, allergies, medications, and blood type. Device collected information can include heart rate, blood pressure, blood sugar or blood glucose level, geographic location, motion activity, and any other information that can be obtained through sensors and/or an input device.
During configuration, the user sets various thresholds, such as a maximum and/or minimum blood pressure level, maximum and/or minimum heart rate and maximum and/or minimum blood sugar level. These thresholds are then utilized as triggering events to determine when the server 106 (see FIG. 1) should send a notification message to the monitoring provider 112. The notification message may include certain user provided information and device collected information, such as the user's age, weight, health conditions, allergies, medications, blood type and current location, which may assist any monitoring provider 112 or third party assistance provider 108 in helping the user in case of a medical emergency. The information provided in the notification message may be configured during the user configuration step 510.
The user provided information and device collected information are transmitted to the server 106 and stored at the server 106. In other embodiments, configuration of settings can be accomplished using the tethered device 104 or a web interface accessible through an internet connected device. The user provided information and the device collected information are accessible by the user when they sign into their account at the service provided by the server 106.
At step 512, the MedicWatch mode generally monitors device collected information, and in particular vital sign and diagnostic information, such as heart rate, blood pressure, blood sugar or blood glucose level. This information can be analyzed to determine whether a triggering event has occurred, such as exceeding a threshold set during the configuration step 510. For example, after a user sets a maximum blood pressure threshold at step 510, a triggering event may occur when a blood pressure measurement exceeds that threshold. Triggering events are not limited to any particular combination of user provided and device collected information. In an embodiment shown in FIG. 5, the server 106 (see FIG. 1) receives the user provided and device collected information, and determines whether a triggering event has occurred. In other embodiments, this may be accomplished by the user device 102 or the tethered device 104.
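The threshold monitoring of step 512 can be illustrated as a simple range check. The threshold table layout below is an assumption, since step 510 only states that maximum and/or minimum values are configurable:

```python
def vitals_triggering_event(thresholds, readings):
    """Check device collected vital signs against user-configured
    minimum/maximum thresholds (hypothetical schema: vital name mapped
    to a (minimum, maximum) pair set during configuration step 510)."""
    for vital, (low, high) in thresholds.items():
        value = readings.get(vital)
        if value is None:
            continue  # sensor unavailable or no reading yet
        if value < low or value > high:
            return {"event": "vital_out_of_range",
                    "vital": vital,
                    "value": value}
    return None  # no triggering event; monitoring continues (step 514)
```

Returning `None` corresponds to step 514 (continue monitoring), while a returned event dictionary corresponds to step 516, where the server 106 initiates notification messages.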
At step 514, the server 106 (see FIG. 1) has determined that a triggering event has not occurred based on the received user provided and device collected information. In absence of a triggering event, the MedicWatch mode continues monitoring as described in step 512. In an embodiment, the MedicWatch mode may terminate monitoring after a preset time or may terminate based on user provided and collected information. The conditions under which the MedicWatch mode may terminate may be set by the user during the configuration step 510.
At step 516, the server 106 (see FIG. 1) has determined a triggering event has occurred. Upon detecting the triggering event, the server 106 is generally configured to initiate at least one notification message. The notification messages may be sent to the user, the third party assistance provider 108, the monitoring provider 112, or any other recipient. In an embodiment, the monitoring provider 112 may send notification to the third party assistance provider 108 or any other recipient in order to have medical assistance provided to the user. Notification messages may vary depending on the triggering event and various combinations of user provided and device collected information. For example, in an embodiment that uses blood pressure as a triggering event, if the user is within a certain geographic zone, one notification message to the user may indicate that medication should be taken. Additional notification messages may be sent to other recipients if the user exits that geographic zone. The content, type, and destination of notification messages may vary based on user provided information and device collected information.
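The location-dependent escalation described in this step (a medication reminder inside a geographic zone, additional notifications to other recipients outside it) could be sketched as follows; the zone representation and message contents are illustrative assumptions:

```python
def notifications_for_event(event, location, home_zone, contacts):
    """Illustrative escalation for step 516: inside the configured zone
    only the user is reminded; outside it, the listed recipients (e.g.
    the monitoring provider 112) are notified as well.

    `location` is a (latitude, longitude) pair; `home_zone` is a
    hypothetical bounding box with south/north/west/east keys.
    """
    lat, lon = location
    in_zone = (home_zone["south"] <= lat <= home_zone["north"]
               and home_zone["west"] <= lon <= home_zone["east"])
    messages = [{"to": "user",
                 "text": f"{event}: please take your medication"}]
    if not in_zone:
        for contact in contacts:
            messages.append({"to": contact,
                             "text": f"{event} occurred outside safe zone"})
    return messages
```

The content, type, and destination of such messages would in practice be driven by the user provided information and device collected information, as stated above.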
At step 504, the graphic interface includes a stop button, which engages an authentication step, allowing the user to disable the MedicWatch mode. At step 502, a user authenticates via the user device 102. In certain embodiments, the authentication can be accomplished by entering a personal identification number (PIN) on a touch-sensitive graphic interface of the user device 102. Authentication may be performed by any number of methods, such as biometric, voice, image, or password. Successful authentication disables the MedicWatch mode.
At step 506, successful authentication has occurred, disabling the MedicWatch mode. The graphic interface displays the various modes that a user can select from, including the MedicWatch mode. However, if authentication is unsuccessful, then the user device 102 will continue operating the MedicWatch mode.
SOS Mode
FIG. 6 shows a flowchart of the SOS mode. In this mode, a countdown timer is initiated, requiring the user to enter a preset personal identification number (PIN) before expiration of the countdown timer. If the preset PIN is not entered before the countdown timer expires, then a sequence of notifications may be triggered. In this manner, a user is able to have a help message sent to a third party assistance provider 108 (see FIG. 1), such as the police, in the event the user is in danger and unable to directly contact the third party assistance provider 108.
This mode may be implemented by the user device 102 or in combination with the tethered device 104, and the mode may be performed within various system environments, such as those shown in FIGS. 1-1C. The below description of the SOS mode is made in reference to the system environment shown in FIG. 1 whereby the user device 102 is provided with a touch-screen graphic interface display. However, in other embodiments, use of a touch-screen graphic display is not required as other suitable input devices of the user device 102 may be utilized.
Prior to using the SOS mode, the user must set a password, such as a PIN, that deactivates the countdown timer when entered at the user device 102 (see FIG. 1). Non-limiting examples of the password include a combination of text, numbers, and symbols, a swipe pattern, or a dictated message received by a microphone of the user device 102. After setting a password, the SOS mode may be executed by the user device 102. Step 602 shows a start screen displayed on the touch-screen graphic interface of the user device 102. The start screen displays the various modes that a user can select from, including the SOS mode. In this example, the user has already set a PIN as a password. In an embodiment, the user can select and initiate the SOS mode using the touch-screen interface. Typically, the user will initiate the SOS mode in a moment of danger such as a carjacking or kidnapping.
At step 604, the SOS mode has commenced and a countdown timer 604 has started. A graphic interface displayed on the user device 102 generally displays a continuously decreasing time value corresponding to the countdown timer 604 and a touch-sensitive cancel button that brings up a display whereby the user can enter a PIN using the touch-sensitive graphic interface. In an embodiment, the graphic interface may display the decreasing time value in increments of 1 second, 5 seconds, or any other time value. In an embodiment, the time value increment can vary depending on the time remaining in the countdown timer 604.
The countdown timer 604 defines the maximum amount of time within which the preset password must be entered before a triggering event occurs, which in this case is the expiration of the countdown timer 604. The countdown timer 604 can be any time value, such as 15 seconds or 60 seconds. In an embodiment, the time value of the countdown timer 604 can depend on device collected or user provided information. For example, the time value of the countdown timer 604 may automatically decrease when the user is farther away from a preset location or during nighttime hours. In an embodiment, the time value may be preconfigured by the user or by the server 106.
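One way the countdown duration could adapt to device collected information, as described above, is sketched below. The halving rule, the 5 km distance cutoff, the 22:00-06:00 nighttime window, and the function name are assumptions for illustration only.

```python
from datetime import datetime

def countdown_seconds(distance_km: float, now: datetime,
                      base: int = 60, minimum: int = 15) -> int:
    """Shrink the countdown when the user is far from a preset location
    or when it is nighttime, but never below a floor value."""
    seconds = base
    if distance_km > 5.0:
        seconds //= 2  # farther from the preset location -> less time to cancel
    if now.hour >= 22 or now.hour < 6:
        seconds //= 2  # nighttime hours -> less time to cancel
    return max(seconds, minimum)
```

Under these assumptions, a daytime user near the preset location gets the full 60 seconds, while a distant user at night gets the 15-second floor.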
At step 612, the timer 604 has expired prior to successful entry of the password. The expiration of the timer 604 functions as a triggering event that sends device collected information, such as a location of the user device 102, to the server 106 (see FIG. 1), which in turn sends a notification message including the location and a distress message to preset contact(s), and/or the monitoring provider 112 and the third party assistance provider 108. In certain embodiments, upon expiration of the timer 604 the user device 102 will send the location as device collected information more than once. In this embodiment, the location of the user device 102 is sent to the server 106 periodically such that the movement of the user device 102 can be tracked by the monitoring provider 112 and/or the third party assistance provider 108.
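The periodic location reporting after timer expiration might look like the following sketch. The callback-based design, the 30-second interval, and the report cap are illustrative assumptions; the disclosure specifies only that the location is sent to the server 106 periodically.

```python
import time

def report_location_periodically(get_location, send_to_server,
                                 interval_s=30.0, max_reports=5):
    """After the SOS timer expires, repeatedly push the device location to the
    server so a monitoring or assistance provider can track movement over time."""
    for _ in range(max_reports):
        send_to_server({"event": "sos_timer_expired",
                        "location": get_location()})
        time.sleep(interval_s)
```

In practice a real implementation would run until the mode is disabled rather than for a fixed count; the cap here simply keeps the sketch finite.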
At step 606, the user has selected the cancel timer option prior to the expiration of the timer 604. Selecting the cancel timer option brings up the password entering interface that prompts the user to enter the preset PIN using the touch-sensitive display on the graphic interface. In some embodiments, the graphic interface display may provide ten touch-sensitive buttons, with each button corresponding to a number from zero through nine. In some embodiments, the graphic interface display may provide fewer than ten touch-sensitive buttons. For example, in an embodiment the touch-screen graphic interface display only provides a subset of buttons in order to conserve screen space on the touch-sensitive display. For instance, if the password is a four-digit PIN, the password entering interface may only include the four digits of the PIN displayed in a random order such that guessing the order of numbers is not likely to be achieved prior to expiration of the timer 604.
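The reduced, randomly ordered PIN pad described above can be sketched as follows. The function name and the use of only the PIN's unique digits are illustrative assumptions.

```python
import random

def pinpad_digits(pin, rng=None):
    """Return only the digits that appear in the PIN, shuffled, so the pad can
    show a small subset of buttons in random order instead of all ten."""
    rng = rng or random.Random()
    digits = sorted(set(pin))  # unique digits appearing in the PIN
    rng.shuffle(digits)
    return digits
```

For a PIN such as "1984", the pad would show the buttons 1, 9, 8, and 4 in a random arrangement; an attacker still has to guess the ordering before the timer expires.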
At step 610, the user has entered an incorrect PIN. The graphic interface display indicates unsuccessful entry of the PIN by returning to the home screen of the SOS mode, which places the user back at the timer 604 that is still counting down. If the user desires to attempt to enter the password again, then the cancel option must be selected a second time. In an embodiment, the number of attempts to enter the correct password is limited only by the duration of the timer 604 before the alert is sent at step 612.
Going Out Mode
FIG. 7 shows a flowchart of the Going Out mode. In this mode, a user can select an activity and an associated time duration, and notification messages may be triggered if the user has not indicated completion of the activity within the time duration. This example mode may be implemented by the user device 102 or in combination with the tethered device 104, and the mode may be performed within various system environments, such as those shown in FIGS. 1-1C. The below example of the Going Out mode is made in reference to the system environment shown in FIG. 1 whereby the user device 102 is provided with a touch-screen graphic interface display. However, in other embodiments, use of a touch-screen graphic display is not required as other suitable input devices of the user device 102 may be utilized.
Step 702 shows a start screen displayed on the touch-screen graphic interface. The start screen displays the various modes that a user can select from, including the Going Out mode. Upon selection of the Going Out mode, the application proceeds to step 704. At step 704, the graphic interface of the user device 102 prompts the user to choose from a predetermined list of various activities or input a custom activity. The list of various activities generally pertains to activities in which a user may temporarily leave her current destination, such as various sports or outdoor related activities. Some non-limiting examples include biking, a date, going to a friend's house, jogging, walking, and hiking. In an embodiment, the graphic interface enables the user to input an activity using the touch screen graphic interface input device 214 (see FIG. 2).
At step 706, the application of the user device 102 (see FIG. 1) prompts the user to enter a time duration. In an embodiment, the time duration relates to the duration in which the user expects to complete the activity selected in step 704. The time duration may be expressed as a specific time of the day, such as 3:05 PM, or may be expressed as an absolute value of minutes and hours. In an embodiment, the user is permitted to input other information relating to the selected activity, such as location information of where the selected activity may take place.
At step 708, the application causes the user device 102 (see FIG. 1) to prompt the user to enter a check-in time. The check-in time may be expressed as a specific time of the day or as an absolute value of minutes and hours. In an embodiment, the check-in time corresponds to an upper time limit in which the user must complete a check-in step.
The server 106 proceeds to monitor for the expiration of the time duration set at step 706 or for arrival of the preset check-in time set at step 708. Upon expiration of the time duration or arrival of the preset check-in time, at step 710, the application causes the user device 102 (see FIG. 1) to prompt the user to check in by selecting a check-in option displayed on the user device 102. FIG. 7A illustrates this prompt in accordance with one embodiment. Upon the user checking in, at step 712, the user device 102 displays a graphic interface that directs the user back to a home page of the application.
In the Going Out mode, a triggering event occurs if the user does not successfully complete the check-in step before the time value corresponding to step 708. In an embodiment, the triggering event varies based on user provided information, device collected information, or any combination thereof. In an example embodiment, the server 106 (see FIG. 1) does not receive a check-in from the user device 102 indicating the Going Out mode has been completed within the time set at step 708, triggering an alert. The alert may include user provided information, such as the selected activity from step 704, the duration from step 706, and the check-in time from step 708. The alert may also include device collected information, such as location information or other information collected through any of the various input devices described above.
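The check-in deadline test at the heart of this mode can be sketched as below. The function name and parameter shapes are illustrative assumptions; the disclosure only requires that a missed check-in before the step-708 time triggers an alert.

```python
from datetime import datetime, timedelta

def check_in_missed(started_at, check_in_limit_min,
                    checked_in_at=None, now=None):
    """Return True when no check-in was received before the preset deadline."""
    now = now or datetime.now()
    deadline = started_at + timedelta(minutes=check_in_limit_min)
    if checked_in_at is not None and checked_in_at <= deadline:
        return False  # user checked in on time; no triggering event
    return now > deadline  # past the deadline with no timely check-in
```

A server-side loop could evaluate this predicate for each active Going Out session and assemble the alert, including the activity, duration, and check-in time, when it returns True.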
The server 106 (see FIG. 1) then triggers any of various notifications to any of various destinations, such as the monitoring provider 112 or the third party assistance provider 108. The content, type, and destinations may vary based on user provided information and device collected information. As an example of user provided information, the user may preset notification message content and one or more recipients of any notification messages. As an example of device collected information, the type of notification message may include location information.
For example, the user may configure the user device 102 to initiate only one notification message, whereby the notification is sent to the user's neighbor (functioning as the monitoring provider 112) if the user has not completed the check-in step after a brief walk, and if the location information indicates the user was recently within a certain proximity to the neighbor. Based on the location information sent to the server 106, other user contacts functioning as the monitoring provider 112 may be contacted as well. The number, content, and type of notifications and the recipients of the notifications are not intended to be limited to any particular configuration.
The algorithms and/or artificial intelligence executed by the service provided by the server 106 (see FIG. 1), or the application executed by the user device 102, can determine whether the user is in distress and take appropriate action. For instance, if a remote sports-related emergency occurs, such as a hiker in trouble in Yosemite or a sailboat in distress, the user device 102 can send a distress signal through all available wireless methods to the server 106 to summon assistance. This can be activated either actively, by input from the user, or passively, by the application executed by the user device 102 or by the above-mentioned algorithms and/or artificial intelligence.
An additional "where am I" feature may be implemented in this mode. This feature allows the user to query the user device 102 to provide the closest address, mile marker, coordinates or any other pertinent information providing an indication of the user's location.

Claims (5)

WE CLAIM
1) Our invention "MT-Family Member Activities and Location" is a device and system for notifying a user contact of the status of a user of a portable device using location, distance, audio, and video information. The status is determined by the portable device collecting user provided information and device collected information relevant to the user of the portable device. The portable device may then transmit the device collected information and the user provided information to a server that in turn analyzes the device collected information and the user provided information to determine whether a triggering event has occurred. If it is determined that a triggering event has occurred, the local server or global server proceeds to send a status update regarding the user of the portable device to preset user contacts. The triggering event is determined to have occurred based on preset user conditions and on algorithms and artificial intelligence executed at the server, and an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog and invokes external services when appropriate to obtain information or perform various actions. The invented technology can be implemented using any of a number of different platforms, such as the web, email, or a smartphone; the system is based on sets of interrelated domains and tasks, and employs additional functionality powered by external services with which the system can interact.
2) According to claim 1, the invention provides a device and system for notifying a user contact of the status of a user of a portable device using location, distance, audio, and video information, wherein the status is determined by the portable device collecting user provided information and device collected information relevant to the user of the portable device.
3) According to claims 1 and 2, the portable device may then transmit the device collected information and the user provided information to a server that in turn analyzes the device collected information and the user provided information to determine whether a triggering event has occurred.
4) According to claims 1 to 3, if it is determined that a triggering event has occurred, the local server or global server proceeds to send a status update regarding the user of the portable device to preset user contacts.
5) According to claims 1 to 4, the triggering event is determined to have occurred based on preset user conditions and on algorithms and artificial intelligence executed at the server, and an intelligent automated assistant system engages with the user in an integrated, conversational manner using natural language dialog and invokes external services when appropriate to obtain information or perform various actions. The invented technology can be implemented using any of a number of different platforms, such as the web, email, or a smartphone; the system is based on sets of interrelated domains and tasks, and employs additional functionality powered by external services with which the system can interact. The memory further comprises instructions for causing the processor to perform the further step of communicating the one or more images to the user contact. The one or more sensors comprise a microphone, and upon the occurrence of the triggering event, the microphone captures an audio signal in an environment around the portable device. The memory further comprises instructions for causing the processor to perform the further step of communicating the audio signal to the user contact. The memory further comprises instructions for causing the processor to perform the further step of causing a device tethered to the portable device to communicate one or more images captured by an image sensor of the tethered device or to communicate an audio signal captured by a microphone of the tethered device. The memory further comprises instructions for causing the processor to perform the further step of indicating that the status of the user has been communicated to the user contact.
FIG. 1: IS A SYSTEM BLOCK DIAGRAM IN ACCORDANCE WITH AN EMBODIMENT OF THE DISCLOSURE.
FIG. 1A: IS A SYSTEM BLOCK DIAGRAM IN ACCORDANCE WITH AN EMBODIMENT OF THE DISCLOSURE.
FIG. 1B: IS A SYSTEM BLOCK DIAGRAM IN ACCORDANCE WITH AN EMBODIMENT OF THE DISCLOSURE.
FIG. 1C: IS A SYSTEM BLOCK DIAGRAM IN ACCORDANCE WITH AN EMBODIMENT OF THE DISCLOSURE.
FIG. 2: IS A BLOCK DIAGRAM ILLUSTRATING COMPONENTS OF THE PORTABLE DEVICE SHOWN IN FIG. 1.
FIG. 3: IS A BLOCK DIAGRAM ILLUSTRATING FUNCTIONAL COMPONENTS OF THE SERVER SHOWN IN FIG. 1.
FIG. 4: IS A DIAGRAM ILLUSTRATING EXAMPLES OF VARIOUS MODES ACCORDING TO ONE EMBODIMENT.
FIG. 5: IS A FLOWCHART ILLUSTRATING THE MEDIC WATCH MODE ACCORDING TO ONE EMBODIMENT;
FIG. 6: IS A FLOWCHART ILLUSTRATING THE SOS MODE.
FIG. 7: IS A FLOWCHART ILLUSTRATING THE GOING-OUT MODE ACCORDING TO ONE EMBODIMENT;
AU2020102378A 2020-09-23 2020-09-23 MT-Family Member Activities and Location: FAMILY MEMBER ACTIVITIES AND LOCATION MANAGEMENT TECHNOLOGY Ceased AU2020102378A4 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2020102378A AU2020102378A4 (en) 2020-09-23 2020-09-23 MT-Family Member Activities and Location: FAMILY MEMBER ACTIVITIES AND LOCATION MANAGEMENT TECHNOLOGY


Publications (1)

Publication Number Publication Date
AU2020102378A4 true AU2020102378A4 (en) 2020-11-05

Family

ID=73016655

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2020102378A Ceased AU2020102378A4 (en) 2020-09-23 2020-09-23 MT-Family Member Activities and Location: FAMILY MEMBER ACTIVITIES AND LOCATION MANAGEMENT TECHNOLOGY

Country Status (1)

Country Link
AU (1) AU2020102378A4 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114333232A (en) * 2021-12-29 2022-04-12 北京师范大学 Alarm position-indicating or individual distress position-indicating terminal equipment system



Legal Events

Date Code Title Description
FGI Letters patent sealed or granted (innovation patent)
MK22 Patent ceased section 143a(d), or expired - non payment of renewal fee or expiry